“There is no reason to think that evolution selected for truth... Truth is a human concept.”

What We Believe ... And Why
Validating Our World View
 By Jeanine DeNoma 

This article is based on James Alcock’s talk “The Psychology of Belief,” given at a Eugene workshop on human error. Alcock is a social psychologist at York University in Toronto, Canada, and a practicing clinical psychologist. He is a member of CSICOP’s Executive Council and an expert on the psychology of belief.

A recent magazine article reported the development of a small computer chip that can replace the bar code on products. The chip allows a whole shopping cart of groceries to be pushed at normal speed past the cash register and the purchases to be rung up without handling.

     A Montana man removes a small strip from dollar bills. Scoffing at the suggestion that the strips are there to stop forgery, he claims they contain a chip that can send and receive signals. This signal allows authorities, using patrol cars or satellites, to determine how much money you are carrying.

     There is nothing impossible about either scenario. Why does the second seem so much more ridiculous to us than the first? asked James Alcock, speaking at the CSICOP workshop on human error. Most of us would wonder who cares how much money we carry, but to individuals with a slightly different world outlook, it makes sense. “My point,” said Alcock, “is they are not being irrational to believe that, any more than we are being irrational in many of the things we believe. Belief is finding things that fit into a pattern that we have already accepted. If something fits, it is believable.”

     Alcock gave another example: If a religious person sees the Virgin Mary in the garden, he may be surprised, but he knows Mary visits people and he feels blessed. Another individual, without such a religious outlook, may begin to examine his recent stress levels, sleep habits or diet. He places an entirely different interpretation on the same vision. “We constantly have to judge if something is real or not. Sometimes it is clear that some things are just silly; sometimes it is not so clear ... We are always taking in information and fitting it together with what we already know.”

Influencing Beliefs

    Authority. One of the main influences on our beliefs is authority. Early in life we are dependent upon what our parents and teachers tell us. What we come to believe is based, not on our rationality, but on our early acceptance of their authority. As adults, much of what we believe is due to our trust in certain sources of information. We often have a gut feeling that something is true, but no empirical evidence of our own to back up our belief. “Our beliefs are often based on faith in someone else’s ability to look at the data,” said Alcock.

    Anxiety Reduction. Many beliefs reduce anxiety and make us feel good. Religions are very good at this. Beliefs that help us make sense of the world will reduce anxiety and are more compelling to us than ones that do not. Prejudice can be driven by a desire to reduce our anxiety. If a racial group is believed to do poorly in our society because they have “bad genes” then we are unburdened of the responsibility for their problems. “We can accept this. It happens auspiciously ... because some beliefs are just so welcome to us,” said Alcock.

    Consistency. We screen incoming information for consistency with our basic beliefs about the world. “If you don’t believe that we can leave our body, if you don’t believe that people come back from the dead, then you are going to evaluate evidence differently than someone who does,” said Alcock. Studies have shown that our screening processes can be changed by changing our beliefs. For example, if someone who does not believe in paranormal powers can be convinced that, say, Uri Geller has special powers, the way that individual screens other related information will change. If the researcher later returns and proves to him that Uri Geller had only tricked him into thinking he had these powers, the typical response is surprise; but he is unlikely to change his basic belief in the existence of paranormal powers because, he will point out, there are so many other examples. He is unaware that he would never have given credence to any of those other sources had he not first been convinced by Geller.

    Social conditioning. Our social conditioning can have a profound effect on what we believe. Behavioral changes generally precede attitudinal changes. If an individual acts contrary to his beliefs, over time he will bring his beliefs in line with his actions. Before starting to smoke, a teenager may say, “I’d never do that because I might get cancer and die.” After starting to smoke, he will respond with comments like, “Well, driving a car is dangerous too.”

    Magical Thinking. A newborn’s brain is designed to make sense of the world using resemblance (do objects and events resemble one another?) and proximity in time (do two events occur together?). We are all born magical thinkers: We all learn on the basis of coincidence. If A precedes B, we conclude that A caused B. We think of Aunt Mary; the phone rings and it is she. While our rational mind can recognize that the timing of two events is purely coincidental, the feeling that there is a connection can be very emotional and difficult to ignore. Yet, by chance alone, all kinds of people are going to have these kinds of compelling coincidences.

     Alcock related the story of a physician who worked with low back pain patients and was a severe critic of chiropractors, whom he believed were practicing pseudoscience. One day after he had retired, he severely injured his back. Three months later the physician commented to Alcock that he thought maybe chiropractors were effective after all. He had gone in agony to a chiropractor following his injury and now his back was fine. As the physician related his story, however, he suddenly recognized his error. The pain hadn’t gone away immediately after seeing the chiropractor; it had gone away gradually over weeks, just as it might have had he not gone at all. He was attributing causality where perhaps there was none.

Emotion and Reason

     We learn through both our experiential system, from our emotions, and through our intellectual system, by rational thought and teaching. In this culture, we are taught to use contradiction as a reality check; we correct children when they make contradictory statements. At the same time, we are also taught that with certain subjects we are to distrust our rationality. When the two systems collide, as they do in beliefs about the supernatural, we are told to accept certain things on faith. We end up with two, often contradictory, sets of rules, the logical and the emotional. We tend to be dominated by our experiential system when we are disorganized: when we are tired, frightened, or worried, or when our rationality doesn’t seem to work. Many people may seem to act irrationally under stressful situations, but their actions make sense if you look at their early experiential history.

     Phobias and the gambler’s fallacy are both examples of collisions between the intellect and the emotions. An individual experiencing a phobia can know a situation cannot possibly hurt him, yet he still experiences the emotional distress. His intellectual component has little control over his emotions. The most straightforward way to deal with phobias is purely mechanical: slowly expose the individual to the situation, while helping him to stay relaxed. Intellectually, the gambler knows the roulette wheel has no memory, that statistically he should expect long streaks of good and bad luck, yet there is a compelling feeling that if red has come up 15 times in a row, black is bound to come up next.

     “There is no reason to think that evolution selected for truth. We don’t have to know the truth to survive in the natural environment. Truth is a human concept,” said Alcock. “I am convinced that I have all kinds of beliefs that aren’t true, I just don’t know which ones they are. The mistake I think we often make is that we come to accept that something we hold strongly to be true, must be true. And then when faced with contrary information, rather than saying ‘that doesn’t fit,’ we tend to bolster our defenses and close out the information.”

© 2000 Oregonians for Rationality