Back in the late 1950s, a couple of researchers designed an experiment meant to measure cognitive dissonance. The subjects were put into a room, where they sat in front of a box holding a dozen spools, each standing upright in rows of three. They were told to take the spools out of the box in a certain order, then to put them back into the box as they were before, and to repeat the process over and over until they were told to stop.
Next, the subjects had some wooden cubes placed in front of them in rows. They were asked to take each cube and rotate it clockwise one turn. Then they were asked to do it again with each cube, and to keep repeating until they were told to stop. Both tasks were boring and tedious by design, and took about an hour.
Finally, they were to fill out a questionnaire about the experiment they had just performed, which asked them to rate the tasks on a scale from -5 to +5. They were asked the following questions:
1- Were the tasks interesting and enjoyable?
2- Do you think the tasks have any scientific value?
3- Would you have any desire to participate in similar experiments in the future?
Just before they sat down to fill out this questionnaire, however, the subjects were asked to bring in the next participant. They were told that certain things needed to be said to the incoming participant, that this was normally someone else's job, but that the person who usually did it was out sick. They were instructed to tell the next participant that the experiment was very enjoyable, very fun, very interesting and exciting. In a nutshell, they were told to lie. In exchange, they were given some money for doing this. Group A was given $20, group B was given $1, and the control group — which performed the tasks and filled out the questionnaire, but told no lie — was not compensated at all.
Each participant brought the next person into the experiment room, told them what they had been instructed to say, and then filled out the questionnaire on what they themselves thought of the experiment.
The results of the study were interesting. As expected, the group that received $20 gave a more favorable score than the control group. This makes sense, because it is easy to rationalize a white lie for $20. Even if someone doesn’t believe what they are saying, it’s a victimless act. Still, one might need to rationalize the white lie, at least to a degree, by convincing oneself that the tasks weren’t really as boring as they had seemed.
The $1 group is the most interesting. They actually rated the experiment the highest in terms of how interesting, fun and exciting it was, and the difference between them and the other groups was substantial. This doesn’t make a lot of sense on the surface, as it would seem that those with the greatest compensation would have rated it highest. After all, a buck is hardly worth compromising one’s integrity for, and it’s almost no different from receiving nothing. Logic would dictate they would have scored the experiment closer to the control group. So, what gives? Why was there such a stark difference between the $20 and $1 groups?
Remember, this experiment was designed to measure dissonance, not motivation. Stretching the truth for a price makes sense, and can even fit within the values that we have as a society. I mean, does anybody really believe Tiger Woods’ favorite car is a Buick, or Martha Stewart’s linens are the same ones she sells at K-Mart? Probably not. I prefer Coke, but I’d happily do a Pepsi ad if someone paid me. Hell, I’d endorse the taste of warm piss if I were paid enough – and I wouldn’t feel too guilty about it, either. But that still wouldn’t convince me that Pepsi is better than Coke. I may not announce my preference in public, but I could confide my real preference to my friends. Those people making the $20 weren’t compromising their values, because they weighed the option of taking the money in exchange for the lie, and determined it was worth doing. They knew they were lying, but it was OK.
The $1 group had a different problem. Stretching the truth for a buck is like lying for no reason, and that made it difficult to fit within their value systems. So, if they couldn’t change the compensation amount, they did the next best thing — they changed the truth. They rated the tests as interesting because they believed it. Our natural inclination is to convince ourselves something must be true, even if reality is slapping us in the face. To paraphrase George Costanza, it’s not a lie if you believe it.
Informational social influence is the term used to describe how individuals are influenced by social groups. A number of experiments have been done to demonstrate social influence. The Stanford Prison Experiment is one of the more famous ones, and illustrates how individuals are likely to conform to peer pressure in doing things they know are morally wrong. This type of pressure shows up in everyday life, from school bullying to the telling of racist jokes – and all of us are at least to some degree subject to its effects. But with these things, once we become detached from the immediate situation and take time to look at and evaluate our actions, we’ll most likely come to a “what was I thinking?” conclusion. Our values and beliefs don’t necessarily change; we simply become subject to a moment of temporary insanity.
Another series of well-known experiments was the Asch conformity experiments, in which subjects were pressured into conforming to beliefs they knew were false. What is interesting about these experiments is not just that people were made to change their opinion to something they knew to be false, but also how, having conformed to the larger group, they became part of the majority pressuring those who still held the dissenting (and correct) opinion. These subjects quite literally changed their perception to one which was false, so when they talked the others into changing their minds, they weren’t being duplicitous. Like George Costanza, they made themselves believe the lie. Informational influence turns peer pressure into conformity, which feeds back into more peer pressure, and insufficient justification turns doubt into true belief. By expressing a belief and having it reinforced by a larger group of peers, it eventually becomes our own. This explains how two million AAs can be wrong, and why the argument that there are so many believers (read: argumentum ad populum) means nothing in terms of whether the program is effective. Once a delusion reaches critical mass, it’s as likely to become common wisdom as it would be if it were reality.
Religious groups use personal testimony as a way to solidify personal belief, and as a way to change the beliefs of others. AA’s version of doing so is the drunkalog. Ilse wrote this awesome piece on the drunkalog, in which she equates it with the early American conversion narratives. She writes:
“These days, it seems, the conversion narrative is hardwired. We all know how it goes:
Innocence, Complacency, Debasement, Grace, Redemption, Transcendence.
And we no longer need a member of the clergy looking over our shoulders to make sure we get it right. The self-censorship within communities is extremely effective in making sure that these stories remain true to form. Furthermore, to this day, the conversion narrative serves exactly the same purpose as the old time Captivity Narratives: To instruct (and affirm the teller’s place in the community) and to redeem the teller, to bring him or her ‘home.'”
Those of you who have spent time in AA have witnessed this formulaic expression of transcendence many times, and have almost certainly used it in conveying your own experience. Those of you who have not (and who have a sadistic bent toward self-harm) can log into any AA speaker site and hear these testimonials. Drunkalogs are an essential tool in concretizing a spiritual awakening, and bringing hope to new members. That is, in fact, their stated purpose – but they do more than that. Their other — and I think more essential — purpose is in convincing the storyteller of the truth, by which I mean the truth as the others in the group see it. Goebbels once famously said that if you repeat a lie enough times, it becomes the truth — and he was right.
What the experiments on dissonance demonstrate is that the weaker the justification for telling a lie, the stronger the belief becomes once it is adopted. The dollar group were the strongest advocates of how enjoyable the experiment was because their lie was the most incongruent, and a mere dollar left no wiggle room to rationalize it. Once someone crosses the Rubicon from disbelief to belief, it’s difficult to persuade them otherwise, and the more ridiculous the lie, the more solidified they are in their opinion. As an example, this explains why it is so difficult to get an AA member to admit that a program oozing with God and divine intervention in the lives of its individuals is religious. What sounds to you like a lie is simply their truth.
If you were under the AA spell, it also became your truth. That truth, even when shown to be false, was difficult to shake. Here’s why:
If I were to ask you which personality trait makes a more effective firefighter, a maverick or a conservative, most of us would not know because we know nothing about fighting fires. We can guess, but it would be only that: a guess. When a control group was asked this very question, most had no idea, and the group leaned only slightly toward guessing it would be the risk-taking firefighter. Now, suppose you were handed a document with research showing the maverick firefighter was more effective. In that case, you would have a strong reason to believe the maverick would be the better choice, and a group that was asked this question after having read such a report did just that. They overwhelmingly chose the risk-taking personality. This makes sense, because the group had supporting data to justify their opinion.
Next, the group was told that the report they had read was a fabrication, and that the data within it had been written by an undergraduate student who pulled the information out of thin air. Then they were asked the very same question again. Now, logic would dictate that they would revert to their default position of “I don’t know.” After all, they still knew as little about the subject as they did when they were originally asked the question. So, did this affect their opinion? No. In fact, even after being given real supporting data showing that a more conservative firefighter is in fact the more effective type, the majority kept the opinion based on the bogus data.
This phenomenon of sticking to our guns, regardless of any contradictory evidence proving our opinion to be false, is known as “belief perseverance.” Political pollsters exploit this with techniques like push-polling and negative advertising. Casey Anthony’s attorneys exploited it in their defense, when in their opening remarks (which are not allowed to be considered as evidence in the case) they stated that Casey’s daughter died as a result of a drowning accident, and that her father was somehow involved. No evidence supporting this possibility was ever introduced by the defense, yet the unsupported claim stuck with the jury, some of whom cited an accidental drowning as the probable cause of the child’s death.
Belief perseverance is the glue for all kinds of crazy, from conspiracy theories and urban legends, to global warming denialists and the 9-11 truth movement. Its consequences are not trivial. For example, despite the fact that the claimed link between autism and childhood vaccines has been debunked by every authoritative body, a large percentage of the population still refuses to allow their children to be vaccinated. My wife, who is an infection control coordinator, is forever frustrated by parents who refuse vaccines for their children, even after being presented with the real data, and with the evidence showing the autism link was a fraud. These things can cost lives. Regardless of the truth, there will always be a contingent of people who believe the world is flat.
Belief perseverance is more pronounced in those who state their opinions publicly, which is what makes the drunkalog such an effective tool in pulling a person further into the AA labyrinth. AA will advise people to “fake it ’til they make it,” and they do so for a reason. Not that most steppers even know the term belief perseverance, but they have seen time and time again how easy it is to get someone to drink the Kool-aid if they can simply pretend to believe the dogma.
Connecting The Dots
Take a look at the video below. What do you see? If you’re like most people, you see some sort of story playing out between the circle and the triangles. Maybe it’s an affair between little triangle and big triangle that went horribly wrong. Maybe the big triangle is a landlord who is trying to get some rent from his deadbeat tenants, and in frustration tears up the house. There are a number of stories which can be applied to the movements, but in actuality it’s nothing more than random movements of different shapes.
Our tendency to attribute human characteristics to such things is a process known as “anthropomorphization.” Personification of the external world is an inherent part of our nature, and is rooted in our social neurology. It’s perfectly normal, and in fact, when this video was shown to people with autism, they tended to describe it in terms of geometrical movements, and without attributing human qualities to the moving shapes.
So, what does this have to do with Alcoholics Anonymous? Remember, we have a fundamental need for cognitive closure, and we latch on to whatever explanation comes first; as demonstrated in the firefighter study, it does not need to be the correct one. Add to this our tendency to see the world in terms of emotionally motivated characters (as demonstrated in the Heider and Simmel experiments). In other words, we have an emotional need to explain things we are uncertain about as the conscious acts of human-like agents. Michael Shermer refers to this as “agenticity,” which he wrote about in his recently published book, The Believing Brain.
Shermer also coined another term, “patternicity,” to describe our tendency as humans to find patterns in randomness. This happens visually, as with the above picture of a sink, and in the pattern to the right – where you may think you see a rectangle, but what you’re actually doing is filling in the gaps with something that’s not there – and it happens in our thinking processes as well.
“Traditionally, scientists have treated patternicity as an error in cognition. A type I error, or a false positive, is believing something is real when it is not (finding a nonexistent pattern). A type II error, or a false negative, is not believing something is real when it is (not recognizing a real pattern—call it “apatternicity”). … Our brains are belief engines: evolved pattern-recognition machines that connect the dots and create meaning out of the patterns that we think we see in nature. Sometimes A really is connected to B; sometimes it is not. When it is, we have learned something valuable about the environment from which we can make predictions that aid in survival and reproduction. We are the descendants of those most successful at finding patterns. This process is called association learning, and it is fundamental to all animal behavior, from the humble worm C. elegans to H. sapiens.”
This tendency to see patterns is not a bad thing, at least in terms of survival. We think this way because we evolved this way. Those who were more inclined toward false positives were more likely to survive danger. The twig you hear snap behind you may be a false alarm ninety-nine percent of the time, but the one time it really is a predator, it helps to be prepared. Because of this, danger and fear heighten the instinct, which is why you’re less sensitive to footsteps behind you walking through the mall than you would be in a dark alley. If you’re fearful of a disease to which a human form (a human-like agent) has been applied (a guy doing push-ups in the parking lot), then your senses are heightened, and your inclination to see patterns of danger is more pronounced. Finding validation from your AA brethren for your fears is not difficult. You will find your patterns. Drinking patterns, disease-progression patterns, patterns of fundamentally flawed behavior will all be there. They may not be real, but they don’t need to be.
The problem arises when we couple our tendency to see patterns of behavior where there are none with our instinctive need to see human-like agency where there is none. Convincing someone that good things are happening as a result of attending AA and working the steps, and that this is the result of turning things over to God as they understand Him (complete with human motivations and logic), is an easy sell. We’re hardwired to do this anyway, just as we’re hardwired to fill the gaps in our ignorance with the first explanation that comes along, and to stick with our beliefs despite any contradictory evidence.
Consider the following experiment, in which subjects were given feedback on their own psychological profile, rating them as either normal or angry. The profile itself was bogus, because what was important was not the subjects’ actual personality traits, but how they perceived themselves.
Next, the subjects were given a personality description of a guy named “Donald,” written to be intentionally ambiguous in terms of his anger profile. Then the subjects were asked to rate Donald on multiple personality traits, including ones which indicate anger (hostile, irate, etc.). What were the results? As you might guess, the group which had been identified as angry also identified Donald as angry, and the group which had been identified as normal also identified Donald as normal.
This phenomenon is known as “projection.” To add insult to injury, projection not only causes a person to attribute their own traits to others, it also causes the person doing the projecting to diminish their own shortcomings. It (ironically) creates denial.
Let’s take a look at another experiment, run on test subjects just like the one on anger, only this one was about honesty. Subjects were given a psychological profile of themselves which described them as either high or low in honesty. Then some of the subjects who had been rated as dishonest were asked to evaluate another person’s ambiguous personality profile. Once again, this group projected their dishonesty onto the third person. Another part of the dishonest group did not get to evaluate the third person, which meant they had no opportunity to project their traits onto anyone. Then each group was asked to self-evaluate.
The results were that the group which was allowed to evaluate another person’s honesty, and therefore allowed to project their traits onto others, rated themselves as far less dishonest in their self-evaluation. In other words, the projection served its purpose as a defense mechanism, helping people feel better about themselves.
Projection is the psychological phenomenon I, as an outside observer, see played out most often among the members of AA. It explains why the craziest of steppers can be so self-righteous and judgmental, and it explains the findings of a study which showed that AA has a positive impact on those who sponsor, and a negative one on those who are merely sponsees.
It’s important to note that, in these experiments, the traits of anger and dishonesty were applied to the subjects at random. They may or may not have been above average on the anger and dishonesty scales, but it didn’t matter. What mattered was their self-perception, and whether or not they believed they scored high on these traits. In AA, it is presupposed that an alcoholic – a real alcoholic – possesses these same negative attributes. They are applied to individuals the moment they walk through the door, because these character flaws are attributes of alcoholics. Once a person is led to believe that they are angry, resentful and deceitful – which they will be, because these are symptoms of their disease – they will project these traits onto the new sheep, and the cycle continues. The second you walked through that door, you unwittingly became part of that cycle.
In AA, the anger which people are using the program to rid themselves of is likely projected onto them by the very people who are offering up the 12-step serenity cure. It’s both the cause of, and solution to, what makes an alcoholic an alcoholic. The problem is that the solution to the anger, whether it is real or not, is projection, and projection can only be had if another sap comes along to serve as its recipient, and you will only have access to that solution if you stay in AA. It’s a whacked-out circle-jerk of never-ending crazy.