Every once in a while I read a book and recommend it to everyone with whom I communicate for the next couple of weeks. (Someone recently teased me that on my tombstone would be written: HAVE YOU READ . . . ?) The latest book of my affection is Mistakes Were Made (but not by me): Why We Justify Foolish Beliefs, Bad Decisions, and Hurtful Acts by Carol Tavris and Elliot Aronson.
Today I was happy to learn about and listen to this NPR interview of Dr. Elliot Aronson. (Also at that link is an excerpt from the book.) Tavris and Aronson have written a thorough account of the phenomenon of cognitive dissonance. That's the feeling in your brain when you find that you are holding two inconsistent thoughts or beliefs; it's like an itch that needs to be scratched. Aronson says in the interview that resolving the dissonance is a drive like thirst or hunger. The book describes the lengths we will go to in order to achieve consonance — lengths which can be mind-boggling, laughable, or dangerous.
For example, let's say that you consider yourself a bright and savvy person and you do something, well, a bit dumb. You now have yourself some dissonance. How do you scratch it? The chances of your modifying your self-concept are low. You must do something with this incident of bungling. Ah, hah! You can revise your opinion of it! It was not so dumb after all. In fact, of all the decisions you have made, this may have been one of the wisest. Don't laugh. We all resolve dissonance, and our methods may be just as slippery.
Because the search for consonance is not at a conscious level, it presents many problems described in the book. Perhaps we may not learn from our mistakes. Or we may take the first step on the pyramid of choice which can seductively lead us to do things that are inconsistent with our own or society's values.
In explaining the pyramid of choice, the authors use the example of two students taking an exam to get into grad school. Each of them has the same attitude toward cheating: it's not good, but there are worse evils. They reach an exam question for which they have forgotten the answer. Each has the opportunity to cheat by reading another person's answers. One cheats and the other does not, but for each the decision was a close one. One has given up his integrity and the other has forfeited a good grade. In addition, one has gained a good grade and the other has maintained his integrity.
Now the question is: How do they feel about cheating a week later? Each student has had ample time to justify the course of action he took. The one who yielded to temptation will decide that cheating is not so great a crime. He will say to himself: "Hey, everyone cheats. It's no big deal. And I really needed to do this for my future career." But the one who resisted the temptation will decide that cheating is far more immoral than he originally thought: "In fact, people who cheat should be permanently expelled from school. We have to make an example of them."
By the time the students are through with their increasingly intense levels of self-justification, two things have happened: One, they are now very far apart from one another; and two, they have internalized their beliefs and are convinced that they have always felt that way. It is as if they started off at the top of a pyramid, a millimeter apart; but by the time they have finished justifying their individual actions, they have slid to the bottom and now stand at opposite corners of its base. The one who didn't cheat considers the other to be totally immoral, and the one who cheated thinks the other is puritanical. This process illustrates how people who have been sorely tempted, battled temptation, and almost given in to it — but resisted at the eleventh hour — come to dislike, even despise, those who did not succeed in the same effort. It's the people who almost decide to live in glass houses who throw the first stones.
The metaphor of the pyramid applies to most important decisions involving moral choices or life options.
The authors provide many examples of both the pyramid and of cognitive dissonance. One that may be of particular interest to idealawg readers: DNA evidence exonerates a person languishing in prison, but the prosecuting attorney will not reopen the case. In many instances, Aronson thinks the cause for not reopening is cognitive dissonance. It would be unbearable for people who think of themselves as good to be responsible for a wrongful imprisonment, so they may deny the veracity of the DNA evidence.
One way to make it less likely that we are governed by cognitive dissonance is simply to become aware of the phenomenon and how it operates slyly and silently. Reading the book will certainly raise your awareness and thus will likely improve your ability to make wise decisions. Have you read it yet? May I suggest that you do?
Hat tip to PsyBlog for the NPR interview alert.
Note (added July 30, 2007, 10:29 AM Mountain): A related post at lifehack.org: Success Tips: Why you should broaden your patterns of thought. Excerpt . . .
People’s thought patterns focus naturally on the areas where they have learned most, gathered most experience, and feel most at home. They act like blinders or mental filters, presenting you with a neat picture of the world, tuned to your biases and assumptions. What you see as the truth is only what they let you see. What you do as a result may therefore be seriously flawed, as well as limited.
Note (added September 11, 2007, 10:45 AM Mountain): Review of Mistakes Were Made in American Scientist. Tip of the hat to Deric Bownds' Mindblog.
Sometimes it's cognitive dissonance and denial at work - at other times, as Freud would say, an innocent person is just an innocent person.
Posted by: ArLyne Diamond | September 12, 2007 at 11:41 PM