A few months ago I read a most intriguing, intellectually challenging book that provides incredible insight into how people think. Written by Daniel Kahneman, a psychologist who won the 2002 Nobel Prize in Economics, Thinking, Fast and Slow is an amazingly informative read if you’re remotely interested in how people think and make decisions, and why. Based on Kahneman’s several decades of professional research, the book is also a demanding read that will make you painfully aware of just how lazy your System 2 can be. So what does this all have to do with my insulting imperative in the title? Did you catch the part about him being a psychologist who won the Nobel in Economics? In part it’s because Kahneman’s research stood all of modern accepted economic theory on its ear. See, economics is based on an assumption that humans are rational decision-makers. Kahneman shows that assumption is glaringly false. You are irrational (and so am I). Worse than that, we like it that way.
So what, you say? Well, for starters, your irrationality causes you to make some really poor decisions. I won’t try to summarize a 498-page book on how people think, but one concept Kahneman and other students of decision-making have explored is extremely relevant today. I’ve been struggling for quite some time to understand why large numbers of extremely intelligent people whom I know (and many more whom I don’t) can be so easily deceived by sensational, powerfully emotive information, statistics, and sound bites that even minimal research would show to be manipulatively misleading, if not totally inaccurate. Kahneman, and those who have built upon his research, have given a name and supporting research to a behavioral bias I had observed on my own: our human tendency to look for evidence that supports what we already believe, and to avoid or discount any evidence that contradicts our beliefs. Referred to in decision-making as the supporting evidence bias, it has variants in scientific research as well (the systematic positive bias).
Two (potentially) big problems here:
1) WYSIATI: Kahneman identifies a phenomenon in thinking that he refers to as “What You See Is All There Is.” When operating in our “fast-thinking” mode (System 1), we tend to make snap, intuitive decisions based on the information readily at hand, as if that’s all the information that exists. Unfortunately, it’s those things that we don’t know about which can often cause us the most harm (as former SecDef Donald Rumsfeld so famously and accurately opined).
Example: A school teacher living in the Midwest earns $30,000 per year. In an effort to improve her standard of living, she seeks teaching jobs in other parts of the country, and takes a job teaching in Bush Alaska paying $60,000 per year. While expecting to double her income and her standard of living, she fails to take into account that the cost of basic necessities in Bush Alaska can be double to quadruple the cost of the same items in the Midwest. By failing to consider the unknowns, and failing to seek evidence to disprove her assumptions, she inadvertently lowered her standard of living.
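The arithmetic behind the teacher's mistake is worth making explicit. Here's a minimal back-of-the-envelope sketch in Python; the salary figures come from the example above, but the cost index of 3.0 (the midpoint of "double to quadruple") and the variable names are my own illustrative assumptions:

```python
# Hypothetical purchasing-power check of the teacher's move.
# Cost indices are illustrative assumptions, not real cost-of-living data.
midwest_salary = 30_000
alaska_salary = 60_000

midwest_cost_index = 1.0   # baseline cost of basic necessities
alaska_cost_index = 3.0    # necessities run 2x to 4x; take the midpoint

# Salary divided by cost index approximates real standard of living.
real_midwest = midwest_salary / midwest_cost_index   # 30000.0
real_alaska = alaska_salary / alaska_cost_index      # 20000.0

print(real_midwest, real_alaska)
```

Doubling the salary while tripling the cost of necessities leaves her with roughly two-thirds of her original purchasing power, which is exactly the unknown that WYSIATI hid from her.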
2) Failing to seek contrary information can lead to inaccurate results: A more clinical example here.
“Students were given the sequence of numbers 2, 4, 6 and told to determine the rule that generated the numbers. To check hypotheses, they could choose a possible next number and ask whether that number was consistent with the rule. Most students asked whether a next number “8” would be consistent with the rule. When told it was, they expressed confidence that the rule was, “The numbers increase by 2.” Actually, the rule was, “Any increasing sequence.” A better test would have been to check whether a next number incompatible with the hypothesis (e.g., “7”) was consistent with the unknown rule.”
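The experiment above can be sketched in a few lines of Python. The function names and the specific test sequences are mine, chosen for illustration; the two rules are taken directly from the quoted study:

```python
# The hidden rule the experimenter used, and the rule most students hypothesized.
def hidden_rule(seq):
    """Any strictly increasing sequence."""
    return all(a < b for a, b in zip(seq, seq[1:]))

def student_hypothesis(seq):
    """The numbers increase by 2."""
    return all(b - a == 2 for a, b in zip(seq, seq[1:]))

# A confirming test: consistent with BOTH rules, so it cannot tell them apart.
confirming = [2, 4, 6, 8]
print(hidden_rule(confirming), student_hypothesis(confirming))        # True True

# A disconfirming test (ending in 7): the hypothesis rejects it, yet the
# hidden rule accepts it, revealing that the hypothesis is too narrow.
disconfirming = [2, 4, 6, 7]
print(hidden_rule(disconfirming), student_hypothesis(disconfirming))  # True False
```

Only the disconfirming test produces a disagreement between the two rules, which is why probing with "7" teaches you something and probing with "8" does not.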
In other words, you’re irrational. But so am I, and Dr. Kahneman acknowledges that he is too. Now that you’re aware you have a problem, what are you going to do about it? Here’s what I have done:
1) Make it a habit when facing difficult, important, or costly decisions, to deliberately seek to disprove my preferred position. This is incredibly difficult to do, but it’s powerful. First, though, I have to acknowledge I have a bias, and make a concerted effort to set the bias aside while I attempt to prove the opposite.
2) Deliberately seek experiences that are outside my comfort and familiarity (within reason here…). Read things I disagree with. Listen to both Fox News and MSNBC. Better yet, seek primary sources to understand the full details of what was said or written, not just the selectively re-broadcast sound bites. Cultivate friendships with people who believe differently than me. Notice I said “friendships.” They’re my friends. I love and respect them. I get to know them for who they are, and I appreciate them. Then when I find myself in disagreement with them over an idea, I can more readily seek first to understand and appreciate their position, even if I still disagree. Sometimes, I even find out that I’m less than fully informed!
3) Assume that I don’t have all the information, and when I find that I am dumbfounded by the stupidity of others who can’t see the solution that is so right and obvious to me, recognize that I am almost assuredly operating under WYSIATI.
I’ll leave you with what is arguably my favorite quote from almost 500 pages of deep, intriguing thoughts: “We know that people can maintain an unshakable faith in any proposition, however absurd, when they are sustained by a community of like-minded believers” (Kahneman, 217).
Supporting evidence bias is inherently irrational, and it’s hurting our culture today. I want to think better.