The field of psychology acquired a solid experimental basis only after WWII, with the advent of randomized controlled experimentation (which, incidentally, also drove the rise of modern medicine). In the past couple of decades it has received a further boost from brain imaging technologies, which can objectively measure differences among experimental subjects.
As part of the social sciences, psychology has long been thought of as a "soft" science, and its conclusions therefore deemed suspect. There is nothing like a Nobel Prize to legitimize a science, and in 2002 a psychologist, Daniel Kahneman, won the Nobel Prize in Economic Sciences for developing (along with Amos Tversky) 'Prospect Theory', which I will summarize below. Prospect Theory became the foundation of a whole new sub-discipline, "Behavioral Economics", which is now one of the hottest fields in economics. The University of Chicago economist Richard Thaler (co-author of the fabulous book "Nudge") is credited as the father of the formal field of behavioral economics, but some of its founding elements were taken from Prospect Theory.
I've noticed that in the past several years there has been an explosion of popular books on this new strand of social psychology, all flowing from this tradition and sharing the same themes: books such as Blink, Switch, Nudge, Influence, The Power of Habit, Predictably Irrational, Unconscious, and dozens of others in the same genre. I have read many of these (and recommend the above highly), and I think each of them has very interesting insights, though many explicate some of the same studies to make their points. But I think there is "one ring to rule them all", so to speak, in Daniel Kahneman's 2011 book, "Thinking, Fast and Slow". It is a summary of more than 40 years of his main research findings, presented in a very clear, almost breezy way. It isn't that the insights of all the other books are in here; rather, Kahneman explains some fundamental aspects of our cognitive processes that make many of the findings in the other books logical extensions of what is discussed here.
Here is a link to Professor Kahneman giving an hour-long summary talk regarding his book to a Google brown bag audience:
Below is my summary of the 5 major insights I took away from the book:
Kahneman gives us a very helpful metaphor of two parts of our mind with which we think about things: System 1, which operates immediately and instinctively, and System 2, which judges and calculates. Kahneman makes clear that this is a metaphor; it is not that there are literally two Freudian-esque parts of our brain, but rather that we have two distinct ways of processing things, and thinking of them as separate yet complementary systems makes this much easier to conceptualize. System 1 operates unconsciously and responds rapidly: when we have an immediate reaction to something we experience, it performs an act of recognition, brings to mind associated concepts or images, and generates an initial evaluation of the object we are experiencing. We get all of this for "free", in the sense that it requires no conscious effort and we cannot turn it off. System 2 then engages, takes the outputs of System 1, and either accepts them, rejects them, or analyzes things further.
As an illustration, imagine that you are walking alone along an alley towards your car in bright daylight. As you pass a side passageway, you are startled to see someone there walking casually in your direction. Beyond the initial moment of surprise, your System 1 notices that the person is a well-groomed man wearing nice clothes, and you make an instant snap judgment (heavily conditioned by culture, of course) that you are probably safe from danger, so your physiological responses to the surprise taper down. That was System 1, which both registered the surprise and made the snap judgment. But then System 2 kicks in: you realize that it is not impossible for a well-dressed person to be a mugger, and that it's better not to take chances when you are alone, so you decide to quicken your pace to put distance between yourself and the stranger.
The two systems worked together: System 1 was immediate and effortless and provided some initial conclusions, which were then reviewed by System 2, which brings more rational faculties to the situation. Now how do we know our brains work using these two "systems"? Because System 2, unlike System 1, requires effort, and its effectiveness can be diminished in ways that are easy to manipulate in experiments. System 2 can be impaired by distracting mental calculations, fatigue, alcohol, and prior exertions of willpower. Subjects who are faced with a test of willpower (e.g. resisting an unhealthy snack) do worse when they are asked to perform challenging mental calculations at the same time, are deprived of sleep, or are under the influence of alcohol. And it is not just willpower that is depleted in these conditions, as Kahneman notes:
“People who are cognitively busy are also more likely to make selfish choices, use sexist language, and make superficial judgments in social situations.” (p.41).
These findings are pretty striking, and they have been reliably replicated in the laboratory. Upon reflection, I think many of us can see some of these effects in our own lives, as when we attribute grumpiness or rudeness to not having eaten all day or to having had little sleep over the past few days.
System 1 can be likened to a naïve child: it triggers associations, can be anchored by arbitrary information, and takes things it encounters at face value. System 1 is a bit of a simpleton. When it encounters something, associated ideas are conjured to mind (think of the web of vague and specific things that come to mind when, in a conversation, someone unexpectedly mentions the name of your childhood hometown). System 1 can also be swayed by arbitrary things it is exposed to. For example, in one study, people were asked to think about the last few digits of their phone number and then to estimate some unrelated quantity, such as the height of an average redwood tree. Those with lower phone numbers guessed dramatically lower heights than those with higher numbers, even though there could be no rational connection between the two. The mere fact of exposing System 1 to the phone number influenced what it would guess for the unrelated question.
System 1 is also credulous: its first instinct is to believe whatever information is presented to it, and to take things literally. This seems to explain, at some level, why very crude and direct advertising tends to work, such as associating a brand of beer with a carefree life surrounded by attractive models, or all the claims to be the "best", "leading", or "#1" product, even when based on obviously arbitrary criteria. It may even explain why clever commercials that make us laugh reflect well on the products involved: our minds associate our enjoyment with the brand.
Let's not be completely uncharitable to System 1, however. It is this reflex-like quality, the instant associations and snap judgments System 1 brings to the table, that allows us to go about the vast majority of our day and interact with the world around us efficiently, and in a way that is usually correct and useful. We would be paralyzed if we had to rationally think through, from scratch, the implications of everything we are exposed to; we would continually be on tape-delay with respect to the world around us. It is in cases where we don't correct System 1, or don't correct it properly, that we can get into trouble and fall into biases and misjudgments.
We are inherently pattern-seeking creatures: we jump to conclusions when we look for causes, and we let the plausibility of the stories we construct overly influence our judgment. Even with the faintest initial information, we immediately start to form stories. For example, suppose I told you of a friend who got into an argument with his spouse one morning and got into a small fender bender while driving later that day. Our minds naturally see a connection between the two, with the former contributing to the cause of the latter. Now suppose I told you one of two additional facts: (a) my friend can have a fiery temper at times, or (b) my friend is prone to depression at times. Notice how easy it would be to incorporate either one: either my friend was angered by the argument and was driving recklessly, or my friend was depressed by it and was driving carelessly. In either case, we naturally jump to conclusions about causality based on the facts we have at hand. This tendency is likely innate in us: research on infant perception shows that six-month-olds detect cause and effect; when shown videos of a simple shape colliding with another shape, they express surprise when the second shape does not 'bounce' away from the collision.
In one experiment, Kahneman and his collaborator Amos Tversky gave a famous demonstration of the Representativeness Heuristic, a cognitive shortcut in which we let how well a scenario matches our intuitive expectations overly influence our assessment of how likely that scenario actually is. Subjects were given a description of a hypothetical person, "Linda", a bright young woman who majored in liberal arts and was concerned with issues of social justice. Subjects were then given 8 different statements about Linda's current occupation and asked how likely each one was. Two of the statements were used to test whether people would fall into a logical fallacy, and, as has been replicated in numerous studies, most people fall into this trap. The statements are:
– ‘Linda is a bank teller and is active in the feminist movement’
– ‘Linda is a bank teller’
Most people rate the first statement as much more likely than the second because it seems to better capture the essence of the type of person we think Linda is, based on the details given. However, when you look at the two statements side by side, it is obvious that the first describes a logical subset of the second (there are more bank tellers than bank tellers who are active in the feminist movement), and so it could not possibly be more likely than the second statement.
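Put in probability terms, this is the conjunction rule: the probability of two things being true together can never exceed the probability of either one alone. Writing "teller" and "feminist" as shorthand for the two statements:

```latex
\[
P(\text{teller} \wedge \text{feminist})
  \;=\; P(\text{teller}) \cdot P(\text{feminist} \mid \text{teller})
  \;\le\; P(\text{teller})
\]
```

since a conditional probability can never exceed 1, no matter how representative the fuller description feels.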
I think we are broadly aware that these types of biases exist, as they are being talked about more and more, in the books mentioned above and elsewhere. Many people are familiar with the 'confirmation bias', where we tend to interpret new information in a way that confirms our initial hypothesis. In fact, we often see it in operation in other people, such as when conspiracy theorists interpret any new fact as reinforcing their theories ("that's what they want you to believe…"), or when we all acknowledge that it is important to "make a good first impression". We know that, while it isn't always fair, a first impression will tend to color subsequent experiences (and may also determine whether those subsequent experiences occur at all). If we try to be aware of these types of biases, we are more likely to see them in operation, but the truly challenging thing is to identify these biases in oneself; that is a very difficult thing to do. Kahneman even says of himself, after 40+ years studying heuristics and biases:
“My intuitive thinking is just as prone to [heuristics and biases] as it was before I made a study of these issues. I have improved only in my ability to recognize situations in which errors are likely.” (p.417).
What can we do about this, other than trying to be cognizant of the areas where we may be prone to bias? Well, if you have the opportunity to ask for honest feedback from others, take advantage of it: listen sincerely, and ask yourself honestly whether any of the constructive feedback hits the mark. Having spent time in management consulting, I have been both the giver and the recipient of frequent feedback on work performance and propensities, and in my experience the vast majority of this feedback has been accurate, humbling, and helpful.
When making decisions we are usually risk-averse, but this depends heavily on how the decision is framed. It is now established common knowledge that most people are risk-averse: losses hurt more than gains please. If offered a bet on a coin flip, with heads = you get $120 and tails = you pay $100, most people wouldn't take the bet, even though the expected value is +$10. However, this isn't the whole story. It turns out that there are situations where we are more risk-taking: when we feel we are already in a loss zone. Let's say that you just finished eating at a fine restaurant and your bill came to $80. Then the restaurant manager offers to modify the bill on a coin flip, with heads = you pay $40 and tails = you pay $100. 'Wow, an equal chance of getting my bill cut in half or paying an extra 20 bucks? I'll take it,' you may say. And yet the expected value here is the same +$10 as in the first example above (50% x $40 + 50% x $100 = $70, which is $10 less than the original $80 bill).

We see this effect around us when, for example, people in trouble take big chances (think of the car chases caught on film: at what point does the criminal think, "It's better to take a chance with my life driving on the wrong side of the freeway than to get caught for sure and go to jail for 5-10 years…"?). I have also personally experienced some irrational risk-seeking when faced with losses. I noticed that when I play blackjack at a casino and start to get down in chips, my bets become bigger and bigger in the hope of winning back the large losses…thus the last 25% of my stack usually disappears in the last 2% of my play time. Also, on one occasion when a stock I held collapsed to only a couple of dollars, I despondently bought more of it, thinking, "Well, I've lost so much…but now, on the off-chance it does bounce back, I want to make sure it results in a big reward". Needless to say, that was good money thrown after bad. (Okay, with these two examples, I think I just disqualified myself from a career as a financial advisor.)
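For those who like the arithmetic spelled out, here is a minimal sketch of the two bets above; the dollar amounts are simply those from the examples, nothing else is assumed:

```python
# Bet 1 (gain frame): heads -> win $120, tails -> lose $100. Most people decline this.
ev_bet1 = 0.5 * 120 + 0.5 * (-100)      # = +10.0

# Bet 2 (loss frame): a sure bill of $80, or flip a coin: heads -> pay $40, tails -> pay $100.
expected_bill = 0.5 * 40 + 0.5 * 100    # = 70.0
ev_bet2 = 80 - expected_bill            # = +10.0 expected saving vs. the sure bill

print(ev_bet1, ev_bet2)                 # both +10, yet many of us choose differently
```

The expected values are identical; only whether we feel we are gaining or already losing changes.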
Here is Kahneman & Tversky’s famous example of framing effects:
Imagine that the United States is preparing for the outbreak of an unusual Asian disease, which is expected to kill 600 people. Two alternative programs to combat the disease have been proposed. Assume that the exact scientific estimates of the consequences of the programs are as follows:
(A) If Program A is adopted, 200 people will be saved
(B) If Program B is adopted, there is a one-third probability that 600 people will be saved and a two-thirds probability that no people will be saved
Make your choice between (A) and (B). Now imagine that instead you had to choose between these two programs:
(C) If Program C is adopted, 400 people will die
(D) If Program D is adopted, there is a one-third probability that nobody will die and a two-thirds probability that 600 people will die
Which of these two options, (C) or (D) would you choose?
Most people choose A in the first scenario and D in the second. But if you look closely, you will see that (A) is equivalent to (C) and (B) is equivalent to (D); they are just expressed as either gains or losses. Thus, most people are risk-averse when choosing (A), where the outcome is framed as lives to be gained, but risk-taking when choosing (D), where the outcome is framed as lives to be lost.
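Working through the numbers makes the equivalence concrete. Here is a quick sketch of the expected outcomes, framed as survivors out of the 600 (the figures are just those from the problem statement):

```python
total = 600

ev_A = 200                                    # 200 saved for certain
ev_B = (1/3) * 600 + (2/3) * 0                # saved on average = 200.0

ev_C = total - 400                            # 400 die for certain -> 200 survive
ev_D = (1/3) * total + (2/3) * (total - 600)  # survive on average = 200.0

print(ev_A, ev_B, ev_C, ev_D)                 # all four equal 200; only the framing differs
```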
Here is the main diagram illustrating Prospect Theory, which led to Kahneman's Nobel Prize and helped found the field of behavioral economics:
[Diagram: the Prospect Theory value function. Source: en.wikipedia.org]
The diagram is rather simple, but it makes three claims:
1. The curves flatten out as they extend to the right or left: the perceived incremental value of gains or losses diminishes the larger they become.
2. The curve above zero rises less steeply than the curve below zero falls: losses loom larger than gains.
3. The curves are centered not on $0 of total wealth but on a reference point set by how the situation is framed.
#1 implies diminishing returns, #2 implies risk-aversion, and #3 shows that reference points matter. In these simple terms, these things seem rather obvious, but now, thanks to Kahneman and others, we have abundant empirical evidence demonstrating these types of effects, which is leading to a much richer understanding of how our minds work.
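For readers who want the curve in symbols: the value function behind a diagram like this is commonly written in the piecewise power form sketched below. To be clear, this specific functional form and the parameter values are not discussed in this post; they are the widely cited estimates from Tversky and Kahneman's later (1992) work, used here only as an illustrative sketch:

```python
def prospect_value(x, alpha=0.88, beta=0.88, lam=2.25):
    """Illustrative prospect-theory value function over gains/losses.

    alpha and beta capture diminishing sensitivity (#1); lam > 1 makes
    losses loom larger than gains (#2); x is measured relative to the
    reference point, not total wealth (#3). Parameter values are the
    commonly cited Tversky & Kahneman (1992) estimates.
    """
    if x >= 0:
        return x ** alpha
    return -lam * ((-x) ** beta)

# A $100 loss "hurts" roughly 2.25 times as much as a $100 gain "pleases":
print(prospect_value(100), prospect_value(-100))   # ~57.5 vs ~-129.5
```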
A final quote from Kahneman:
"The attentive System 2 is who we think we are. System 2 articulates judgments and makes choices, but often endorses or rationalizes ideas and feelings that were generated by System 1" (p.415).
I think the metaphor of the two systems is a helpful way to think about how our minds work. Many of these and similar findings have been corroborated and expanded upon as the field learns more and more. Thanks to a slew of popular books, many of these insights are reaching the general public and passing into common knowledge. Modern cognitive science only got started in the middle of the 20th century, and as it progresses (with the aid of neuroscience, brain imaging technologies, etc.), we should see an explosion of knowledge about how our minds work.
(All quotes above cited with a page number are from Kahneman, Daniel, Thinking, Fast and Slow, Farrar, Straus and Giroux, New York, 2011).