Author: **Daniel Kahneman**
Reading time: 33 minutes
Synopsis
Thinking, Fast and Slow (2012) looks back at the scientific work of psychologist Daniel Kahneman. His research has greatly changed how we understand the human brain. Because of his work, we now better understand why we often make mistakes. We also learn what to think about when making choices.
What you will learn: how to better understand human behavior.
“The publishing of this book is a major event.” This sentence comes from the famous Harvard psychologist Steven Pinker. The book he was talking about is Thinking, Fast and Slow by Daniel Kahneman. It became a bestseller right after it was published in 2011.
And the excitement was deserved.
The book shared a surprising idea: Kahneman says our thinking is not all the same. For a long time, people thought most thinking was conscious and logical. But this is not true. Instead, we often think by instinct. But this way of thinking can easily lead to mistakes. It makes us make wrong choices and judgments. Luckily, there’s a way to avoid these problems. You will learn about it in these summaries.
You will learn:
- why you are more creative when you are in a good mood,
- how lazy your brain can be, and
- why you sometimes make bad decisions.
Blink 1 – Fast and Slow Thinking
Let’s start with a small puzzle. Try to solve this problem: A bat and a ball together cost 1.10 Euros. The bat costs one Euro more than the ball. How much does the ball cost?
What answer first comes to your mind?
Many people answer “10 cents.” If you also gave this answer, we must tell you it’s wrong: if the ball cost 10 cents, the bat would cost 1.10 Euros, and together they would cost 1.20 Euros. If you take the time to calculate, you will find the right answer: The bat costs 1.05 Euros, and the ball costs 5 cents.
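For readers who like to see System 2 at work, the arithmetic can be checked in a few lines of Python. This snippet is our own illustration, not from the book:

```python
# Bat-and-ball problem: bat + ball = 1.10 and bat = ball + 1.00.
# Substituting gives: ball + (ball + 1.00) = 1.10, so ball = 0.10 / 2.
ball = (1.10 - 1.00) / 2
bat = ball + 1.00

assert abs(bat + ball - 1.10) < 1e-9  # total really is 1.10 euros
assert abs(bat - ball - 1.00) < 1e-9  # bat costs exactly 1 euro more

# The intuitive answer fails the second condition:
intuitive_ball, intuitive_bat = 0.10, 1.00
assert abs(intuitive_bat - intuitive_ball - 0.90) < 1e-9  # only 90 cents more
```

The asserts are exactly the kind of slow, deliberate check that System 2 performs when it is engaged.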
In this thinking process, you just learned about your brain’s two systems: The first answer, “10 cents,” came from your intuitive and automatic System 1. Only when you thought more carefully, you used your System 2. System 2 is analytical, conscious, and logical. How these two systems work together shapes how we think. It affects our judgments, decisions, and actions. Let’s look at both systems more closely.
System 1 is the automatic system. It works fast and often without us trying. For example, if you hear a loud noise suddenly, System 1 takes over. This makes you immediately and automatically focus on where the sound came from.
System 1 comes from evolution. In the past, it helped us survive. It let us make quick judgments and act fast, almost like an autopilot. For example, if our ancestors heard a loud roar, they had to react very quickly and run from a saber-toothed tiger. There was no time to think.
But this fast reaction speed of System 1 also has downsides. It often makes us believe there are causes and effects where there are none. Its strength is feelings, not statistics.
System 2 is very different. We use this system when we think deeply and focus our attention on something. It means self-control and carefully made decisions. If you want to find a certain person in a crowd, you can rely on System 2. It blocks out distractions and helps you focus on specific features.
And this back-and-forth between intuitive System 1 and logical System 2 creates human behavior.
You will learn how this interaction works and what can go wrong in the next summaries. By the way, these two systems are not found in a certain part of the brain. Kahneman and other psychologists use these terms to simply explain how thinking processes work.
Blink 2 – Imagine your brain as a sloth
Let’s look more closely: What exactly happens when people answer the bat-and-ball problem with “10 cents” – which is wrong? It’s simple: The impulsive System 1 took control and gave a quick answer. Usually, System 1 asks System 2 for help when a task is too hard. But the bat-and-ball problem tricked System 1 because it seemed very easy at first. So, System 1 thought it could solve the problem quickly by itself.
Why did this happen? It’s because our body tries to use as little energy as possible. This idea, which applies to both physical and mental tasks, is called the Law of Least Effort. Checking the answer with System 2 would use more energy. So, our brain saves effort if it thinks System 1 can solve the question alone. Sometimes this is good. But for the bat-and-ball problem, it was the wrong way. If System 2 had worked, the mistake would not have happened.
But you can also control your brain’s energy use on purpose. If you deal with information that is presented in a complex way, System 2 starts working. It’s almost strange: If the bat-and-ball problem were shown in hard-to-read letters, more people would give the correct answer. Then the brain leaves the creative, happy, and intuitive state of cognitive ease. It realizes it needs to work harder.
So, if you want to convince other people of something, use easy-to-read fonts that stand out from the background. Speak or write clearly. Then the receiver won’t have to work hard to understand the message. They can stay in a state of cognitive ease. They won’t even think about using System 2 to question you. But if you want the receiver to think deeply about your message, then write a bit unclearly on crumpled paper.
So, we learn: System 2 gets active when faced with challenges.
Our brain is simply very lazy. No wonder it has found ways to avoid hard thinking. In the next summaries, we will explain when System 1 reaches its limits with these strategies.
Blink 3 – How our actions are secretly influenced
What do you think of when you see the word fragment “SO_P”? Probably nothing at first. But what happens if you first look at the verb “EAT”? If you now look at “SO_P” again, you will probably read it as “SOUP.” This process is called priming.
We are ‘primed’ when seeing a word, idea, or event makes us think of related words and ideas. If you had read the word “SHOWER” instead of “EAT,” you would probably have completed the letters to “SOAP.”
Such priming processes don’t just affect our thoughts. They also affect our actions and behavior. Even the body can be affected by priming. A good example is a study where people were primed with words like “FLORIDA” (a retirement state in the USA) and “WRINKLE.” After this, these people actually moved slower than they usually did.
Priming processes affect us without us knowing. They show that we don’t have full control over our actions, judgments, and decisions. In fact, social and cultural factors constantly prime us. And where does this priming happen? Right, in System 1. That’s why we cannot control it. This is dangerous because System 2 still ‘thinks’ decisions are based on its logical thoughts. It doesn’t realize they are due to System 1’s priming.
So, social priming can affect a person’s thoughts. This also changes their decisions, judgments, and behavior.
And this affects how we live together. For example, marketing researcher Kathleen Vohs showed that the idea of “MONEY” primes certain actions. People who see pictures of banknotes act more selfishly. They care less about others. This should make us think.
Blink 4 – (Time to take a breath)
Quote from the book: “Most impressions and thoughts come into our mind without us knowing how they got there.”
Blink 5 – Making Our Own World
Imagine you meet Ben at a party. You have a very nice talk with him. A few days later, someone asks if you know anyone who would donate to a good cause. Suddenly, you think of Ben. And this is even though you only know he is a great person to talk to.
We often form opinions about people even when we know little about them. Here too, our brain tends to simplify things. It makes judgments without enough information. This leads to wrong ideas. This is called exaggerated emotional coherence or the halo effect: You think Ben is a “good person,” but you really know too little about him.
System 1 wants to make the world simpler than it is. Ben is nice? Then he must also be smart, generous, and have other good qualities. This also works if you first learn a negative quality about a person. You will then see the whole person in that light.
But this is not the only shortcut the brain takes to make quick judgments. There is also the confirmation bias. This means people tend to agree with suggestions and earlier beliefs. This happens, for example, when we ask a leading question. A study showed that people tend to imagine someone as friendly if asked, “Is James friendly?” without any more information. The reason: The brain automatically confirms such suggested ideas.
In general: When the brain lacks information, it judges too quickly based on what it already knows.
The halo effect and confirmation bias both happen because the brain likes to make quick judgments. These mental effects, like priming, happen without us knowing. They influence our choices, judgments, and actions. If you want to avoid wrong judgments: Look carefully and ask yourself if you really have enough information to answer a question.
Blink 6 – Thanks for the question – I’d like to answer a different one
Our brain loves shortcuts. You have already learned about some of them. Now let’s look at another shortcut: substitution. This is a heuristic – a mental rule of thumb for solving problems. It is related to the halo effect. Substitution happens when we face a question that we can’t really answer with what we know. To still give an answer, we replace the original question with one that is easier for us to answer.
Here’s an example: “This man is running for Foreign Minister. How successful will he be in office?” If we don’t feel skilled enough to answer, we automatically replace it with an easier question, like: “Does this man look like he could do a good job as Foreign Minister?”
Instead of learning about the candidate’s background and policies, this shortcut leads us to a much easier task. We simply compare the candidate’s appearance to our mental image of a good Foreign Minister. If the man doesn’t fit this image, we dislike him. This happens even if he has decades of experience in international diplomacy and is clearly the best candidate. So, our answer is often intuitive – that’s System 1 at work. System 2 could check it, but it’s too lazy.
Let’s look at another shortcut: the availability heuristic. With this shortcut, we overestimate how likely events are. This happens when we hear about them often or when they leave a strong impression on us.
Here’s an example: Heart attacks cause more deaths than accidents do. Still, in one study, 80 percent of people said accidents were the more likely cause of death. Why is this? Well, the media report much more often about deadly accidents than about heart attacks. Also, the terrible images of a crash leave a stronger impression on us. And so, we wrongly believe that deadly accidents happen very often.
In short: Our brain uses shortcuts, called heuristics, when making judgments and estimates.
Especially when emotions are involved, System 2 believes System 1 too easily. It looks for information that matches what it already knows. So, test yourself when you need to answer difficult questions or make statistical estimates: Are you perhaps making it too easy for yourself? You will also get some help for a better understanding of statistics in the next summary.
Blink 7 – Statistics are not natural for us
One thing is sure: We cannot rely on half-knowledge that we picked up somewhere. But how can we still make good predictions in daily life?
An effective way is to use the base rate. This is the basic statistic on which other statistics are built. Imagine 20 percent of a taxi company’s cars are yellow, and 80 percent are red. If you order a taxi and want to guess its color, the base rate helps you make a good prediction. Statistically, two out of ten taxis are yellow, and eight out of ten are red. So far, so good.
The problem is that we often forget about the base rate when making predictions. Why? Because we focus more on our feelings and expectations than on how likely something is to happen. Let’s go back to our taxi example. If you see five yellow taxis drive into the company’s parking lot one after another, an inner voice might tell you: “Another yellow one is surely coming next.” But of course, based on the base rate, it’s much more likely that the next taxi will be red.
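The taxi example can be sketched as a small simulation, under the assumption (ours, not the book’s) that each arriving taxi is an independent draw from the fleet. Even right after a run of yellow taxis, the next one is still red about 80 percent of the time:

```python
import random

random.seed(0)

# Fleet base rates from the example: 20% yellow, 80% red.
# Each arriving taxi is an independent draw.
taxis = ["yellow" if random.random() < 0.2 else "red" for _ in range(500_000)]

# Look at every taxi that arrives right after a run of three yellow taxis
# (the same logic applies to a run of five).
after_streak = [
    taxis[i + 3]
    for i in range(len(taxis) - 3)
    if taxis[i:i + 3] == ["yellow"] * 3
]

red_share = after_streak.count("red") / len(after_streak)
# The streak does not change the odds: red still shows up ~80% of the time.
assert abs(red_share - 0.8) < 0.05
```

The inner voice predicting “another yellow one” is System 1 seeing a pattern where there is only the base rate.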
Another common statistical mistake is that we forget about regression to the mean: extreme results tend to be followed by results closer to the average. For example, if a football team’s striker scores five goals a month on average but scores ten goals in September, her coach will be happy. If she then scores only one goal in October, he will criticize her game. And that’s wrong! The exceptional September was largely luck, and October simply moved back toward her average – taken together, the two months still put her above her usual performance.
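A small simulation illustrates the point. The scoring model is our own assumption, purely for illustration: if monthly goal counts fluctuate randomly around a true average of five, then the months that follow an exceptional ten-goal month still average about five:

```python
import random

random.seed(1)

# Illustrative model (not from the book): 20 scoring chances per month,
# each converted with probability 0.25, for a true average of 5 goals.
def month_goals():
    return sum(random.random() < 0.25 for _ in range(20))

# Collect the month that follows every unusually good month (>= 10 goals).
following_months = []
for _ in range(100_000):
    this_month, next_month = month_goals(), month_goals()
    if this_month >= 10:
        following_months.append(next_month)

avg_after_hot_streak = sum(following_months) / len(following_months)
# A hot streak is mostly luck, so the next month lands back near 5.
assert abs(avg_after_hot_streak - 5) < 0.3
```

The October dip is not a slump to criticize; it is what randomness looks like after an outlier.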
So, the next time you need to make an estimate, use the base rate and the average.
And most importantly: Take the time you need. People who rely only on their gut feeling are simply not good at statistics.
Blink 8 – The way it’s phrased matters
People are not objective judgment machines. This is very clear when we talk about assessing risks. Let’s say an event happens with a certain probability. Then our view changes depending on how the situation is described.
Let’s be specific and look at the Mr. Jones experiment. Psychiatric staff at a clinic were split into two groups. Both groups were asked the same question: “Do you think it is safe to discharge patient Mr. Jones?”
- The first group was also told: “Mr. Jones has a ten percent chance of committing a violent act.”
- The second group, however, received the following information: “Out of one hundred patients like Mr. Jones, ten are expected to become violent.”
What do you think – how did the groups decide when this experiment was done?
Only 21 percent of the first group refused to discharge him, but 41 percent of the second group did. From this, we learn that people judge an event as more likely when its chance is given as a concrete frequency (“ten out of one hundred”). That feels more real than an abstract statistical probability.
A related effect is denominator neglect. This happens when vivid ideas guide us. Compare these two statements:
- “This medicine protects children from disease X. But there’s a 0.001 percent chance it will cause them to be disfigured.”
- “One child out of 100,000 will be permanently disfigured after taking this medicine.”
Even though both sentences say the same thing, the second one creates a mental image of a disfigured child. This makes it sound more frightening.
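Both pairs of statements can be checked numerically. A small Python sketch (our illustration, not from the book) confirms that each pair describes exactly the same probability:

```python
# The two phrasings of the medicine risk are numerically identical.
as_percentage = 0.001 / 100   # "a 0.001 percent chance"
as_frequency = 1 / 100_000    # "one child out of 100,000"
assert abs(as_percentage - as_frequency) < 1e-10

# The same holds for the Mr. Jones experiment:
jones_probability = 0.10      # "a ten percent chance"
jones_frequency = 10 / 100    # "ten out of one hundred patients"
assert jones_probability == jones_frequency
```

The numbers are equal; only the framing differs – and that is what drives the different judgments.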
So, let’s remember: How we judge risk strongly depends on how the chance of something happening is presented.
In the next summary, we will look at another source of error that affects our judgment: our memory.
Blink 9 – The effect of memorable moments
Have you ever been in a relationship that ended with a loud break-up? If yes, do you also feel, when you think about it, that you were never really happy anyway?
Beware! Your memory might be playing tricks on you. Your brain does not remember your experiences very accurately. You could say you have an experiencing self and a remembering self:
- The experiencing self feels what you are feeling right now. It asks: “How does it feel right now?”.
- The remembering self, however, looks back at the whole event later. It asks: “How was the event overall?”.
The experiencing self observes what happens much more precisely. But we cannot feel emotions later with the same accuracy as when they happen. The remembering self is therefore much less accurate. Yet, it controls our memory.
To be exact, two features of our remembering self distort what we experienced. First, there is ‘duration neglect’: our memory largely ignores how long an experience lasted. Second, there is the peak-end rule. It describes how our remembering self gives too much weight to the most intense moment (the peak) and the end of an event.
These two factors can be seen clearly in a scientific study. It involved two groups of people who had to remember a colonoscopy. Patients in the first group had a long and very painful procedure. But the pain lessened towards the end. The procedure for the second group was much shorter. But it ended with very great pain.
You might think the patients in the first group went through worse. They had to bear pain for a longer time overall. And in fact: When patients reported their pain every minute during the process, their experiencing self judged it as stronger than the comparison group. Later, however, when the remembering self replaced the experiencing self, the first group’s procedure didn’t seem so bad because of its relatively painless end. Patients after the shorter procedure rated the pain experience as more intense than those in the first group.
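These findings are often summarized with a simple “peak-end average” model of remembered pain. The sketch below uses made-up minute-by-minute pain scores purely for illustration:

```python
# Minute-by-minute pain scores (0-10); the numbers are invented for
# illustration, not taken from the actual study.
long_procedure = [4, 6, 8, 8, 7, 5, 3, 1]  # long, but ends mildly
short_procedure = [4, 6, 8, 8]             # short, but ends at the peak

def experienced_total(pain):
    """The experiencing self: all the pain, duration included."""
    return sum(pain)

def remembered_pain(pain):
    """Peak-end model of the remembering self: average of the worst
    moment and the final moment; duration is ignored entirely."""
    return (max(pain) + pain[-1]) / 2

# The longer procedure involves more total pain...
assert experienced_total(long_procedure) > experienced_total(short_procedure)
# ...but is remembered as LESS painful, because it ended gently.
assert remembered_pain(long_procedure) < remembered_pain(short_procedure)
```

Duration neglect and the peak-end rule are both visible here: eight minutes of pain can be remembered more kindly than four, if the last minute was easy.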
And now back to Kahneman’s systems, specifically to the intuitive System 1. This system manipulates our memory by focusing on typical moments. These are moments that stand for a longer experience in our memory.
So, this means: Our memory plays tricks on us. Our memories only show a part of our experiences.
So, maybe that broken relationship wasn’t as bad as it seems today.
Blink 10 – We don’t always choose what gives us the most benefit
You just learned that our memories can trick us. This not only affects how we judge the past. It also influences our future decisions.
To understand how striking this finding is, let’s look at the history of science. Many economists used to believe in the utility theory. This means they thought our decisions were mostly based on logical reasons. They believed people looked at all the facts. Then they would choose the option that led to the best overall result for them, meaning the most benefit.
Especially those from the Chicago School of Economics, like Milton Friedman, said that people in the market act logically. They called these people Econs. These people analyze all goods and services based on their logical needs. In other words: Econs do not have System 1!
But in reality, things are very different. Let’s take the choice between two cars. One has an efficient engine and good safety features. The other looks great. But all car magazines say it will cause problems over time because of manufacturing issues. According to utility theory, we should rate the first car’s value higher than the second’s, based on its true worth.
But our brain uses shortcuts that allow quick judgments. This often leads to wrong decisions. For example, it can be fooled by looks and lets us choose the second car.
In summary, economists have underestimated how often people act illogically.
So, if the Chicago utility theory is wrong, and people are not Econs, how can we explain and describe our behavior? The last summary is about this.
Blink 11 – (Time to take a breath)
Quote from the book: “The decisions people make for themselves can certainly be called mistakes.”
Blink 12 – Losing hurts
Daniel Kahneman developed an alternative to utility theory. It is called prospect theory. This theory includes the illogical side of our decisions. Let’s look at two situations to understand this:
- In the first situation, you get 1000 Euros. Now you must decide: Either you get another 500 Euros cash, so you have 1500 Euros. Or you get a 50-50 chance for another 1000 Euros. This means you will end up with either 2000 Euros in total or stay at 1000 Euros.
- In the second situation, you get 2000 Euros. Here, you also have to choose again. Either you accept a loss of 500 Euros and stop, so you go home with 1500 Euros. Or you get a 50-50 chance to lose 1000 Euros. This means you will end up with either 2000 Euros or 1000 Euros.
Let’s summarize again: In both situations, you have two options. The safe option is that you get 1500 Euros. The risky option is that you get either 1000 or 2000 Euros.
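The equivalence of the two situations is easy to verify, for example in a few lines of Python (our illustration, not from the book):

```python
# Final amounts (in euros) for both situations described above.
situation_1 = {
    "safe": 1000 + 500,             # take the extra 500
    "gamble": {1000 + 1000, 1000},  # 50-50: win another 1000, or stay
}
situation_2 = {
    "safe": 2000 - 500,             # accept the 500 loss
    "gamble": {2000 - 1000, 2000},  # 50-50: lose 1000, or stay
}

# Objectively, the choices are identical in both situations...
assert situation_1 == situation_2

# ...and the gamble's expected value equals the safe option:
expected_gamble = 0.5 * 2000 + 0.5 * 1000
assert expected_gamble == situation_1["safe"] == 1500
```

A purely logical decision maker would therefore be indifferent between the situations, and between the options themselves.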
If our decisions were always purely logical, we would choose the same for both situations. But this is not the case! In the first situation – where money is added – most people choose the safe bet. In the second situation – where money is taken away – most people choose the risky option. But why?
Kahneman’s prospect theory helps explain this behavior. It names three reasons for such illogical decisions:
- We base our judgments on reference points. If you put your toes into 15-degree water in summer, the water feels cold. If you do the same in winter when the air temperature is below freezing, the water feels quite warm. The same applies to the situations before: In the first situation, you start with 1000 Euros. In the second, you start with 2000 Euros. So you have different reference points. If you start with 1000 Euros, 1500 Euros feels like a gain. If you start with 2000 Euros, 1500 Euros feels like a loss.
- Our perception is affected by the principle of diminishing sensitivity. This means the perceived value is not always the same as the real value. If you lose ten Euros and still have 100 left, it doesn’t hurt as much as if you had zero Euros after the loss. It’s similar to the example before: The felt value of losing money seems higher if we drop from 1500 to 1000 Euros than if we drop from 2000 to 1500.
- We have loss aversion. The human mind fears losses more than it values gains of the same size.
These three reasons show why prospect theory explains people’s often illogical behavior better than utility theory does.
These research findings suggest that people often make illogical decisions. Kahneman therefore argues that the state should sometimes protect people from their own bad choices. And this often happens already. Helmet laws, for example, take the decision between letting your hair blow in the wind and staying safe out of people’s hands – and decide in favor of safety.
Conclusion
Our thinking is guided by two systems. The first system works automatically, intuitively, and uses little energy. The second system works on a conscious level and requires much more mental effort from us. To save energy, our brain often uses the first system and its shortcuts. But these shortcuts can mislead us. We make mistakes because we trust the quick answers of the intuitive system too much.
So, the next time you have to make a very important or complex decision, remember this: If you only rely on your gut feeling, you might miss important details. So, take the time to question your own bias carefully. Look at the facts and activate your System 2. In short: Use logical thinking against emotions.
We would love to hear what you think of our summaries.
Just send an email to [email protected] with Thinking, Fast and Slow as the subject. Share your thoughts with us.
To read more: Predictably Irrational by Dan Ariely
Why do we decide to lose weight, but then can’t resist sweets? Why does a mother get offended if her son offers to pay her for cooking? Why do painkillers work better when patients think they are expensive? Predictably Irrational (2008) shows why much of our behavior is illogical – and what we can do about it.
Source: https://www.blinkist.com/de/books/schnelles-denken-langsames-denken-de