Thinking, Fast and Slow was written by Daniel Kahneman, winner of the Nobel Prize in economics, to integrate economic science with the study of human behavior, judgment, and decision making. The book summarizes decades of Kahneman's research, much of it conducted with Amos Tversky. Kahneman was born in Tel Aviv in 1934 and grew up in France. During World War II, his father was captured in Paris by the Nazis; an intervention on his behalf secured his release, and he was one of the lucky few to escape the Holocaust. After the war ended in 1945, psychologists began searching for an explanation of how human beings could commit such ferocious violence. Kahneman himself had once encountered a Nazi soldier: the soldier stopped him, picked him up, showed him a picture of his little son, and gave him money. The officer did not know, of course, that the seven-year-old boy running in the street was Jewish. The encounter showed Kahneman that the Nazi soldier was another human being who loved his family and children, and it made him more curious to test the claim that we are rational agents in judgment and decision making.
The book's style turns readers into active participants in the experiments, leading them to commit the same errors and fall prey to the same biases. It takes the work of other researchers and scientists and makes it accessible to ordinary people, lets them reflect on their own thinking, and shows how irrational and inaccurate thinking affects almost every aspect of life. In the introduction, Kahneman hopes his work will enrich the vocabulary people use when talking about judgments and decisions, and present a picture of how the mind works that draws on recent developments in cognitive and social psychology. Much of the book's discussion concerns the biases of intuition and offers a contemporary understanding of the marvels and flaws of intuitive thought. It shows that our confidence in our intuitive beliefs is usually justified, but not always.
We are often confident even when we are wrong, and another person is more likely to notice our errors than we are.
Kahneman's central idea is a dichotomy between two modes of thinking: System 1, which operates automatically and quickly, and System 2, which is slower, less emotional, more logical, and often associated with the subjective experience of agency. We evolved to perceive the world around us, recognize objects, orient attention, and fear spiders; System 1 handles these innate skills, which we share with other animals. Through prolonged practice and learning, other skills, like driving and reading, also become fast and automatic. System 1 also learns associations between ideas: when you hear the phrase "the capital of France," you cannot prevent yourself from thinking of Paris. The associated idea comes automatically. Likewise, when faced with a simple sentence in your own language, you cannot refrain from understanding it. The knowledge is stored in memory, and access to it happens effortlessly, without voluntary attention. On the other hand, to count the occurrences of the letter "a" on a page of text, or to compare two vacuum cleaners for overall value, you must pay attention, and you will perform poorly or not at all if your attention is distracted. What such effortful activities have in common is that they draw on a limited budget of attention. We are sometimes aware of this limited capacity, and we know that if we exceed the budget, we may fail. For example, when a driver is overtaking another car on a narrow road, we in the passenger seat suddenly stop talking: first, we know that distracting the driver is not a good idea, and second, we suspect she would not hear what we say. Intense focusing can make people blind even to stimuli that usually attract them. In the gorilla study, Christopher Chabris and Daniel Simons made a short film of two teams passing basketballs, one team wearing white and the other wearing black. The viewers' task was to count the number of passes made by the white team and ignore the black-shirted players.
A woman in a gorilla suit appears for nine seconds in the middle of the video. Thousands of people have seen the film, and about half do not notice the woman in the gorilla suit, whom no one would miss when watching the video without the counting task. Kahneman says, "The gorilla study illustrates two important facts about our minds: we can be blind to the obvious, and we are also blind to our blindness."
System 1 is the basis of intuitive judgment and often the source of our choices and actions. It connects the present with the recent past and with the expectations about the near future. It is also the origin of fallacies, illusions, and heuristics.
A heuristic is a mental shortcut that helps us solve a problem, answer a question, or make a decision as quickly as possible. Heuristics are useful in daily life, but they can lead to errors and fallacies. One of the heuristics the book explains is the affect heuristic, in which people make judgments and decisions by consulting their emotions. Judgments and decisions are then guided more by feelings of liking and disliking than by reasoning and deliberation. In the affect heuristic, the response to an easy question (How do I feel about it?) serves as an answer to a much harder one (What do I think about it?). For example, your political preferences determine which arguments you find compelling, and when you like a project, you believe its costs are low and its benefits are high. Kahneman says laziness is built deep into our nature and cites the law of least effort: if there are several ways to achieve the same goal, people eventually gravitate to the one that demands the least effort. Shortcuts such as biases and heuristics enable us to think fast, and we use them while feeling that our reasonable, logical mind is in control. To explain why we lean on shortcuts, Kahneman imagines a dial that measures cognitive ease, with a range between "Easy" and "Strained." Easy is a sign that everything is going well: there is no threat in the surrounding environment and no need to redirect attention or mobilize effort. Strained indicates that a problem exists, requiring the mediation of System 2, and you experience cognitive strain. For example, when driving on a familiar, empty highway, you experience cognitive ease; when trying to overtake a big truck on a narrow road, you most likely undergo cognitive strain. When we are in a state of cognitive ease, we are usually in a good mood: things feel familiar, we trust our intuitions, and we like what we see and believe.
On the other hand, when we experience strain, we are more likely to be attentive and skeptical and to make fewer errors. The cost of cognitive strain, though, is that we are less intuitive and less creative than usual.
Our confidence in our beliefs largely depends on the coherence of the story.
The measure of success for System 1 is the coherence of the story it creates; the amount and quality of the data behind the story are largely irrelevant. When information is scarce, System 1 operates as a machine for jumping to conclusions from limited evidence. Kahneman's shorthand for this is WYSIATI: what you see is all there is. A basic example is a first impression: we feel confident judging someone immediately, with very little information about the person in front of us. Kahneman highlights the limitations of our minds, our overconfidence in what we believe we know, and how much we underestimate the role of chance in events. Here he draws on the work of Nassim Taleb, the author of The Black Swan. Taleb describes how flawed stories of the past shape our views of the world and our predictions about it. Good stories deliver simple explanations of people's actions and choices as manifestations of dispositions and personality traits.
The halo effect is a familiar bias that plays a massive role in shaping our view of people and events. It inclines us to match our idea of all of a person's qualities to one particularly significant attribute.
The halo effect is the tendency to like (or dislike) everything about a person, including the things we do not know. Suppose you meet a person named Michel at a party and find him intelligent and easy to talk to. Weeks or months later, what will you think if Michel's name comes up as someone who could be asked to donate to a charity? You like Michel, and through the association between the feeling of liking him and your admiration for generosity and generous people, you are now predisposed to believe that Michel is generous. In truth, you know nothing: you have no objective evidence that an intelligent, easy-to-talk-to person is also a great contributor to charity. The stories people find persuasive are simple and concrete and assign a larger role to talent and intentions than to luck. Consider the story of how Google turned into a tech giant. Two creative computer-science graduate students at Stanford University came up with an incredible idea for a new way to search online. They started a company, made a sequence of good decisions, and within a few years became among the richest people in the world. No one can ignore the considerable part skill played in the Google story, but luck played a more critical role in Google's success than the story tells. On one occasion, a year after founding the company, the two founders were willing to sell it for less than $1 million, but the buyer said the price was too high. This single event shows one of the many ways luck can turn things upside down. We build the best possible story from the available information, and if the story is good, we believe it. Paradoxically, it is easier to construct a coherent narrative when we have fewer pieces to fit into the puzzle.
The book's principal argument challenges the rational-agent model, the assumption, common in economics and the social sciences, that an individual will act rationally to determine whether a choice is right for them.
In the mid-1950s, after completing an undergraduate degree in psychology and joining the Israeli Army, Kahneman was assigned to the army's Psychology Branch. One of his duties was to help evaluate candidates for officer training. Kahneman's team was utterly confident in its evaluations; any other prediction seemed inconsistent with the evidence before their eyes. The task seemed easy because they felt they had seen each soldier's leadership skills: some men looked like solid leaders, others like cowards. Every few months, Kahneman's team received feedback on how the candidates were doing at officer-training school and could compare its assessments against the opinions of the commanders who had been monitoring them for a while. Yet the story was always the same: the team's ability to predict a candidate's future performance was scarcely better than blind guessing. The reports from the training school, which exposed the fallacy of the earlier predictions, should have shaken Kahneman's confidence in his judgment of the candidates. But it did not; he continued to feel and act as if his predictions were valid. This led Kahneman to his first cognitive illusion, the illusion of validity. Confidence in a judgment is not sufficient evidence that the judgment is correct. High confidence tells you that the person has built a coherent story, not that the story is necessarily true. The illusion of validity is widely pervasive, even in giant financial enterprises. In 1984, Kahneman visited a Wall Street firm to discuss the role of judgment biases in investing. He did not know much about finance, so he asked his host, a senior investment manager, a simple question: "When you sell a stock, who buys it?" The question has stayed with him as a puzzle ever since: why do buyers and sellers alike think that the current price is wrong?
An entire industry appears to be built largely on an illusion of validity. Billions of shares are traded every day, and most of the buyers and sellers know that they have the same information. They exchange the stocks primarily because each side believes it knows better than the market what the price should be. After decades of research, Kahneman concludes that for most of them this belief is an illusion, and that for a large majority of investment managers, the selection of stocks is more like rolling dice than playing poker.
There are two selves: the experiencing self and the remembering self.
The experiencing self is the one that answers the question "Does it hurt now?"; the remembering self is the one that answers "How was it, on the whole?" The experiencing self lives in the present moment and is an unconscious mode of thinking, while the remembering self is slower, more deliberate, and the storyteller of our experience. Memories are all we get to keep from our lived experience, and they are the only standpoint from which we can think about our lives. For example, imagine listening to a 30-minute symphony on a disc scratched near the end, producing an irritating sound. It seems as if the imperfect ending "ruined the whole experience." But the experience was not ruined, only the memory of it: the experience was almost entirely good, and the bad few seconds at the end could not change what had already happened. We routinely confuse experience with memory; the experiencing self does not have a voice, and the remembering self is not always correct. Kahneman and his colleagues designed an experiment in which participants immersed one hand in painfully cold water, once for 60 seconds and once for 90 seconds; in the longer trial, the water was warmed from 14°C to 15°C during the final 30 seconds. The participants knew that one of the two trials would be repeated exactly, and they were free to choose which. The majority chose to repeat the longer of the two. These subjects were not masochists, and they did not deliberately choose the more painful experience; they made a mistake. The experiment was designed to create a conflict between the experiencing self and the remembering self: from the perspective of the experiencing self, the long trial is worse than the short one, but the remembering self has another opinion.
Participants knew which of the two trials was longer, but the rule of intuitive choice governed their decision: pick the option you liked the most, or disliked the least. Because the cold water became progressively warmer over the last 30 seconds, the final moments were less painful (a preferred ending), and participants remembered the average of the peak and the end of that trial as less intense. They then used this retrospective assessment to direct their future choice, electing to repeat the longer but, in memory, less severe trial. The cold-hand study showed that we cannot fully trust our preferences to reflect our interests: the memories that shape our tastes and decisions can be wrong. That presents a profound challenge to a cornerstone of the rational-agent model, that humans have consistent preferences and know how to maximize them.
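The peak-end logic behind the cold-hand result can be sketched in a few lines of code. This is a minimal illustration, not Kahneman's actual data: the pain ratings below are invented numbers on a 0-10 scale, sampled every 10 seconds, chosen only to show how averaging the peak and the final moment makes the longer trial look milder in memory.

```python
def remembered_pain(ratings):
    """Peak-end approximation: memory of an episode is roughly the
    average of its worst moment and its final moment; duration is
    largely ignored ("duration neglect")."""
    return (max(ratings) + ratings[-1]) / 2

# Short trial: 60 s of 14°C water, constant pain (one rating per 10 s).
short_trial = [8, 8, 8, 8, 8, 8]
# Long trial: the same 60 s, plus 30 s of slightly warmer water,
# so the pain eases toward the end.
long_trial = [8, 8, 8, 8, 8, 8, 6, 5, 4]

print(remembered_pain(short_trial))  # 8.0
print(remembered_pain(long_trial))   # 6.0 -> remembered as milder,
                                     # despite 30 extra seconds of pain
```

Total discomfort (the sum of the ratings) is clearly higher for the long trial, yet the peak-end average ranks it as the better memory, which is exactly the mistake the participants made.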