Thinking, Fast and Slow by Daniel Kahneman - Book Summary



In this book, Daniel Kahneman presents the results of decades of research into what goes on inside our heads, how we process the events around us, and how we ultimately make decisions. This summary explains three critical ideas Kahneman uses to describe how the mind works. The first is how our brain divides its work between two cognitive systems. The second is our mental heuristics, and the third is how we make decisions.

The first part, thinking fast and slow: the two systems.

Our brain works through two systems. The first is fast: the unconscious, involuntary mind. It operates automatically and runs continuously, working intuitively and effortlessly - for example, while driving a familiar route. The second is slow and is engaged only when necessary. In situations where we need to analyze or solve a problem, it takes time, unlike System 1, which reacts reflexively. Because the results of System 2 take time and effort, they are more reliable. Kahneman also introduces the "law of least effort," which explains that our mental energy depletes with use, so we gravitate toward the path of least resistance. Thinking fast and thinking slow work together to help us make decisions about the events around us.

The second concept described in the book is mental heuristics. Our mind uses heuristics, or shortcuts, to process many of the tasks we do daily. Because these are fast decisions, they are often unreliable and leave us prone to being influenced by others' ideas. For example, people who are asked to smile before hearing a joke will find it funnier than people who were not asked to smile. An upset person will experience a sad song more profoundly than a person in a neutral mood. Associated ideas evoke further related ideas and feelings; this is called the associative mechanism. System 1 operates using these shortcuts, and it is the responsibility of System 2 to analyze and interpret System 1's results. But System 2 does not always respond quickly, which results in intuitive biases and errors in our judgment.

Next, Kahneman explains the negative results caused by the heuristics of System 1. The first negative impact is that heuristics cause overconfidence. We feel confident when our stories seem coherent and we are at cognitive ease, but this confidence spills into several areas. With overconfidence in perspective, we understand the world through the narrative fallacy: we create stories to explain the past, and those stories shape our views of the world and our perception of the future. We also become overconfident in our expertise, so sure about skills and intuitions that are only sometimes valid. Much of the time we are over-optimistic and take risks, which Kahneman explains through the planning fallacy, and he describes how we should balance our decisions by taking an outside view.

The third point discussed in the book is how to make better choices. Heuristics affect our choices because how we see and perceive the world automatically shapes our decisions. The book uses different examples to explain how our decision-making process works and how our heuristics can lead to less accurate decisions.

Prospect theory, explained in the book, is vital: it earned Kahneman the Nobel Prize in Economics. The theory rests on three points. First, the value of money is relative to the subjective experience that comes with it; the value of wealth is attached to gain or loss. For example, having 4000 today feels good to someone who had 2000 yesterday but bad to someone who had 6000 yesterday. The same amount of money is valued differently because wealth is experienced as gaining or losing relative to a reference point. Second, we experience diminishing sensitivity to changes in wealth. Third, loss aversion: we weigh losses more heavily than gains, which is why we prefer avoiding a loss to securing an equal gain. Our brains register bad things faster than good things, so we work harder to avoid loss than to attain gain, and harder to avoid pain than to achieve pleasure.
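To make the arithmetic concrete, here is a minimal sketch of the value function behind these three points, written in Python. The parameter values are the estimates from Kahneman and Tversky's later (1992) work - an exponent of about 0.88 for diminishing sensitivity and a loss-aversion factor of about 2.25 - and are illustrative assumptions, not figures given in this summary.

    # Prospect theory value function: value attaches to gains and
    # losses relative to a reference point, not to total wealth.
    ALPHA = 0.88    # diminishing sensitivity to changes in wealth
    LAMBDA = 2.25   # loss aversion: losses loom roughly twice as large

    def subjective_value(change: float) -> float:
        """Felt value of a change in wealth relative to the reference point."""
        if change >= 0:
            return change ** ALPHA
        return -LAMBDA * ((-change) ** ALPHA)

    # Yesterday's wealth sets the reference point for today's 4000:
    print(subjective_value(4000 - 2000))  # had 2000: a gain, about +803
    print(subjective_value(4000 - 6000))  # had 6000: a loss, about -1807

Run on the 4000/2000/6000 example above, the same final wealth registers as a felt gain of about +803 for one person and a felt loss of about -1807 for the other: reference dependence, diminishing sensitivity, and loss aversion in a few lines.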

To summarize, our brain contains two systems, one quick and one slow. We cannot avoid the outputs of System 1, which are fast but not reliable. Still, we can make an effort through System 2, which is thoughtful and deliberate, to reduce the risks of System 1's decisions. Because our heuristics influence choices that are often neither intelligent nor accurate, we need to process them through System 2 to make more thoughtful and fruitful decisions, especially when there are risks involved.


Thinking, Fast and Slow

Overview

The title of the book refers to two modes of thinking, which Kahneman calls:


"System 1" = The instant, unconscious, automatic, emotional, intuitive thinking.

"System 2" = The slower, conscious, rational, reasoning, deliberate thinking.

EXPERTISE

Expert intuition: The situation has provided a cue; this cue has given the expert access to information stored in memory, and the information provides the answer. Intuition is nothing more and nothing less than recognition.


Valid intuitions develop when experts have learned to recognize familiar elements in a new situation and to act in a manner that is appropriate to it.


Philip Tetlock's book "Expert Political Judgment: How Good Is It? How Can We Know?" - gathered more than 80,000 predictions. The experts performed worse than they would have if they had simply assigned equal probabilities. Even in the region they knew best, experts were not significantly better than nonspecialists.


People who spend their time, and earn their living, studying a particular topic produce poorer predictions than dart-throwing monkeys.


Those with the most knowledge are often less reliable. The reason is that the person who acquires more knowledge develops an enhanced illusion of her skill and becomes unrealistically overconfident.


Hedgehogs "know one big thing" and have a theory about the world; they account for particular events within a coherent framework, bristle with impatience toward those who don't see things their way, and are confident in their forecasts. They are also especially reluctant to admit error.


It is much easier to strive for perfection when you are never bored.


Flow neatly separates the two forms of effort: concentration on the task and the deliberate control of attention.

In a state of flow, maintaining focused attention on these absorbing activities requires no exertion of self-control, thereby freeing resources to be directed to the task at hand.


Many people are overconfident, prone to place too much faith in their intuitions. They apparently find cognitive effort at least mildly unpleasant and avoid it as much as possible.


Putting the participants in a good mood before the test by having them think happy thoughts more than doubled accuracy. An even more striking result is that unhappy subjects were completely incapable of performing the intuitive task accurately; their guesses were no better than random. Mood evidently affects the operation of System 1: when we are uncomfortable and unhappy, we lose touch with our intuition.


When in a good mood, people become more intuitive and more creative but also less vigilant and more prone to logical errors. Here again, as in the mere exposure effect, the connection makes biological sense. A good mood is a signal that things are generally going well, the environment is safe, and it is all right to let one's guard down. A bad mood indicates that things are not going very well, there may be a threat, and vigilance is required.


Surprise itself is the most sensitive indication of how we understand our world and what we expect from it.


The main function of System 1 is to maintain and update a model of your personal world, which represents what is normal in it.


When System 2 is otherwise engaged, we will believe almost anything. System 1 is gullible and biased to believe, System 2 is in charge of doubting and unbelieving, but System 2 is sometimes busy, and often lazy.


Understanding a statement must begin with an attempt to believe it: you must first know what the idea would mean if it were true. Only then can you decide whether or not to unbelieve it. The initial attempt to believe is an automatic operation of System 1.

Unbelieving is an operation of System 2.


The operations of associative memory contribute to a general confirmation bias. When asked, "Is Sam friendly?" different instances of Sam's behavior will come to mind than would if you had been asked "Is Sam unfriendly?" A deliberate search for confirming evidence, known as positive test strategy, is also how System 2 tests a hypothesis. Contrary to the rules of philosophers of science, who advise testing hypotheses by trying to refute them, people (and scientists, quite often) seek data that are likely to be compatible with the beliefs they currently hold. The confirmatory bias of System 1 favors uncritical acceptance of suggestions and exaggeration of the likelihood of extreme and improbable events.


Herbert Simon's definition of intuition: Expertise in a domain is not a single skill but rather a large collection of miniskills.


The confidence that people have in their intuitions is not a reliable guide to their validity. In other words, do not trust anyone - including yourself - to tell you how much you should trust their judgment.


When do judgments reflect true expertise?


Two conditions must be met: an environment that is sufficiently regular to be predictable, and an opportunity to learn these regularities through prolonged practice.


When both these conditions are satisfied, intuitions are likely to be skilled.


Intuition cannot be trusted in the absence of stable regularities in the environment.


If the environment is sufficiently regular and if the judge has had a chance to learn its regularities, the associative machinery will recognize situations and generate quick and accurate predictions and decisions. You can trust someone's intuitions if these conditions are met.

When evaluating expert intuition you should always consider whether there was an adequate opportunity to learn the cues, even in a regular environment.


"Does he really believe that the environment of start-ups is sufficiently regular to justify an intuition that goes against the base rates?"


"Did he really have an opportunity to learn? How quick and how clear was the feedback he received on his judgments?"


The proper way to elicit information from a group is not by starting with a public discussion but by confidentially collecting each person's judgment.

Some puzzles, like those on the Cognitive Reflection Test, also invite an intuitive answer that is both compelling and wrong.


Students who score very low on this test - their supervisory function of System 2 is weak - are prone to answer questions with the first idea that comes to mind and unwilling to invest the effort needed to check their intuitions.


Individuals who uncritically follow their intuitions about puzzles are also prone to accept other suggestions from System 1. In particular, they are impulsive, impatient, and keen to receive immediate gratification.


What makes some people more susceptible than others to biases of judgment? Keith Stanovich published his conclusions in a book titled Rationality and the Reflective Mind.


Superficial or "lazy" thinking is a flaw in the reflective mind, a failure of rationality.


Rationality should be distinguished from intelligence.
