Joe Lu

Thinking, Fast and Slow

Updated: Sep 13, 2020


Daniel Kahneman writes about the psychology of human judgment and offers suggestions for improving how we think. Dr Kahneman is an Israeli-American psychologist best known for his work on Prospect Theory, for which he was awarded the 2002 Nobel Memorial Prize in Economic Sciences. He is widely regarded as one of the world's leading experts on the psychology of judgment and decision-making. In this book, published on October 25, 2011, Kahneman provides insights into how humans make decisions and the behavioral biases that lead to poor decision making.


Here are our takeaways organized by the same five parts used in the book. This is more than a summary. We highlight concepts and suggestions that we believe to be the most helpful.

  1. The Two Systems

  2. Heuristics and Biases

  3. Overconfidence

  4. Choices

  5. Two Selves


Part 1: The Two Systems


People use two cognitive systems for thinking. System 1 is our subconscious intuition, automatically absorbing information and making fast judgments based on familiar patterns. System 2 is our conscious thinking process, which, given some effort, can help us review suggestions from System 1 and methodically evaluate problems and new situations. People tend to believe that System 2 is more important, but we actually spend most of our time in System 1, which drives how we see ourselves in the world and helps us to be more creative.


“The main function of System 1 is to maintain and update a model of your personal world, which represents what is normal in it.”


System 1 also holds the potential for developing expertise in handling repeated situations. Many complex tasks that we do every day, such as driving, require little effort. Experts can be described as those who have done a particular task so many times that their intuitions offer up correct suggestions at a far higher rate than non-experts, for tasks like diagnosing a patient, reading the mood of a potential client during a negotiation, or other tasks that allow us to learn from rapid feedback.


Using System 2 comes with biological and behavioral costs. Scientists have measured this cost, which shows up biologically as a drop in glucose levels. The behavioral cost is an increase in negative actions. For example, those who are using more energy in System 2 are more likely to stereotype, give in to temptations, make selfish choices, use sexist language, and make superficial judgments. Perhaps because of these costs, humans have evolved to take mental shortcuts called heuristics to avoid mental effort, leaving System 1 as our default mental state. System 2 monitors suggestions from System 1, but it is lazy, often relying on quick intuition rather than deep and thorough investigation.


Persuasion techniques appeal to System 1 because of its dominant role. That is why headlines and commercials are simple and memorable, employing rhymes and repetition. People are more likely to be persuaded when they observe information that is consistent with their own world view.


“A compelling narrative fosters an illusion of inevitability.”


Our world view is extrapolated from tiny bits of information that are often biased by the uniqueness of our life experience. System 1 further compounds this bias by anchoring to it and filtering out future information that is inconsistent with the past. Narratives and simple explanations help us to remember our world view, but they simultaneously oversimplify reality. As a result, our world view is far simpler and more deterministic than reality.


"Simple explanations play into our desire for control and high self-esteem."


System 2 can theoretically serve as an antidote to these biases, but it instead often magnifies them by finding reasons to support one's world view through confirmation bias. Accepting complexity and "not knowing" is challenging and requires a great deal of courage. Openly not knowing comes with the risk that others may believe us simple or lacking in conviction. Changing our views publicly can undermine our credibility. These risks are hard-wired into our psyche.


“Facts that challenge...basic assumptions – and thereby threaten people’s livelihood and self-esteem – are simply not absorbed.”


System 1 absorbs information with little thought to its quality and turns it into impressions, intuitions, and beliefs. These in turn drive our actions. We habituate to what is normal, screening out everyday information like faces on our commute to work. We pay more attention to what is new, especially if it makes us scared or angry, due to the role these emotions played in keeping us alive over millions of years. As a result, unimportant and unrepresentative information, like a child abduction in another country, can lead to extreme actions like never letting a child attend a sleepover or post a family picture online.

We can learn to better coordinate our two cognitive systems by understanding how they work. To start, we explore common heuristics and biases.


Part 2: Heuristics and Biases


Law of Small Numbers - Assuming that the small amount of data you have resembles the total population. Gathering more information is cognitively expensive, and so System 2 will often lazily extrapolate from limited experiences when making decisions. That's why first impressions and experiences are so important.
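
To see this concretely, here is a minimal Python sketch (our illustration, not from the book) showing how often small samples from a perfectly fair coin look lopsided compared to large samples:

```python
import random

def extreme_sample_rate(sample_size, trials=10_000):
    """Fraction of samples from a fair coin that look 'unusual'
    (less than 35% or more than 65% heads)."""
    extreme = 0
    for _ in range(trials):
        heads = sum(random.random() < 0.5 for _ in range(sample_size))
        share = heads / sample_size
        if share < 0.35 or share > 0.65:
            extreme += 1
    return extreme / trials

print(extreme_sample_rate(10))    # roughly 0.34: small samples often look biased
print(extreme_sample_rate(1000))  # essentially 0.0: large samples resemble the population
```

About a third of 10-flip samples look "unfair" even though the coin is not; trusting a handful of observations means trusting noise.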


Anchoring Bias - Allowing an initial value to have an outsized influence on your prediction of future values or estimate of a quantity. For example, if you start watching a stock at a price of $50, you are likely to view prices above and below $50 as arbitrarily "expensive" and "cheap".


Availability Bias - Allowing your beliefs about the probability of an event to be influenced by the ease with which you can think of examples. People overestimate the probability of extremely unlikely events like kidnappings, plane crashes, and shark attacks because these events are horrifying and therefore easy to remember from the news and visualize.


“We are confident when the story we tell ourselves comes easily to mind, with no contradiction and no competing scenario. But ease and coherence do not guarantee that a belief held with confidence is true.”


Map is not the territory - Allowing our narratives to mask life's complexity. We all create mental models for how the world works and how to make decisions, but we often conveniently forget that all models are wrong because we like to feel all-knowing and in control.


Tom W's Specialty - Allowing one's own priors to override base rate probabilities. For example, stocks in the United States have tended to outperform treasury bonds by an average of 6% a year (base rate), but an investor may speculatively sell stocks for bonds short term because they believe they know something others don't (prior).


Less is more - Allowing persuasive details of a scenario to increase our perceived likelihood of an event. For example, you may believe the probability of higher taxes after the next election to be 20%, but if asked about the likelihood of a specific candidate raising taxes after hearing specific details of their platform, you might state the probability as even higher, only because it's easier to visualize the scenario.
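
The general event always contains the specific scenario, so the specific scenario cannot be more probable. A quick arithmetic sketch with made-up numbers (ours, not the book's):

```python
# Hypothetical probabilities for illustration only.
p_taxes_rise = 0.20          # P(taxes rise after the election): the general event
p_candidate_wins = 0.50      # P(a specific candidate wins)
p_raises_given_win = 0.30    # P(they raise taxes, given that they win)

# The vivid scenario "this candidate wins AND raises taxes" is a subset
# of "taxes rise", so its probability can only be lower or equal.
p_specific = p_candidate_wins * p_raises_given_win  # 0.15
assert p_specific <= p_taxes_rise
```

Any detailed story we can visualize is, mathematically, at most as likely as the vaguer event that contains it.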


Examples vs Facts - Allowing examples to have more influence on our beliefs than facts and statistics. Kahneman points out that this is why politicians tend to emphasize examples instead of statistics.


Regression to the Mean - Within populations, higher or lower performance tends not to persist. For example, a basketball player who recently made several shots in a row is no more likely than their base rate to make their next shot.
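
A simple simulation (our sketch, not from the book) makes the point: for a shooter whose makes are independent with a fixed base rate, a hot streak carries no information about the next shot.

```python
import random

def rate_after_streak(base_rate=0.5, streak=3, n_shots=1_000_000):
    """Estimate P(make) immediately following `streak` consecutive makes,
    for a shooter whose shots are independent with probability base_rate."""
    random.seed(42)
    shots = [random.random() < base_rate for _ in range(n_shots)]
    after = [shots[i] for i in range(streak, n_shots) if all(shots[i - streak:i])]
    return sum(after) / len(after)

print(rate_after_streak())  # ~0.50: the streak does not raise the next-shot probability
```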


Base rates with adjustments - Kahneman suggests that we can typically improve our predictions by starting with base rates and applying adjustments supported by empirical evidence. This process can be further improved by recognizing sources of overconfidence and using tools to avoid it.
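
One disciplined way to apply this advice is Bayes' rule in odds form: start from the base rate and scale by how much more common the evidence is among positives than negatives. A minimal sketch, with a hypothetical example of our own:

```python
def update_base_rate(base_rate, likelihood_ratio):
    """Adjust a base-rate probability with a likelihood ratio (Bayes' rule, odds form)."""
    prior_odds = base_rate / (1 - base_rate)
    posterior_odds = prior_odds * likelihood_ratio
    return posterior_odds / (1 + posterior_odds)

# Hypothetical: 10% of startups in a sector succeed (base rate), and a strong
# founding team is twice as common among successes as among failures.
print(update_base_rate(0.10, 2.0))  # ~0.18: adjusted upward, still anchored to the base rate
```

Notice that the adjusted estimate stays tethered to the 10% base rate instead of jumping to whatever the vivid evidence suggests.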


Part 3: Overconfidence


“Most of us view the world as more benign than it really is, our own attributes as more favorable than they truly are, and the goals we adopt as more achievable than they are likely to be.”


The Illusion of Understanding - Allowing ourselves to believe we understand the reasons for historical events. We do this through narratives that fit some of the historical data while ignoring data inconsistent with the narrative. This leads us to believe that we can accurately predict the future.


“The idea that the future is unpredictable is undermined every day by the ease with which the past is explained.”


The Illusion of Validity - Allowing a feeling of confidence in a judgment, based on its coherence, to determine whether the judgment is valid, rather than an objective measure of accuracy such as unbiased data sampling and analysis.


Intuitions vs. Formulas - Kahneman recommends using simple formulas and other structured reasoning over intuition when feasible.


Expert Intuition: When Can We Trust It? - Many years of practiced experience can lead to improved intuition, but only under normal circumstances consistent with that experience. In other words, don't trust yourself or experts when operating outside a field of expertise.


“Organizations that take the word of overconfident experts can expect costly consequences.”


Think in probabilities - Always plan using a probabilistic view of the world. People tend to rely too heavily on base cases. Instead, make decisions that lead to better outcomes over many possible scenarios.
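
As a sketch of what this looks like in practice (the scenario names, probabilities, and payoffs are invented for illustration), compare plans by their probability-weighted outcomes rather than the base case alone:

```python
# Hypothetical scenario probabilities and plan payoffs.
scenarios = {"recession": 0.2, "baseline": 0.6, "boom": 0.2}
plans = {
    "aggressive": {"recession": -100, "baseline": 50, "boom": 80},
    "balanced":   {"recession": 0,    "baseline": 40, "boom": 50},
}

for name, payoffs in plans.items():
    expected = sum(prob * payoffs[s] for s, prob in scenarios.items())
    print(name, round(expected, 1))
# aggressive: 26.0, balanced: 34.0 -- the base case alone (50 vs 40) picks the wrong plan
```

Judging only the "baseline" scenario favors the aggressive plan; weighting all scenarios reverses the choice and surfaces the recession exposure that a base-case-only view hides.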


In general, overconfidence is driven by a suppression of doubt. In addition to the above strategies, it can be helpful to hold a "premortem" discussion before making big decisions in order to raise and legitimize doubts.


Part 4: Choices


In part 4 of this book, Kahneman lays out several concepts that are useful when making choices and avoiding errors.


“The way to block errors that originate in System 1 is simple in principle: recognize the signs that you are in a cognitive minefield, slow down and ask for reinforcement from System 2.”


Bernoulli’s Errors - Economists tend to embrace utility theory, which posits that people and businesses make decisions by maximizing their utility (i.e., happiness). However, there is substantial evidence supporting the view that we instead tend to view decisions in terms of the gains or losses we will experience. In other words, we care more about the impact of a decision relative to our reference point (current situation) than about some absolute level of utility.


Prospect Theory - Kahneman won the Nobel Prize in part because of his "Prospect Theory", which argues that humans feel more pain from a loss than joy from an equally large gain. This can be rational in the context of catastrophic losses, which is why we buy insurance against events like our house burning down. However, it can be irrational in repeated events like the stock market, where investors might avoid stocks for years out of fear that on any one day they may lose a lot of money.
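
The asymmetry can be written down directly. Here is a minimal sketch of the Kahneman-Tversky value function, using the parameter estimates commonly cited from their 1992 paper:

```python
def prospect_value(x, alpha=0.88, beta=0.88, lam=2.25):
    """Kahneman-Tversky value function over gains/losses from a reference point.
    alpha/beta bend the curve; lam > 1 makes losses loom larger than gains.
    Parameter values are the commonly cited 1992 estimates."""
    if x >= 0:
        return x ** alpha
    return -lam * (-x) ** beta

print(prospect_value(100))   # ~57.5: the felt value of a $100 gain
print(prospect_value(-100))  # ~-129.5: the same-sized loss hurts over twice as much
```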


Endowment Effect - People irrationally value things they own more than things they don't. For example, in an experiment where students were randomly given various objects, the students with the objects demanded more money for them than other students were willing to pay. This is one reason why investors may not be willing to part with investments even when they believe them to be inappropriate for their portfolio.


Bad Events - Humans tend to exert more effort avoiding losses than seeking gains. This was rational during millions of years of evolution, when there was little to be gained from risk taking and making a mistake often resulted in death. Today, however, this focus on avoiding risks often leads to missed opportunities.


The Fourfold Pattern - Humans tend to be risk averse when it comes to positive outcomes, but can often be risk seeking when it comes to negative outcomes. This is one reason investors who have recently lost a lot of money are more likely to take risks.


Rare Events - Humans are bad at dealing with unlikely events. We tend to overestimate their probabilities, which causes us to overweight them in our decisions. For example, parents may avoid having their kids go to sleepovers or use a social network due to the unlikely event of some disaster, while ignoring the more likely benefits from socializing and technological fluency.


Risk Policies - Taking a broader view of decisions can help us to be more constructive when it comes to risk taking. Focusing on just one decision or one risk can lead to tunnel vision and loss aversion. Viewing each decision as one of a long string of decisions helps us treat risk taking as a portfolio in which negative outcomes are outweighed by positive ones.
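
A quick simulation (our sketch, with an invented gamble) shows why the portfolio view matters: a single favorable bet loses half the time, but a string of them almost never loses in aggregate.

```python
import random

def prob_net_loss(n_plays, trials=20_000):
    """Probability of ending with a net loss after n plays of a favorable gamble:
    50% chance of +$200, 50% chance of -$100 (expected value +$50 per play)."""
    random.seed(0)
    losses = sum(
        1 for _ in range(trials)
        if sum(200 if random.random() < 0.5 else -100 for _ in range(n_plays)) < 0
    )
    return losses / trials

print(prob_net_loss(1))    # 0.5: viewed alone, the bet loses half the time
print(prob_net_loss(100))  # well under 1%: viewed as a portfolio, losses become rare
```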


Keeping Score - Humans are not good at objectively measuring the benefits and costs of a decision. This opens the door to the Agency Problem, Sunk Cost Fallacy, and Disposition Effect.

  • Agency problems occur when the one in charge (agent) has incentives that differ from those she represents such as when a CEO wants to make a big acquisition so she can control a larger company even if the acquisition is not profitable.

  • Sunk-cost fallacies occur when we keep investing additional resources into a failed endeavor because we don't want to acknowledge the waste of our initial investment, such as sitting through a bad movie because we already bought the ticket.

  • Disposition effects occur because we want everything we do to end well, such as the tendency of investors to sell winners even though, for tax purposes, it's wiser to sell losers.


Regret avoidance is a powerful force behind irrational behavior. Humans are more likely to have feelings of regret from an error of commission (doing something) than omission (not doing something). This leads us to disproportionately remain docile and not try new things. One way to avoid regret is to confront it directly and anticipate it while making decisions. Another way is to document your decision making process so that you can more easily accept failures when you know they were at least supported by a calculated risk. Finally, remember that people tend to fear more regret than they actually experience when confronted with failure.


Framing has a big impact on how we perceive reality. For example, when we view a single problem we tend to exaggerate its importance and potential risks. It therefore helps to zoom out and frame problems in their broader context. Framing can also be seen in our bias toward inaction. For example, people are more likely to donate an organ or invest in their 401(k) if that is the default option (perhaps because, if wrong, this is perceived as an error of omission).


Part 5: Two Selves


Two selves govern how humans experience and remember. Our remembering self keeps score of our past experiences, but our ratings are subject to a variety of biases.

“The experiencing self does not have a voice. The remembering self is sometimes wrong, but it is the one that keeps score and governs what we learn from experience, and it is the one that makes decisions.”


Peak-end rule is a heuristic that leads humans to judge an experience based on how they felt at its peak and at its end, rather than the average of every moment. As a result, we tend to overweight the end of an experience when scoring the whole.


Duration neglect is the tendency of humans to ignore how long experiences last when scoring them. This result is supported by studies of patients who preferred longer, mildly painful treatments over shorter but more intensely uncomfortable ones.
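
Both biases can be captured in a toy scoring rule (our sketch, with made-up pain ratings): the remembered rating averages the worst moment and the final moment and ignores duration entirely.

```python
def remembered_score(pain_by_minute):
    """Peak-end rule: memory averages the peak moment and the final moment."""
    return (max(pain_by_minute) + pain_by_minute[-1]) / 2

def total_pain(pain_by_minute):
    """What the experiencing self actually endured: the sum over every minute."""
    return sum(pain_by_minute)

short_intense = [8, 8, 8]           # short procedure that ends at peak pain
long_tapered = [8, 8, 8, 4, 3, 2]   # same start, plus extra minutes that taper off

print(total_pain(short_intense), total_pain(long_tapered))               # 24 vs 33: more pain endured
print(remembered_score(short_intense), remembered_score(long_tapered))   # 8.0 vs 5.0: remembered as better
```

The longer treatment contains strictly more pain, yet the gentler ending makes it the one that is remembered, and chosen, as less bad.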


Life is a story told through a series of experiences. That's why humans tend to care more about their remembering self than their experiencing self. We collect memories in order to weave the narrative of our lives. Humans want that narrative to be consistent and admirable, and so our self-evaluations tend to be more glowing and clean than our actual experience.


“Confusing experience with the memory of it is a compelling cognitive illusion.”


Experienced Well-Being can be greatly improved by substituting passive leisure (TV) with active leisure (exercise) and spending more time with friends and relatives. Money tends to have a strong effect on happiness only for people in poverty. Studies suggest that households in high-cost areas making more than $75,000 (adjusted for inflation since 2010) experienced no measurable improvement in happiness as their incomes increased further.


Circumstances are overrated ... our actual situation has very little impact on reported life satisfaction, which is largely determined by genetics and temperament. Baseline happiness is heritable just like height and intelligence, as supported by studies of twins separated at birth. Marriage also appears to have very little impact on reported well-being for the same reason. Marriage improves some aspects of life but makes others less pleasant, with the net effect being immaterial on average.


Avoid setting too many or unattainable goals. Achieving an important goal can lead to higher levels of happiness. For example, those who made it a priority to make more money became happier when they did, but those who failed to make more money were much less happy. However, setting many goals is detrimental because you set yourself up for perpetual failure. Some goals are better than others. For example, one study found that, out of a broad set of goals, "becoming accomplished in a performing art" provided the least satisfaction over a 20-year period.


"You rate your life by standards or goals you set."


Focusing illusion is the tendency to overweight the importance of whatever we think about. This is related to the importance of framing problems and decisions in their broader context in order to avoid exaggerating their consequences and potential risks. One example of this is miswanting, the human bias toward buying things: shopping leads us to buy things we don't need because, when we see a product, it seems more important to us than it will be the moment we stop thinking about it.

“Nothing in life is as important as you think it is when you are thinking about it.”


Conclusion


Daniel Kahneman's "Thinking, Fast and Slow" provides an excellent overview of why we should be skeptical of our intuition and ways we can improve our own decision making process. Our takeaways focus on the strategies and concepts that we believe to be the most helpful. The book includes references to many experiments and examples that we were not able to include in our summary. We encourage you to read the book (Amazon link). If you don't have time to read the book but would like more detail, we also recommend this hour-long interview sponsored by Google, in which Dr Kahneman focuses on the most important concepts and supports his conclusions with experiments. He also contrasts his views with other famous authors and thinkers like Malcolm Gladwell, who takes a more favorable view of human intuition in his book Blink, which we also enjoyed.


Thank you for your interest!


If you enjoyed our summary please share it with your friends!


WEquil Team



