Overrated Books: Thinking, Fast and Slow

June 19, 2025

It's possible that Daniel Kahneman's Thinking, Fast and Slow is the book that I've heard referenced the most in the past decade. Authors and podcasters of various stripes have mentioned it or recommended it. It's the top entry on the Alliance for Decision Education's book list. Michael Lewis wrote a biography of Kahneman and his academic partner Amos Tversky in The Undoing Project, which I enjoyed. So I came into Thinking, Fast and Slow with high expectations, but was left disappointed.

Some of my criticisms of the book are:

  • Much time is spent on concepts like priming that were based on weak experiments.
  • Even for well-run experiments, I'm frequently unsure what conclusions we should draw from them.
  • Many examples reveal risk-aversion in decision-making, but risk-aversion is understandable and often good.
  • Various specialized terms in the book seem to just describe common-sense concepts.

Weak Studies

Kahneman goes through numerous examples of priming in the front half of the book. The idea behind priming is that exposing someone to an initial stimulus can influence how they behave later. For instance, if you first ask someone whether the height of the tallest redwood is more or less than 1,200 feet, and then you ask them to guess what the height of the tallest redwood actually is, their guess will often be influenced by the earlier mention of "1,200 feet". When that initial number is high, their guess for the tallest tree tends to be high. When that initial number is low, their guess tends to be low. This is known as the anchoring effect.

It could be useful to be aware of the possibility of this kind of anchoring effect. But there are other priming experiments that stretch the limits of credulity.

For example, there's the "Florida effect," where students were asked to assemble sentences from a set of words and then, after completing that task, were sent down a hall. Those who had been exposed to words with an "elderly" theme walked down the hallway "significantly more slowly" than other students. Unfortunately, this has failed to replicate.

Kahneman describes another experiment in which participants were presented with the word fragments W_ _ H and S_ _ P and asked to fill in the blanks:

People who were recently asked to think of an action of which they are ashamed are more likely to complete those fragments as WASH and SOAP and less likely to see WISH and SOUP. Furthermore, merely thinking about stabbing a coworker in the back leaves people more inclined to buy soap, disinfectant, or detergent than batteries, juice, or candy bars. Feeling that one's soul is stained appears to trigger a desire to cleanse one's body, an impulse that has been dubbed the "Lady Macbeth effect."

Likewise, this has failed to replicate.

Then there's the example of coffee that was sold in an office kitchen. Purchases were made on the honor system, paid into an unattended payment box. When the price list included a picture of staring eyes, contributions were greater than when it included a picture of flowers. As you may have already guessed, this "watchful eyes effect" has also failed to replicate.

As Jesse Singal points out in his book The Quick Fix, studies like these ask us to "forget much of what we already know about human behavior and decision-making." He writes:

There was a period—a brief one—when some of the smartest people on the planet simultaneously believed social-priming effects were miraculous and that they were robust and well-founded. In 2011, the Nobel Prize-winning behavioral economist Daniel Kahneman wrote about the power of social priming in Thinking, Fast and Slow, his vital book on the difference between deliberative, rational cognition and cognition based more on gut impulses or hasty assessments: "When I describe priming studies to audiences, the reaction is often disbelief." A paragraph later: "The idea you should focus on, however, is that disbelief is not an option. The results are not made up, nor are they statistical flukes. You have no choice but to accept that the major conclusions of these studies are true."

But disbelief was an option. The studies weren't true. And since the publication of Thinking, Fast and Slow, Kahneman himself has admitted that he put too much faith in studies with small sample sizes:

... I placed too much faith in underpowered studies. ... there is a special irony in my mistake because the first paper that Amos Tversky and I published was about the belief in the “law of small numbers,” which allows researchers to trust the results of underpowered studies with unreasonably small samples.

How Much Should We Conclude from Experiments?

There's a famous psychological experiment about the "invisible gorilla". Participants watch footage of teams passing basketballs. They are asked to count the number of passes made between members of the team wearing white shirts. During the video a woman in a gorilla suit walks across the court. About half the people who watch the video don't notice the gorilla at all. Kahneman notes about the experiment:

Intense focusing on a task can make people effectively blind, even to stimuli that normally attract attention.

But many of these kinds of experiments have different results when the conditions are adjusted slightly. In recent iterations of the experiment, the speed of the gorilla was changed. Participants "were more likely to spot the NYU gorilla if it was moving substantially faster than in the original 1999 experiment or if it was leaping instead of walking."

Additionally, something that is regularly left out of summaries of the original experiment is that when participants were asked to count the passes made by the team wearing black shirts instead of the white shirts, almost everyone noticed the gorilla, presumably because the dark gorilla suit matches the color they were attending to.

While it's true that intense focus can make us blind to stimuli, that blindness is significantly affected by differences in color or speed. Does this kind of experiment tell us more about how intense focus blinds us, or more about how our eyes evolved to perceive change? Anyone who has ever tried to locate an animal hidden within a landscape knows that it's much easier when the animal is moving quickly or has a distinctive color you can seek out.

Risk Aversion

There are many examples in the book of how people choose between a gamble and a sure thing, along these lines:

Which do you prefer?
A. Toss a coin. If it comes up heads you win $100, and if it comes up tails you win nothing.
B. Get $46 for sure.

Most people favor the certainty of $46 even though it is less than the $50 expected value of the gamble. Or:

You are offered a gamble on the toss of a coin.
If the coin shows tails, you lose $100.
If the coin shows heads, you win $150.

Many people decline this gamble even though its expected value is $25. People tend to be loss averse: the potential win has to be significantly larger than the potential loss before they are comfortable taking the gamble, on average 1.5 to 2.5 times as large.
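To make the arithmetic concrete, here's a minimal sketch of my own (not from the book) that computes the expected value of both gambles and applies a simple loss-aversion weighting; the ratio of 2 is just an illustrative value within the 1.5 to 2.5 range mentioned above.

```python
# Illustrative sketch, not from the book: expected values of the two
# coin-toss gambles above, plus a simple loss-aversion weighting.

def expected_value(outcomes):
    """outcomes is a list of (probability, payoff) pairs."""
    return sum(p * payoff for p, payoff in outcomes)

# Gamble 1: heads wins $100, tails wins nothing.
print(expected_value([(0.5, 100), (0.5, 0)]))     # 50.0, yet most prefer $46 for sure

# Gamble 2: heads wins $150, tails loses $100.
print(expected_value([(0.5, 150), (0.5, -100)]))  # 25.0, yet many decline it

# A loss-averse person weighs losses more heavily than gains. With an
# assumed ratio of 2 (within the 1.5-2.5 range), gamble 2 "feels" negative:
loss_aversion_ratio = 2
felt_value = 0.5 * 150 - loss_aversion_ratio * 0.5 * 100
print(felt_value)  # -25.0
```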

The book contains countless variations on these choice scenarios, to the point of becoming a little mind-numbing. The behavior they uncover is presented as surprising, as evidence of how irrational our decision-making can be. But I found many of the results unsurprising and completely understandable. As Nate Silver observes in his book On the Edge:

Something like the aversion to financial losses that Kahneman and Tversky described makes sense, for instance, when you consider that humanity has spent most of its existence at a subsistence level; it's harder to take risks when you don't have a safety net.

Trying to optimize for expected value is a good strategy if you're a poker player or stock trader who will get to make a large number of bets over time. It makes less sense in life-or-death situations where a small number of losses could be fatal.

News stories about limited financial savings are perennial. One recent survey found that most Americans can't afford a $1,000 emergency expense. It makes sense that the average person would be highly sensitive to losses when they could lead to ruin.
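As a rough illustration of why repetition and a cushion matter (my own sketch, not an example from the book, with an arbitrary $500 bankroll, 100 rounds, and 10,000 trials assumed): repeatedly taking the positive-expected-value coin-toss gamble above is great on average, but a player who can't cover a $100 loss along the way is out of the game.

```python
# Illustrative simulation, not from the book: repeatedly taking the
# win-$150/lose-$100 coin-toss gamble. The $500 starting bankroll and
# 100 rounds are arbitrary assumptions.
import random

def play(bankroll=500, rounds=100):
    for _ in range(rounds):
        if bankroll < 100:   # can no longer cover a loss: ruined
            return 0
        bankroll += 150 if random.random() < 0.5 else -100
    return bankroll

trials = 10_000
results = [play() for _ in range(trials)]
ruined = sum(1 for r in results if r == 0)
print(f"average ending bankroll: ${sum(results) / trials:,.0f}")
print(f"went broke along the way: {ruined / trials:.0%}")
```

On average the strategy comes out well ahead, but a meaningful share of runs go broke before the end; that asymmetry is exactly what someone without a safety net has to worry about.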

I'm not sure whether these choice experiments actually provide valuable lessons for good decision-making. In real life, most decisions aren't as simple as this. Even if a choice is analogous to these experiments, there are usually many additional risks to consider, such as:

  • How accurate is my estimate of the odds?
  • Will the terms be honored?
  • Will the terms change over time?
  • How will my dependents be impacted if a loss occurs?
  • Are there different short-term and long-term benefits?
  • Are there more options than what I've been presented with?
  • Is there a way to hedge against the potential loss?

Unnecessary Jargon

I know there's value in a specialized field coming up with terms that provide a shorthand for large concepts. But in psychology, old concepts are routinely given new names to sound important or for marketing purposes.

One of the core concepts in the book is "System 1" vs. "System 2" thinking:

... I describe mental life by the metaphor of two agents, called System 1 and System 2, which respectively produce fast and slow thinking. I speak of the features of intuitive and deliberate thought as if they were traits and dispositions of two characters in your mind.

I really struggled to see how these arbitrary names "System 1" and "System 2" were better than the other words used in the definition above. "Fast vs. slow thinking" or "intuitive vs. deliberate thinking" both seem much easier to understand.

A lot of the concepts seem to boil down to human behavior that society has been familiar with for a long time. There are plenty of common-sense expressions that hint at this.

The "endowment effect" describes how people overvalue items that they own. An example given in Thinking, Fast and Slow is that if you own a bottle of wine, and you wouldn't pay more than $35 for a similar bottle in a store, but you would require significantly more than $35 to part with it, this is a form of economic irrationality. But being careful with your resources is a good general practice and unsurprising behavior. Old adages like "waste not, want not" and "a bird in the hand is worth two in the bush" show how we recognize that sometimes it turns out to be more difficult than we thought to gain (or regain) something we don't have. Even something as simple as replacing a bottle of wine will incur additional small cumbersome costs like having to go to the store, find the bottle, wait in line, and then bring it home.

"Ego depletion" describes how people are more likely to give into temptation if they were recently challenged mentally. In one experiment from the book, subjects are inclined to pick "a sinful chocolate cake" over "a virtuous fruit salad" because they were just taxed with having to memorize 7 digits for 2 minutes. Again, this is an area of study that has had replication problems. But it's no shock that people may have less willpower when fatigued. People don't even need to be worn down to be enticed by a piece of cake sitting right in front of them. This is why proverbs like "out of sight, out of mind" recommend removing alluring objects from our senses altogether.

The core theme of the book, that fast, intuitive thinking often causes mistakes, is succinctly expressed in age-old sayings like "look before you leap", "patience is a virtue", "haste makes waste", and "sleep on it".

