Book Summary: Thinking Fast and Slow
By Daniel Kahneman (FSG, New York: 2011)
Summarized by Erik Johnson

Daniel Kahneman’s aim in this book is to make psychology, perception, irrationality, decision making, errors of judgment, cognitive science, intuition, statistics, uncertainty, illogical thinking, stock market gambles, and behavioral economics easy for the masses to grasp. Despite his charming and conversational style, this book was difficult for me because I am accustomed to thinking fast. As a service to my fellow automatic, intuitive, error-making, fast thinkers I offer this simple (dumbed down) summary of what is a very helpful book. Writing this summary taught me how to think harder, clearer, and with fewer cognitive illusions. In short, how to think slower. Now if only I’d do it.

INTRODUCTION

This book is about the biases of our intuition. That is, we assume certain things automatically without having thought through them carefully. Kahneman calls those assumptions heuristics1 (page 7). He spends nearly 500 pages listing example after example of how certain heuristics lead to muddled thinking, giving each a name such as “halo effect,” “availability bias,” “associative memory,” and so forth. In this summary I condense Kahneman’s heuristics into a list of errors of judgment.2

PART ONE: TWO SYSTEMS

CHAPTER ONE: THE CHARACTERS OF THE STORY

Our brains are composed of two characters, one that thinks fast, System 1, and one that thinks slow, System 2. System 1 operates automatically, intuitively, involuntarily, and effortlessly—like when we drive, read an angry facial expression, or recall our age. System 2 requires slowing down, deliberating, solving problems, reasoning, computing, focusing, concentrating, considering other data, and not jumping to quick conclusions—like when we calculate a math problem, choose where to invest money, or fill out a complicated form. These two systems often conflict with one another. System 1 operates on heuristics that may not be accurate. System 2 requires effort to evaluate those heuristics and is itself prone to error. The plot of his book is how to “recognize situations in which mistakes are likely and try harder to avoid significant mistakes when stakes are high” (page 28).

1 Synonyms include “rules of thumb,” “presuppositions,” “cognitive illusions,” “bias of judgment,” “thinking errors,” “dogmatic assumptions,” “systematic errors,” “intuitive flaws.”

2 Kahneman did not number his list but I will do so for ease of understanding, citing page numbers as I go. My paragraph summaries are clear but I of course encourage interested readers to go to the book itself to read up on each heuristic in more detail.

CHAPTER TWO: ATTENTION AND EFFORT

Thinking slow affects our bodies (dilated pupils), attention (limited observation), and energy (depleted resources). Because thinking slow takes work we are prone to think fast, the path of least resistance. “Laziness is built deep into our nature” (page 35). We think fast to accomplish routine tasks and we need to think slow in order to manage complicated tasks. Thinking fast says, “I need groceries.” Thinking slow says, “I will not try to remember what to buy but write myself a shopping list.”

CHAPTER THREE: THE LAZY CONTROLLER

People on a leisurely stroll will stop walking when asked to complete a difficult mental task. Calculating while walking is an energy drain. This is why being interrupted while concentrating is frustrating, why we forget to eat when focused on an interesting project, why multi-tasking while driving is dangerous, and why resisting temptation is extra hard when we are stressed. Self-control shrinks when we’re tired, hungry, or mentally exhausted. Because of this reality we are prone to let System 1 take over intuitively and impulsively. “Most people do not take the trouble to think through [a] problem” (page 45). “Intelligence is not only the ability to reason; it is also the ability to find relevant material in memory and to deploy attention when needed” (page 46). Accessing memory takes effort but by not doing so we are prone to make mistakes in judgment.

CHAPTER FOUR: THE ASSOCIATIVE MACHINE

Heuristic #1: PRIMING. Conscious and subconscious exposure to an idea “primes” us to think about an associated idea. If we’ve been talking about food we’ll fill in the blank SO_P with a U but if we’ve been talking about cleanliness we’ll fill in the blank SO_P with an A. Things outside of our conscious awareness can influence how we think. These subtle influences also affect behavior, “the ideomotor effect” (page 53). People reading about the elderly will unconsciously walk slower. And people who are asked to walk slower will more easily recognize words related to old age. People asked to smile find jokes funnier; people asked to frown find disturbing pictures more disturbing. It is true: if we behave in certain ways our thoughts and emotions will eventually catch up. We can not only feel our way into behavior, we can behave our way into feelings. Potential for error? We are not objective rational thinkers. Things influence our judgment, attitude, and behavior that we are not even aware of.

CHAPTER FIVE: COGNITIVE EASE

Heuristic #2: COGNITIVE EASE. Things that are easier to compute, more familiar, and easier to read seem more true than things that require hard thought, are novel, or are hard to see. “Predictable illusions inevitably occur if a judgment is based on the impression of cognitive ease or strain” (page 62). “How do you know that a statement is true? If it is strongly linked by logic or association to other beliefs or preferences you hold, or comes from a source you trust and like, you will feel a sense of cognitive ease” (page 64). Because things that are familiar seem more true, teachers, advertisers, marketers, authoritarian tyrants, and even cult leaders repeat their message endlessly. Potential for error? If we hear a lie often enough we tend to believe it.

CHAPTER SIX: NORMS, SURPRISES, AND CAUSES

Heuristic #3: COHERENT STORIES (ASSOCIATIVE COHERENCE). To make sense of the world we tell ourselves stories about what’s going on. We make associations between events, circumstances, and regular occurrences. The more these events fit into our stories the more normal they seem. Things that don’t occur as expected take us by surprise. To fit those surprises into our world we tell ourselves new stories to make them fit. We say, “Everything happens for a purpose,” “God did it,” “That person acted out of character,” or “That was so weird it can’t be random chance.” Abnormalities, anomalies, and incongruities in daily living beg for coherent explanations. Often those explanations involve 1) assuming intention, “It was meant to happen,” 2) causality, “They’re homeless because they’re lazy,” or 3) interpreting providence, “There’s a divine purpose in everything.” “We are evidently ready from birth to have impressions of causality, which do not depend on reasoning about patterns of causation” (page 76). “Your mind is ready and even eager to identify agents, assign them personality traits and specific intentions, and view their actions as expressing individual propensities” (page 76). Potential for error? We posit intention and agency where none exists, we confuse causality with correlation, and we make more out of coincidences than is statistically warranted.

CHAPTER SEVEN: A MACHINE FOR JUMPING TO CONCLUSIONS

Heuristic #4: CONFIRMATION BIAS. This is the tendency to search for and find confirming evidence for a belief while overlooking counterexamples. “Jumping to conclusions is efficient if the conclusions are likely to be correct and the costs of an occasional mistake acceptable, and if the jump saves much time and effort. Jumping to conclusions is risky when the situation is unfamiliar, the stakes are high, and there is no time to collect more information” (page 79). System 1 fills in ambiguity with automatic guesses and interpretations that fit our stories. It rarely considers other interpretations. When System 1 makes a mistake System 2 jumps in to slow us down and consider alternative explanations. “System 1 is gullible and biased to believe, System 2 is in charge of doubting and unbelieving, but System 2 is sometimes busy, and often lazy” (page 81). Potential for error? We are prone to over-estimate the probability of unlikely events (irrational fears) and accept uncritically every suggestion (credulity).

Heuristic #5: THE HALO EFFECT. “This is the tendency to like or dislike everything about a person—including things you have not observed” (page 82). The warm emotion we feel toward a person, place, or thing predisposes us to like everything about that person, place, or thing. Good first impressions tend to positively color later negative impressions and conversely, negative first impressions can negatively color later positive impressions. The first to speak their opinion in a meeting can “prime” others’ opinions. A list of positive adjectives describing a person influences how we interpret negative adjectives that come later in the list. Likewise, negative adjectives listed early color later positive adjectives. The problem with all these examples is that our intuitive judgments are impulsive, not clearly thought through, or critically examined. To remind System 1 to stay objective, to resist jumping to conclusions, and to enlist the evaluative skills of System 2, Kahneman coined the abbreviation “WYSIATI”: what you see is all there is. In other words, do not lean on information based on impressions or intuitions. Stay focused on the hard data before us. Combat overconfidence by basing our beliefs not on subjective feelings but on critical thinking. Increase clear thinking by giving doubt and ambiguity their day in court.

CHAPTER EIGHT: HOW JUDGMENTS HAPPEN

Heuristic #6: JUDGMENT. System 1 relies on its intuition, the basic assessments of what’s going on inside and outside the mind. It is prone to ignore “sum-like variables” (page 93). We often fail to accurately calculate sums but rely instead on often unreliable intuitive averages. It is prone to “matching” (page 94). We automatically and subconsciously rate the relative merits of a thing by matching dissimilar traits. We are prone to evaluate a decision without distinguishing which variables are most important. This is called the “mental shotgun” approach (page 95). These basic assessments can easily replace the hard work System 2 must do to make judgments.

CHAPTER NINE: AN EASIER QUESTION

Heuristic #7: SUBSTITUTION. When confronted with a perplexing problem, question, or decision, we make life easier for ourselves by answering a substitute, simpler question. Instead of estimating the probability of a certain complex outcome we rely on an estimate of another, less complex outcome. Instead of grappling with the mind-bending philosophical question, “What is happiness?” we answer the easier question, “What is my mood right now?” (page 98). Even though highly anxious people activate System 2 often, obsessing and second-guessing every decision, fear, or risk, it is surprising how often System 1 works just fine for them. Even chronic worriers function effortlessly in many areas of life while System 1 is running in the background. They walk, eat, sleep, breathe, make choices, make judgments, trust, and engage in enterprises without fear, worry, or anxiety. Why? They replace vexing problems with easier problems. Potential for error? We never get around to answering the harder question.

Heuristic #8: AFFECT. Emotions influence judgment. “People let their likes and dislikes determine their beliefs about the world” (page 103). Potential for error? We can let our emotional preferences cloud our judgment and either under- or over-estimate risks and benefits.

PART TWO: HEURISTICS AND BIASES

CHAPTER TEN: THE LAW OF SMALL NUMBERS

Heuristic #9: THE LAW OF SMALL NUMBERS. Our brains have a difficult time with statistics. Small samples are more prone to extreme outcomes than large samples, but we tend to lend the outcomes of small samples more credence than statistics warrant. System 1 is impressed with the outcome of small samples but shouldn’t be. Small samples are not representative of large samples. Large samples are more precise. We err when we intuit rather than compute (see page 113). Potential for error? We make decisions on insufficient data.
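
A minimal simulation sketch (my own illustration, not from the book) makes the point concrete: small samples of fair coin flips land on lopsided, “extreme” proportions far more often than large samples do. The sample sizes and the 70% threshold are arbitrary choices.

```python
import random

def extreme_rate(sample_size, trials=10_000, threshold=0.7):
    """Fraction of samples whose share of heads is extreme (>= 70% or <= 30%)."""
    extreme = 0
    for _ in range(trials):
        heads = sum(random.random() < 0.5 for _ in range(sample_size))
        share = heads / sample_size
        if share >= threshold or share <= 1 - threshold:
            extreme += 1
    return extreme / trials

for n in (10, 100, 1000):
    # Small samples hit the extremes often; large samples almost never do.
    print(f"sample size {n:>4}: extreme in {extreme_rate(n):.2%} of samples")
```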

Heuristic #10: CONFIDENCE OVER DOUBT. System 1 suppresses ambiguity and doubt by constructing coherent stories from mere scraps of data. System 2 is our inner skeptic, weighing those stories, doubting them, and suspending judgment. But because disbelief requires lots of work System 2 sometimes fails to do its job and allows us to slide into certainty. We have a bias toward believing. Because our brains are pattern recognition devices we tend to attribute causality where none exists. Regularities occur at random. A coin flip of 50 heads in a row seems unnatural, but if one were to flip a coin billions and billions of times the odds are that 50 heads in a row would eventually happen. “When we detect what appears to be a rule, we quickly reject the idea that the process is truly random” (page 115). Attributing oddities to chance takes work. It’s easier to attribute them to some intelligent force in the universe. Kahneman advises that we “accept the different outcomes were due to blind luck” (page 116). Many facts in this world are due to chance and do not lend themselves to explanations. Potential for error? Making connections where none exist.
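
Here is a small sketch of that idea (mine, not Kahneman’s): measuring the longest run of identical flips in random sequences shows that impressive-looking streaks are a normal product of chance, growing roughly with the logarithm of the sequence length.

```python
import random

def longest_run(n_flips):
    """Length of the longest streak of identical outcomes in n fair coin flips."""
    longest = current = 0
    previous = None
    for _ in range(n_flips):
        flip = random.choice("HT")
        current = current + 1 if flip == previous else 1
        previous = flip
        longest = max(longest, current)
    return longest

for n in (100, 10_000, 1_000_000):
    # Longer sequences reliably contain longer "unnatural-looking" streaks.
    print(f"{n:>9} flips -> longest streak of {longest_run(n)}")
```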

CHAPTER ELEVEN: ANCHORS

Heuristic #11: THE ANCHORING EFFECT. This is the subconscious phenomenon of making incorrect estimates due to previously heard quantities. If I say the number 10 and ask you to estimate Gandhi’s age at death you’ll give a lower number than if I’d said to you the number 65. People adjust the sound of their stereo volume according to previous “anchors”: the parents’ anchor is low decibels, the teenager’s anchor is high decibels. People feel 35 mph is fast if they’ve been driving 10 mph but slow if they just got off the freeway doing 65 mph. Buying a house for $200k seems high if the asking price was raised from $180k but low if the asking price was lowered from $220k. A 15 minute wait to be served dinner in a restaurant seems long if the sign in the window says, “Dinner served in 10 minutes or less” but fast if the sign says, “There is a 30 minute wait before dinner will be served.” Potential for error? We are more suggestible than we realize.

CHAPTER TWELVE: THE SCIENCE OF AVAILABILITY

Heuristic #12: THE AVAILABILITY HEURISTIC. When asked to estimate numbers like the frequency of divorces in Hollywood, the number of dangerous plants, or the number of deaths by plane crash, the ease with which we retrieve an answer influences the size of our answer. We’re prone to give bigger answers to questions that are easier to retrieve. And answers are easier to retrieve when we have had an emotional personal experience. One who got mugged over-estimates the frequency of muggings, one exposed to news about school shootings over-estimates the number of gun crimes, and the one who does chores at home over-estimates the percentage of the housework they do. When both parties assume they do 70% of the housework somebody is wrong because there’s no such thing as 140%! A person who has experienced a tragedy will over-estimate the potential for risk, danger, and a hostile universe. A person untroubled by suffering will under-estimate pending danger. When a friend gets cancer we get a check-up. When nobody we know gets cancer we ignore the risk. Potential for error: under- or over-estimating the frequency of an event based on ease of retrieval rather than statistical calculation.

CHAPTER THIRTEEN: AVAILABILITY, EMOTION, AND RISK

Heuristic #13: AVAILABILITY CASCADES. When news stories pile up our statistical senses get warped. A recent plane crash makes us think air travel is more dangerous than car travel. The more we fear air travel the more eager news reporters are to sensationalize plane crashes. A self-reinforcing feedback loop is set in motion, a cascade of fear. “The emotional tail wags the rational dog” (page 140). Potential for error? Overreacting to a minor problem simply because we hear a disproportionate number of negative news stories relative to positive ones.

CHAPTER FOURTEEN: TOM W’S SPECIALTY

Heuristic #14: REPRESENTATIVENESS. Similar to profiling or stereotyping, “representativeness” is the intuitive leap to make judgments based on how similar something is to something we like without taking into consideration other factors: probability (likelihood), statistics (base rate), or sampling sizes. Baseball scouts used to recruit players based on how closely their appearance resembled other good players. Once players were recruited based on actual statistics the level of play improved. Just because we like the design of a book cover doesn’t mean we’ll like the contents. You can’t judge a book by its cover. A start-up restaurant has a low chance of survival regardless of how much you like their food. Many well-run companies keep their facilities neat and tidy, but a well-kept lawn is no guarantee that the occupants inside are organized. To discipline our lazy intuition we must make judgments based on probability and base rates, and question our analysis of the evidence used to come up with our assumption in the first place. “Think like a statistician” (page 152). Potential for error: evaluating a person, place, or thing on how much it resembles something else without taking into account other salient factors.
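
Thinking like a statistician here means combining the base rate with the evidence instead of relying on resemblance alone. A minimal Bayes’ rule sketch, with every number invented purely for illustration:

```python
# Hypothetical numbers: suppose only 20% of new restaurants survive five years
# (the base rate), great first meals are common among survivors (80%) but
# also fairly common among eventual failures (40%).
base_rate = 0.20                # P(survive)
p_great_given_survive = 0.80    # P(great meal | survive), assumed
p_great_given_fail = 0.40       # P(great meal | fail), assumed

# Bayes' rule: P(survive | great meal)
numerator = p_great_given_survive * base_rate
evidence = numerator + p_great_given_fail * (1 - base_rate)
posterior = numerator / evidence
print(f"P(survive | great first meal) = {posterior:.0%}")  # ~33%: still unlikely
```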

CHAPTER FIFTEEN: LINDA: LESS IS MORE

Heuristic #15: THE CONJUNCTION FALLACY (violating the logic of probability). After hearing priming details about a made-up person (Linda), people chose a plausible story over a probable story. Logically, it is more likely that a person will have one characteristic than two characteristics. That is, after reading a priming description of Linda respondents were more likely to give her two characteristics, which is statistically improbable. It is more likely Linda would be a bank teller (one characteristic) than a bank teller who is a feminist (two characteristics). “The notions of coherence, plausibility, and probability are easily confused by the unwary” (page 159). The more details we add to a description, forecast, or judgment the less likely they are to be probable. Why? System 1 thinking overlooks logic in favor of a plausible story. Potential for error: committing a logical fallacy, when our intuition favors what is plausible but improbable over what is implausible and probable.
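
The underlying logic is just the conjunction rule: P(A and B) can never exceed P(A). A tiny sketch with made-up probabilities:

```python
# Probabilities are invented for illustration only.
p_bank_teller = 0.05               # P(Linda is a bank teller)
p_feminist_given_teller = 0.30     # P(feminist | bank teller), assumed
p_both = p_bank_teller * p_feminist_given_teller

print(f"P(bank teller)              = {p_bank_teller:.3f}")   # 0.050
print(f"P(bank teller AND feminist) = {p_both:.3f}")          # 0.015
assert p_both <= p_bank_teller  # adding detail can only shrink probability
```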

CHAPTER SIXTEEN: CAUSES TRUMP STATISTICS

Heuristic #16: OVERLOOKING STATISTICS. When given purely statistical data we generally make accurate inferences. But when given statistical data and an individual story that explains things we tend to go with the story rather than the statistics. We favor stories with explanatory power over mere data. Potential for error: stereotyping, profiling, and making general inferences from particular cases rather than making particular inferences from general cases.

CHAPTER SEVENTEEN: REGRESSION TO THE MEAN

Heuristic #17: OVERLOOKING LUCK. Most people love to attach causal interpretations to the fluctuations of random processes. “It is a mathematically inevitable consequence of the fact that luck played a role in the outcome….Not a very satisfactory theory—we would all prefer a causal account—but that is all there is” (page 179). When we remove causal stories and consider mere statistics we’ll observe regularities, what is called regression to the mean. Those statistical regularities—regression to the mean—are explanations (“things tend to even out”) but not causes (“that athlete had a bad day but is now ‘hot’”). “Our mind is strongly biased toward causal explanations and does not deal well with ‘mere statistics’” (page 182). Potential for error: seeing causes that don’t exist.
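
A small simulation (my own sketch, not from the book) shows regression to the mean with no cause at all: model each performance as stable skill plus random luck, and the stars of round one drift back toward their skill level in round two.

```python
import random

random.seed(1)
skill = [random.gauss(100, 10) for _ in range(1000)]     # stable ability
round1 = [s + random.gauss(0, 10) for s in skill]        # skill + luck
round2 = [s + random.gauss(0, 10) for s in skill]        # skill + fresh luck

# Pick the 50 best performers from round 1 and re-measure them in round 2.
top = sorted(range(1000), key=lambda i: round1[i], reverse=True)[:50]
mean = lambda xs: sum(xs) / len(xs)
print(f"top 50, round 1 average:  {mean([round1[i] for i in top]):.1f}")
print(f"same 50, round 2 average: {mean([round2[i] for i in top]):.1f}")
# Round 2 is lower: the stars were skilled AND lucky, and luck doesn't repeat.
```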

CHAPTER EIGHTEEN: TAMING INTUITIVE PREDICTIONS

Heuristic #18: INTUITIVE PREDICTIONS. Conclusions we draw with strong intuition (System 1) feed overconfidence. Just because a thing “feels right” (intuitive) does not make it right. We need System 2 to slow down and examine our intuition, estimate baselines, consider regression to the mean, evaluate the quality of evidence, and so forth. “Extreme predictions and a willingness to predict rare events from weak evidence are both manifestations of System 1. It is natural for the associative machinery to match the extremeness of predictions to the perceived extremeness on which it is based—this is how substitution works” (page 194). Potential for error: unwarranted confidence when we are in fact in error.
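
Kahneman’s corrective procedure in this chapter can be summarized as: start at the baseline and move toward your intuitive estimate only in proportion to how well the evidence actually predicts the outcome. A minimal sketch, with hypothetical numbers:

```python
def regressed_prediction(baseline, intuitive_estimate, correlation):
    """Dampen an intuitive prediction by the evidence's predictive validity."""
    return baseline + correlation * (intuitive_estimate - baseline)

# Hypothetical: average GPA is 3.0, a glowing first impression suggests 3.8,
# but first impressions correlate with final outcomes at only about 0.3.
print(regressed_prediction(baseline=3.0, intuitive_estimate=3.8, correlation=0.3))
# -> 3.24, a less extreme and better-calibrated prediction
```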

PART THREE: OVERCONFIDENCE

CHAPTER NINETEEN: THE ILLUSION OF UNDERSTANDING

Heuristic #19: THE NARRATIVE FALLACY. In our continuous attempt to make sense of the world we often create flawed explanatory stories of the past that shape our views of the world and expectations of the future. We assign larger roles to talent, stupidity, and intentions than to luck. “Our comforting conviction that the world makes sense rests on a secure foundation: our almost unlimited ability to ignore our ignorance” (page 201). This is most evident when we hear, “I knew that was going to happen!” Which leads to…

Heuristic #20: THE HINDSIGHT ILLUSION. We think we understand the past, which implies the future should be knowable, but in fact we understand the past less than we believe we do. Our intuitions and premonitions feel more true after the fact. Once an event takes place we forget what we believed prior to that event, before we changed our minds. Prior to 2008 financial pundits predicted a stock market crash but they did not know it. Knowing means showing something to be true. Prior to 2008 no one could show that a crash was true because it hadn’t happened yet. But after it happened their hunches were retooled and became proofs. “The tendency to revise the history of one’s beliefs in light of what actually happened produces a robust cognitive illusion” (page 203). Potential for error: “We are prone to blame decision makers for good decisions that worked out badly and to give them too little credit for successful moves that appear obvious only after the fact. When the outcomes are bad, the clients often blame their agents for not seeing the handwriting on the wall—forgetting that it was written in invisible ink that became legible only afterward. Actions that seemed prudent in foresight can look irresponsibly negligent in hindsight” (page 203).

CHAPTER TWENTY: THE ILLUSION OF VALIDITY

Heuristic #21: THE ILLUSION OF VALIDITY. We sometimes confidently believe our opinions, predictions, and points of view are valid when confidence is unwarranted. Some even cling with confidence to ideas in the face of counter-evidence. “Subjective confidence in a judgment is not a reasoned evaluation of the probability that this judgment is correct. Confidence is a feeling, which reflects the coherence of the information and the cognitive ease of processing it” (page 212). Factors that contribute to overconfidence: being dazzled by one’s own brilliance, affiliating with like-minded peers, and overvaluing our track record of wins while ignoring our losses. Potential for error: basing the validity of a judgment on the subjective experience of confidence rather than objective facts. Confidence is no measure of accuracy.

CHAPTER TWENTY-ONE: INTUITIONS VS. FORMULAS

Heuristic #22: IGNORING ALGORITHMS. We overlook statistical information and favor our gut feelings. Not good! Forecasting (predicting the future of stocks, diseases, car accidents, and weather) should not be influenced by intuition, but it often is. And intuition is often wrong. We do well to consult checklists, statistics, and numerical records and not rely on subjective feelings, hunches, or intuition. Potential for error: “relying on intuitive judgments for important decisions if an algorithm is available that will make fewer mistakes” (page 229).
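
Kahneman’s point is that even a crude formula, applied consistently, tends to beat global intuitive impressions; he describes structuring interviews by scoring a handful of traits independently and then adding them up. A minimal equal-weights sketch (the factors and ratings here are hypothetical):

```python
def interview_score(candidate):
    """Sum a few pre-chosen factors, rated 1-5, with equal weights."""
    factors = ("technical", "reliability", "communication")  # assumed factors
    return sum(candidate[f] for f in factors)

candidates = [
    {"name": "A", "technical": 4, "reliability": 5, "communication": 3},
    {"name": "B", "technical": 5, "reliability": 2, "communication": 5},
]
for c in sorted(candidates, key=interview_score, reverse=True):
    print(c["name"], interview_score(c))  # rank by score, not global impression
```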

CHAPTER TWENTY-TWO: EXPERT INTUITION: WHEN CAN YOU TRUST IT?

Intuition means knowing something without knowing how we know it. Kahneman’s understanding is that intuition is really a matter of recognition, being so familiar with something we arrive at judgments quickly. Chess players “see” the chess board, firefighters “know” when a building is about to collapse, art dealers “identify” marks of forgeries, parents have a “sixth sense” when their kids are in danger, readers “read” letters and words quickly, and friends “are familiar” with their friends from a distance. Kids become experts at video games, motorists become expert drivers, and chefs become intuitive cooks. How? Recognition—either over long periods of exposure, or quickly in a highly emotional event (accidents). Intuition is immediate pattern recognition, not magic.

Heuristic #23: TRUSTING EXPERT INTUITION. “We are confident when the story we tell ourselves comes easily to mind, with no contradiction and no competing scenario. But ease and coherence do not guarantee that a belief held with confidence is true. The associative machine is set to suppress doubt and to evoke ideas and information that are compatible with the currently dominant story” (page 239). Kahneman is skeptical of experts because they often overlook what they do not know. Kahneman trusts experts when two conditions are met: the expert is in an environment that is sufficiently regular to be predictable, and the expert has learned these regularities through prolonged practice. Potential for error: being misled by “experts.”

CHAPTER TWENTY-THREE: THE OUTSIDE VIEW

Heuristic #24: THE PLANNING FALLACY means taking on a risky project—litigation, war, opening a restaurant—confident of the best case scenario without seriously considering the worst case scenario. If we consult others who’ve engaged in similar projects we’ll get the outside view. Failure to do this increases the potential for failure. Cost overruns, missed deadlines, loss of interest, and waning urgency all result from poor planning. Potential for error: “making decisions based on delusional optimism rather than on a rational weight of gains, losses, and probabilities” (page 252). In other words, poorly planned grandiose projects will eventually fail.

CHAPTER TWENTY-FOUR: THE ENGINE OF CAPITALISM

Heuristic #25: THE OPTIMISTIC BIAS. We are prone to neglect facts, others’ failures, and what we don’t know in favor of what we know and how skilled we are. We believe the outcome of our achievements lies entirely in our own hands while neglecting the luck factor. We don’t appreciate the uncertainty of our environment. We suffer from the illusion of control and neglect to look at the competition (in business start-ups for example). “Experts who acknowledge the full extent of their ignorance may expect to be replaced by more confident competitors, who are better able to gain the trust of clients” (page 263). Being unsure is a sign of weakness so we turn to confident experts who may be wrong. Potential for error: unwarranted optimism which doesn’t calculate the odds and therefore could be risky.

PART FOUR: CHOICES

CHAPTER TWENTY-FIVE: BERNOULLI’S ERRORS

Heuristic #26: OMITTING SUBJECTIVITY. We often think an object has only intrinsic objective value. A million dollars is worth a million dollars, right? Wrong. Magically making a poor person’s portfolio worth a million dollars would be fabulous! Magically making a billionaire’s portfolio worth a million dollars would be agony! One gained, the other lost. Economists have erred by failing to consider a person’s psychological state regarding value, risk, anxiety, or happiness. The 18th century economist Bernoulli thought money had utility (fixed worth) but he failed to consider a person’s reference point. Potential for error: making decisions on pure logic without considering psychological states.

Heuristic #27: THEORY-INDUCED BLINDNESS. “Once you have accepted a theory and used it as a tool in your thinking, it is extraordinarily difficult to notice its flaws. If you come upon an observation that does not seem to fit the model, you assume that there must be a perfectly good explanation that you are somehow missing” (page 277). When the blinders fall off, the previously believed error seems absurd and the real breakthrough occurs when you can’t remember why you didn’t see the obvious. Potential for error: clinging to old paradigms that have outlived their validity.

CHAPTER TWENTY-SIX: PROSPECT THEORY

Kahneman’s claim to fame is Prospect Theory (for which he won the Nobel Prize in economics). Economists used to believe that the value of money was the sole determinant in explaining why people buy, spend, and gamble the way they do. Prospect Theory changed that by explaining three things: 1) the value of money is less important than the subjective experience of changes in one’s wealth. In other words, the loss or gain of $500 is psychologically positive or negative depending on a reference point, how much money one already has. 2) We experience diminished sensitivity to changes in wealth: losing $100 hurts more if you start with $200 than if you start with $1000. And 3) we are loath to lose money!
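
Kahneman and Tversky later formalized these three points in a value function. The sketch below uses the parameter estimates from their 1992 paper rather than anything quoted in this summary, so treat it as a standard textbook illustration, not the book’s own presentation.

```python
def value(x, alpha=0.88, lam=2.25):
    """Kahneman-Tversky value function: x is a gain/loss vs. a reference point."""
    if x >= 0:
        return x ** alpha              # diminishing sensitivity to gains
    return -lam * ((-x) ** alpha)      # losses loom larger (lambda > 1)

print(f"value of +$500: {value(500):+.0f}")   # ~ +237
print(f"value of -$500: {value(-500):+.0f}")  # ~ -534: losing hurts ~2.25x more
```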

Heuristic #28: LOSS AVERSION. “You just like winning and dislike losing—and you almost certainly dislike losing more than you like winning” (page 281). System 1 thinking compares the psychological benefit of gain with the psychological cost of loss, and the fear of loss usually wins. Potential for error: passing by a sure win in order to avoid what we think might be a possible loss even when the odds are in favor of winning.

CHAPTER TWENTY-SEVEN: THE ENDOWMENT EFFECT

Heuristic #29: THE ENDOWMENT EFFECT. An object we own and use is more valuable to us than an object we don’t own and don’t use. Such objects are endowed with significance and we’re unwilling to part with them for two reasons: we hate loss, and the object has a history with us. Thus we won’t sell a beloved, useful object unless a buyer offers significant payment. Objects we don’t like or use sell for less (or we even give them away). Potential for error: clinging to objects for sentimental reasons at considerable loss of income.

CHAPTER TWENTY-EIGHT: BAD EVENTS

Heuristic #30: LOSS AVERSION. People will work harder to avoid losses than to achieve gains. Golfers putt more accurately for par, to avoid a bogey (losing a stroke by going over par), than for a birdie (gaining a stroke by putting under par). Contract negotiations stall when one party feels they’re making more concessions (losses) than their disputant. People will work harder to avoid pain than to achieve pleasure. Even animals fight more fiercely to maintain territory than to increase territory. Potential for error: underestimating our own and others’ attitudes toward loss and gain. They are asymmetrical.

CHAPTER TWENTY-NINE: THE FOURFOLD PATTERN

Heuristic #31: THE POSSIBILITY EFFECT. When highly unlikely outcomes are weighted disproportionately more than they deserve, we fall prey to the possibility effect. Think of buying lottery tickets.

Heuristic #32: THE CERTAINTY EFFECT. Outcomes that are almost certain are given less weight than their probability justifies. Think of lawyers who offer a “less than perfect” settlement before a trial which would result in an “almost certain victory.”

Heuristic #33: THE EXPECTATION PRINCIPLE. The two heuristics above have this in common: “decision weights that people assign to outcomes are not identical to the probabilities of these outcomes, contrary to the expectation principle” (page 312).

THE FOURFOLD PATTERN

HIGH PROBABILITY (certainty effect):
GAINS: 95% chance to win $10,000. Fear of disappointment; risk averse; accept unfavorable settlement.
LOSSES: 95% chance to lose $10,000. Hope to avoid loss; risk seeking; reject favorable settlement.

LOW PROBABILITY (possibility effect):
GAINS: 5% chance to win $10,000. Hope of large gain; risk seeking; reject favorable settlement.
LOSSES: 5% chance to lose $10,000. Fear of large loss; risk averse; accept favorable settlement.

This means people attach values to gains and losses rather than wealth, and decision weights assigned to outcomes are different from probabilities. The fourfold pattern of preferences accounts for this; a numerical sketch follows the list below. Potential for error:

1. People are risk averse when they look at the prospects of a large gain. They’ll lock in a sure gain and accept a less than expected value of the gamble.
2. When the prize is extremely large, such as a lottery jackpot, the buyer is indifferent to the fact that their chance of winning is extremely small. Without the ticket they cannot win, but with the ticket they can at least dream.
3. This explains why people buy insurance. We’ll pay for insurance because we’re buying protection and peace of mind.
4. This explains why people take desperate gambles. They accept a high probability of making things worse for a slight ray of hope of avoiding the loss they are facing. This type of risk taking can turn a bad situation into a disaster.
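
Here is the promised numerical sketch: compare each gamble’s objective expected value with a “felt” value computed from distorted decision weights. The weights are illustrative, roughly matching the estimates Kahneman reports (a 95% chance “feels” like 79%, a 5% chance like 13%).

```python
# Illustrative decision weights for the fourfold pattern.
decision_weight = {0.95: 0.79, 0.05: 0.13}

gambles = [
    (0.95,  10_000, "95% chance to WIN  $10,000 (certainty effect)"),
    (0.05,  10_000, " 5% chance to WIN  $10,000 (possibility effect)"),
    (0.95, -10_000, "95% chance to LOSE $10,000 (desperate gambles)"),
    (0.05, -10_000, " 5% chance to LOSE $10,000 (buying insurance)"),
]
for p, outcome, label in gambles:
    expected = p * outcome                  # objective expected value
    felt = decision_weight[p] * outcome     # value under distorted weights
    print(f"{label}: expected {expected:+,.0f}, felt {felt:+,.0f}")
```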

CHAPTER THIRTY: RARE EVENTS

Heuristic #34: OVERESTIMATING THE LIKELIHOOD OF RARE EVENTS. It makes more sense to pay attention to things that are likely to happen (rain tomorrow) than to things that are unlikely to happen (terrorist attacks, asteroids, terminal illness, floods and landslides). We tend to overestimate the probabilities of unlikely events, and we tend to overweight the unlikely events in our decisions. This heuristic joins forces with the availability cascade (#13) and cognitive ease (#2) heuristics above. We are more likely to choose the alternative in a decision which is described with explicit vividness, repetition, and relative frequencies (vs. how likely). Potential for error: succumbing to fear mongers who manipulate data in favor of their cause.

CHAPTER THIRTY-ONE: RISK POLICIES

Heuristic #35: THINKING NARROWLY. Most of us are so risk averse we avoid all gambles. This is wrong, says Kahneman, since some gambles are clearly on our side and by avoiding them we lose money. One way to decrease risk aversion is to think broadly, looking at the aggregate wins over many small gambles. Thinking narrowly, looking only at short-term losses, paralyzes us. But thinking broadly is non-intuitive. It’s a System 2 task that takes work. We are therefore wired by System 1 to think irrationally about economics (saying no to easy money). The limits of human rationality are so stark that Kahneman calls the ideal of logical consistency a “hopeless mirage” (page 335); it is not achievable by our limited minds. Potential for error: passing by risks in our favor.
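
To see how broad framing changes the picture, here is a simulation sketch assuming a favorable gamble of the kind discussed in this chapter: a fair coin that pays $200 on heads and costs $100 on tails. One flip feels risky; a bundle of 100 flips almost never loses.

```python
import random

def bundle_outcome(n_gambles=100):
    """Total result of n independent win-$200 / lose-$100 fair coin flips."""
    return sum(200 if random.random() < 0.5 else -100 for _ in range(n_gambles))

trials = 10_000
losing = sum(bundle_outcome() < 0 for _ in range(trials))
print(f"chance the 100-flip bundle loses money: {losing / trials:.2%}")
# Expected gain is $5,000 per bundle; the chance of an overall loss is tiny.
```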

CHAPTER THIRTY-TWO: KEEPING SCORE

Many have a System 1 calculator in their head that “keeps score” not only of the potential financial gains and losses of a transaction but also of the emotional risks, rewards, and possible regrets of our financial decisions. “The emotions that people attach to the state of their mental accounts are not acknowledged in standard economic theory” (page 343).

Heuristic #36: THE DISPOSITION EFFECT. We are often willing to sell money-earning stocks because it makes us feel like wise investors, and less willing to sell losing stocks because it’s an admission of defeat. This is irrational since we’d earn more money by selling the losers and clinging to the winners.

Heuristic #37: THE SUNK COST FALLACY. To avoid feeling bad about cutting our losses and being called a failure, we tend to throw good money after bad, stay too long in abusive marriages, and stay in unhappy careers. This is optimism gone haywire.

Heuristic #38: FEAR OF REGRET. Regret is an emotion we’re familiar with and we do well to avoid making decisions that lead to regret. However, we’re terrible at predicting how intense those feelings of regret will be. It often hurts less than we think.

CHAPTER THIRTY-THREE: REVERSALS

Heuristic #39: IGNORING JOINT EVALUATIONS. We make decisions differently when asked to make them in isolation than when asked to make them in comparison with other scenarios. For example, a victim in a robbery will be awarded a higher compensation when there are poignant factors involved (the victim was visiting a store he rarely visited), but will be awarded a lower compensation if harmed while in his usual shopping location. When locations are compared (joint evaluation) we realize the victim’s location is insignificant and we reverse our original compensation amount. “Joint evaluation highlights a feature that was not noticeable in single evaluation but is recognized as decisive when detected” (page 359). Potential for error: making decisions in isolation. We should do comparison shopping, compare sentences for crimes, and compare salaries for different jobs. Failure to do so limits our exposure to helpful norms.

CHAPTER THIRTY-FOUR: FRAMES AND REALITY

Heuristic #40: IGNORING FRAMES. How a problem is framed determines our choices more than purely rational considerations would imply. More drivers agree to donate their organs when donation is the default and they must check a box to opt out than when they must check a box to opt in. We are more willing to pay extra for gas when using a credit card (vs. cash) if the fee is framed as “loss of cash discount” rather than “added credit card surcharge.” Doctors prefer interventions whose outcomes are framed as a “one month survival rate of 90%” over interventions whose outcomes are framed as a “10% mortality rate.” Both sentences mean the same thing statistically, but the frame of “survival” has greater emotional value than “mortality rates.” “The meaning of a sentence is what happens in your associative machinery while you understand it…In terms of the associations they bring to mind—how System 1 reacts to them—the two sentences really ‘mean’ different things” (page 363). “Reframing is effortful and System 2 is lazy” (page 367). Potential for error: thinking we make decisions in an objective bubble when in fact there are subjective factors at work about which we are unaware.

PART FIVE: TWO SELVES

CHAPTER THIRTY-FIVE: TWO SELVES

Heuristic #41: IGNORING OUR TWO SELVES. We each have an “experiencing” self and a “remembering” self. The latter usually takes precedence over the former. That is, I can experience 13 days of vacation bliss but if on the 14th day things go bad I tend to remember the vacation as negative. My memory overrides my experience. Same with a 40 minute blissful record which ends with a scratch. We remember the scratch sound, not the 39 previous minutes of musical enjoyment. “Confusing experience with the memory of it is a compelling cognitive illusion—and it is the substitution that makes us believe a past experience can be ruined. The experiencing self does not have a voice” (page 381).

Heuristic #42: THE PEAK END RULE. How an experience ends seems to hold greater weight in our memory than how the experience was lived moment by moment. Similar to the previous heuristic, the peak end rule is shorthand for remembering an experience by how it felt at its most intense moment (the peak) and at its end, not by the experience as a whole.
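
A toy illustration of the rule, using my own simplification of remembered value as the average of the most intense moment and the last moment:

```python
def remembered(ratings):
    """Peak-end approximation: average the most intense moment and the final one."""
    peak = max(ratings, key=abs)
    return (peak + ratings[-1]) / 2

vacation = [8] * 13 + [1]      # 13 blissful days, one bad final day
print(f"lived average:     {sum(vacation) / len(vacation):.1f}")  # 7.5
print(f"remembered rating: {remembered(vacation):.1f}")           # 4.5, dragged down by the end
```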

Heuristic #43: DURATION NEGLECT. Another corollary of the two selves: the duration of an unpleasant or pleasant experience doesn’t seem to be as important as the memory of how painful or pleasurable the experience was.

CHAPTER THIRTY-SIX: LIFE AS HISTORY

Heuristic #44: NARRATIVE WHOLENESS (my user-friendly name). When we evaluate how well our own and others’ lives have been lived we do well to consider the whole narrative and not just the end. But because of the previous three heuristics we are prone to devalue a long, sacrificial, generous life if at the end (or even after death) we discover episodes of selfishness, etc. “A story is about significant events and memorable moments, not about time passing. Duration neglect is normal in a story, and the ending often defines its character” (page 386). Potential for error: paying more attention to longevity than quality, making decisions based on how memorable an experience will be rather than how exciting and enriching the experience itself will be, and forfeiting our reputation for integrity for a moment of pleasure.

CHAPTER THIRTY-SEVEN: EXPERIENCED WELL BEING

Heuristic #45: VALUING A REMEMBERING SELF OVER AN EXPERIENCING SELF. Since most of us rely on unreliable memories, we do well to keep in mind what our experiences were like during them, not just at their conclusion. How many of our waking moments are spent in unpleasant emotions or negative states? They are hard to recall! “Our emotional state is largely determined by what we attend to, and we are normally focused on our current activity and immediate environment” (page 394). A person stuck in traffic can still be happy because they’re in love, or a person who is grieving may still remain depressed while watching a comedy. Potential for error: not paying attention to what we are doing, letting experiences happen without reflection, and going with the flow with no attempt to alter our schedules, activities, or experiences.

CHAPTER THIRTY-EIGHT: THINKING ABOUT LIFE

Heuristic #46: AFFECTIVE FORECASTING. Which factor leads to a happier life: duration or experiences? Would a 20 year life with many happy experiences be better than a 60 year life with many terrible experiences? Which would you rather be: happy or old? We are terrible at predicting what will make us happy. When asked the very difficult question, “Overall, how happy is your life?” we substitute an easier question, “How happy am I right now?” (See heuristic #7.) “…the responses to global well-being questions should be taken with a grain of salt” (page 399). People make decisions based on what will make them happy in the future but when it’s achieved the happiness doesn’t last. We don’t know our future selves very well.

Heuristic #47: THE FOCUSING ILLUSION. “Nothing in life is as important as it is when you are thinking about it” (page 402). This means when we’re asked to evaluate a decision, life satisfaction, or preference we err if we focus on only one thing. How we answer, “What would make you happy?” depends on many factors and rarely is one factor determinant. Yet folks regularly focus on one issue—income, weather, health, relationships, pollution, etc.—and ignore other important factors. “How much pleasure do you get from your car?” depends on how much you value the stereo, mileage, looks, age, cost, comfortable seats, tilt of steering wheel, etc. The fact is, our evaluations are often based on the heuristic that while we are thinking of a thing we generally think better of it, forgetting how infrequently we actually think about those things (income, weather, health, stereo, mileage, looks, etc.). What initially strikes our fancy is absorbed into daily living; we adapt, we acclimate, we experience the initial pleasure less intensely as time progresses. “The remembering self is subject to a massive focusing illusion about the life that the experiencing self endures quite comfortably” (page 406).

Heuristic #48: MISWANTING (Daniel Gilbert’s phrase). We exaggerate the effect of a significant purchase or changed circumstances on our future well-being. Things that are initially exciting eventually lose their appeal.

CONCLUSIONS

SUMMARY OF THE TWO SELVES. It’s absurd that people willingly choose longer periods of more total pain that end mildly over shorter periods of less pain that end terribly. But such are the powers of heuristics #41, #42, #43, and #45.

SUMMARY OF ECONS AND HUMANS. Kahneman made infrequent mentions of “econs and humans” so I do not emphasize them in my book summary. Here’s the gist of his complaint. Economists (“the Chicago school”) operate on the assumption that consumers are rational (“internally consistent,” “logically coherent,” “adhering to rules of logic,” page 411) and always will do the rational thing. If not, that’s their loss. Kahneman, as a behavioral economist, of course disagrees and suggests that heuristics influence our choices in ways that are irrational and counterintuitive; we need help making better choices. The Chicago school are libertarians who want government to keep out of the way and let people make their own choices, good or bad (provided they don’t hurt others). Behavioral economists suggest giving people a nudge is sometimes necessary (regulation, writing clearer contracts, truth in advertising, etc.).

SUMMARY OF TWO SYSTEMS. “This book has described the workings of the mind as an uneasy interaction between two fictitious characters: the automatic System 1 and the effortful System 2” (page 415).

SYSTEM 1 vs. SYSTEM 2

SYSTEM 1: Subconscious values, drives, and beliefs that influence our “gut reactions.”
SYSTEM 2: Articulates judgments, makes choices, endorses or rationalizes ideas and feelings.

SYSTEM 1: Jumps to conclusions regarding causality.
SYSTEM 2: Makes up stories to either confirm or deny those conclusions.

SYSTEM 1: Operates effortlessly.
SYSTEM 2: Requires conscious effort to engage.

SYSTEM 1: Can be wrong but is more often right.
SYSTEM 2: Can be wrong or right depending on how hard it works.

SYSTEM 1: Influenced by heuristics.
SYSTEM 2: Examines those heuristics when so inclined.

“The way to block errors that originate in System 1 is simple in principle: recognize the signs that you are in a cognitive minefield, slow down, and ask for reinforcement from System 2” (page 417).
