Daniel Kahneman interview

by David Hall / 21 January, 2012

So many of us can look at a situation and get it horribly wrong. But Nobel Prize winner Daniel Kahneman says there are ways to improve your thinking to get it right.


Consider this fact: “Highly intelligent men tend to marry women who are less intelligent than they are.” Now, you probably have your suspicions why this is the case. Perhaps it’s something to do with what men and women really want, or how society shapes us, or the stupidity of intelligence tests. If so, you have overlooked the most obvious explanation of all: this outcome is statistically inevitable. Indeed, it is as inevitable as the fact that highly intelligent women tend to marry less intelligent men.

Think about it: if most people’s IQs cluster around the average, then people far above the average must, more often than not, pair up with people who are closer to it. (In maths-speak, this is called “regression to the mean”.) As long as people don’t always marry their intellectual equals, and as long as one gender is not smarter than the other, the above statement simply has to be true. So why do we overlook this obvious explanation and reach for a more controversial one first?
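For readers who want to see the statistics at work, here is a minimal simulation sketch of the marriage example. The spousal IQ correlation of 0.4 is an assumption chosen purely for illustration (any imperfect correlation produces the same pattern); the code is mine, not Kahneman’s.

```python
# Minimal simulation of regression to the mean in spousal IQ.
# The spousal correlation of 0.4 is an illustrative assumption, not a figure from the article.
import numpy as np

rng = np.random.default_rng(0)
n = 1_000_000
sd, corr = 15.0, 0.4  # IQ standard deviation; assumed, imperfect correlation between spouses

# Draw correlated IQ pairs (mean 100) from a bivariate normal distribution.
cov = [[sd**2, corr * sd**2],
       [corr * sd**2, sd**2]]
iq = rng.multivariate_normal([100.0, 100.0], cov, size=n)

# Condition on one partner being "highly intelligent" (IQ above 130).
high = iq[:, 0] > 130
print("Average IQ of the highly intelligent partners:", round(iq[high, 0].mean(), 1))
print("Average IQ of their spouses:                  ", round(iq[high, 1].mean(), 1))
# The spouses average much closer to 100 than to 130, so most are less intelligent
# than their partners - a purely statistical consequence, with no cause required.
```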

Over the past five decades, psychologist Daniel Kahneman has developed the heuristics and biases approach to human judgment, exploring problems like these. He argues that most judgments are products of heuristics: mental rules of thumb that provide us with rough and ready answers to problems.

These serve us well for much of the time, but in certain situations they lead us systematically astray. Our bias for causal explanations, as we have already seen, can steer us off course, leading us to search for causes that may well be irrelevant.

Kahneman’s approach has proven extraordinarily successful. His early research papers, co-written with collaborator Amos Tversky, remain among the most frequently cited in the social sciences. The work earned him the Nobel Prize in Economics in 2002 (Tversky died in 1996) and laid the foundations for behavioural economics, the study of how people actually make decisions, rather than how they ought to as rational agents. This in turn has further inspired a spate of recent best-sellers – Freakonomics, Nudge, Black Swan and Predictably Irrational among them – which have extended Kahneman’s influence.

Surprisingly, then, it is only now that Kahneman, aged 77, has published a book of his own for general readers: Thinking, Fast and Slow. Here is Kahneman in his own words, personable and even-handed, prodigious if a little diffuse, exposing the breadth and abundance of human daftness. The book is indispensable, its greatest strength its impressive hoard of anecdotes, mostly lifted from his research and the work of his contemporaries, but also drawing on personal experience. The Financial Times called the book a “masterpiece”; the Economist called it “profound”.

“Without introspection you don’t get anywhere as a psychologist,” he tells me over the phone from Manhattan, and in Kahneman’s field this can be a humbling route to knowledge.

For instance, in the early 1970s, he was asked by the Israeli Ministry of Education to develop a textbook and curriculum on the topic of decision making. He assembled a team of contributors, including an expert on curriculum development, and after a year of excellent progress asked his team to estimate how long the project would take to complete. Everybody agreed another two years, give or take six months.

Kahneman then asked the curriculum expert how long similar projects usually took. Somewhat embarrassed, the expert contradicted his earlier estimate, confessing that the answer was between seven and 10 years. Worse still, around 40% of such projects were never completed. Given those odds, no one would have chosen to press on, but the team – Kahneman included – carried on as if nothing had happened. The project was eventually finished eight years later, by which time the Ministry of Education had lost all interest.

As Kahneman puts it, “This embarrassing episode remains one of the most instructive experiences of my professional life.” It revealed an excess of optimism, driven by biases that Kahneman would later distinguish and classify: we plan around best-case scenarios rather than statistically probable outcomes (the “planning fallacy”), we presume rosily that favourable circumstances in the present will persist into the future (“what-you-see-is-all-there-is”) and we stubbornly refuse to trade in old beliefs for better ones (the “illusion of validity”).

Kahneman now believes these biases had a hand in the current financial crisis, producing what he describes as “a collective blindness to risk and uncertainty”. It is an intriguing suggestion and one that hints at the wider implications of his research, for economics and politics especially. As Thomas Hobbes advised long ago: read thyself; consider what you do when you think, and you will know the thoughts of others on like occasions. And this is precisely how Kahneman’s thinking has advanced, finding the gaps in human rationality by finding them first in himself.

The nub of Kahneman’s book is the distinction between two types of thinking, the intuitive and the deliberative, the “fast” and “slow” of the title. Following convention, he calls these System 1 and System 2.

System 1, the intuitive mind, is fast, automatic and effortless. Its judgments arise unbidden in our minds as feelings, seemings or fully fledged convictions.

System 2, on the other hand, is the system we inhabit as conscious creatures, the arena of reason and deliberation. It is slow and labour-intensive, it applies rules and evidence, and it demands and directs our attention.

To sense the distinction, consider the following puzzle:

A bat and ball cost $1.10. The bat costs $1 more than the ball. How much does the ball cost?

If you are like most people, your immediate answer will be 10c. But this is the wrong answer: a 10c ball and a $1.10 bat add up to a total cost of $1.20. To work out the correct answer, you need to slow down, to override the fast and frugal thinking of System 1 and employ the careful deliberation of System 2. (The answer is 5c.) We need to “pay attention”, as the idiom goes, and we pay with mental effort.
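For anyone who wants the slow, System 2 working spelled out, the arithmetic behind the 5c answer is a single line of algebra (nothing here beyond the answer the book itself gives):

```latex
% Let b be the price of the ball in dollars; the bat then costs b + 1.
\[
  b + (b + 1) = 1.10 \quad\Longrightarrow\quad 2b = 0.10 \quad\Longrightarrow\quad b = 0.05
\]
```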

Yet because the brain is an organ, this extra effort costs energy. Indeed, if, like me, you had to struggle to solve the bat and ball puzzle, you would have undergone some subtle physiological changes: your pupils would have dilated, your prefrontal cortex would have flushed with oxygen-rich blood, and your glucose consumption would have risen.

For this reason, System 2 is what Kahneman calls “a lazy controller”, abiding by “the law of least effort”. It has veto power, the power to reject an intuition and deduce a better answer, but it rarely ever does so, content to conserve energy instead by endorsing the guesstimates of System 1.

Most of the time, this works quite well. Indeed, waking life would be intolerable if we had to “make up our minds” about everything. But the speed and effortlessness of our snap judgments come at a cost to accuracy. Like some cunning riddle, the bat and ball puzzle exploits a particular weakness.

But other biases, such as our tendency to see patterns and causal relations in random events, are so broad and persistent that they’ve led some to question our very grip on reality. If our errors are not made apparent to us, if we are not trained to be sceptical, we may carry on none the wiser, “blind to our blindness” as Kahneman puts it.

So, what other cognitive illusions do we unwittingly live with? The “anchoring effect” is one: the tendency for judgments to be skewed by the presence of arbitrary numbers. For example, a study of experienced German judges showed that sentencing could be influenced by first rolling a pair of dice. When the judges rolled a three, they sentenced a (hypothetical) shoplifter to an average of five months in prison. If they rolled a nine, the average sentence was eight months.

The “availability bias” is another, the tendency for judgments to be skewed by memories that spring easily to mind. For example, Americans judge death by accident to be 300 times as likely as death by diabetes, even though the true ratio is about 1.7. This misjudgment, Kahneman argues, reflects our taste for “novelty and poignancy”, compounded by our exposure to grisly instances in the media.

Similarly, the purchase of earthquake insurance in California spikes dramatically after a tremor, then recedes over the next few years, a pattern that poorly reflects our exposure to risk but accurately reflects how memories fade. (In regards to earthquakes closer to home, this should be a reminder to be vigilant, to question whether fresh memories are driving us to overestimate risk or jolting us out of our past inaction.)

Then there is the “illusion of skill”, the tendency to see talent where there is only luck. Kahneman describes once preparing for a meeting with a financial investment firm by running some simple calculations on their data. He was searching for correlations in the year-by-year performances of each adviser, for consistency in their successes, a sign that talent was cutting through the fluctuations of the market.

Instead, Kahneman found no correlation, a result that surprised even him. Yet when he alerted the firm’s executives, suggesting that end-of-year bonuses were rewarding luck rather than competency, they took the news in their stride, neither quite denying nor accepting the fact. As one executive politely told him, “I have done very well for this firm and no one can take that away from me.” (“Well, I just did,” Kahneman remembers thinking.)

We are also more averse to losses than we are attracted to equivalent gains. Indeed, most people will only risk $100 on a coin toss if the potential win is $200, no less than twice the equivalent loss. This “loss aversion” subtly infects all sorts of decisions. For example, many people will agree to pay $5 for a lottery ticket that offers a 10% chance of winning $100 and a 90% chance of winning nothing. Far fewer, however, will accept a gamble that offers a 10% chance of winning $95 and a 90% chance of losing $5. The gambles are identical, of course, the only difference being in how they are framed, either as risking a $5 cost or a $5 loss. Yet it is the latter that invokes the stronger feelings, so we would rather be safe than sorry.
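To make that equivalence explicit, here is a minimal sketch (mine, not Kahneman’s) that enumerates the net outcomes under each framing and confirms they describe the same gamble:

```python
# The two framings of the $5 gamble describe the same payoff distribution.

# Framing 1: pay $5 for a ticket with a 10% chance of winning $100, 90% chance of nothing.
framing_cost = {100 - 5: 0.10, 0 - 5: 0.90}   # net outcomes after the $5 ticket price

# Framing 2: a 10% chance of winning $95, a 90% chance of losing $5.
framing_loss = {95: 0.10, -5: 0.90}

assert framing_cost == framing_loss            # identical outcome distributions
expected_value = sum(payoff * p for payoff, p in framing_loss.items())
print(f"Same gamble either way; expected value = ${expected_value:.2f}")
```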

And so it goes on, the reader soon swimming against a rising tide of heuristics and biases, searching for solid ground. Unfortunately, on this point, Kahneman is pessimistic.

“It is very difficult for people to detect their own errors,” he says. “You are too busy making a mistake to detect it at the same time, so I don’t believe this is practical advice for people.”

Our biases are like optical illusions, he says, but illusions of cognition rather than perception; and just like optical illusions, cognitive illusions persist even when we know about them. For example, the horizontal lines in the famous Müller-Lyer illusion (pictured right) appear to be different in length, even after we have measured them and proven to ourselves they are identical. Similarly, after many years of research, Kahneman declares his biases undimmed, as treacherous as ever. His only progress, he says, is recognising which situations will trigger them.

“The only real advice given in my book,” he says, “is that when the stakes are high, you should slow down.”

This advice is largely pragmatic. System 1 is sneaky and persuasive, so Kahneman thinks we should save our efforts for really important decisions – otherwise, minor decisions will become more unpleasant than they need to be.
He also recommends looking for an “outside view”, for a sober vantage point beyond the flurry of subjective experience.

“It helps to have friends you can consult who are sophisticated about biases,” he tells me. “Typically, when you are making a big mistake, you are stirred up and emotionally involved, so somebody who is objective and watching with a sense of irony is more likely to see what you don’t see.”

This is why Kahneman coins a term for every little quirk, finishing each chapter with common-use examples. He wants to enrich our language, so we can identify biases when we see them and alert others to mistakes they are about to make.

Checklists are another way to promote objectivity. In job interviews, for example, the “halo effect” causes people to weigh first impressions too heavily, to ignore a person’s drawbacks if they charm at first flush. In interviews and clinical consultations, Kahneman favours statistical analysis of standardised questionnaires. Holistic judgments of well-being or character, at least on their own, are overly prone to distortion.

Yet we should not take Kahneman to be suggesting that intuitions are all bad. He recognises the value of expert intuitions – the kind celebrated in Malcolm Gladwell’s Blink – that are distillations of thousands of hours of practice and experience. And our common intuitions, the kind Kahneman focuses on, are usually serviceable, if vulnerable to influence.

He also points out that intuitive judgments have the appearance of decisiveness and integrity, important qualities for public decision makers. Think about the different leadership styles of George W Bush and Barack Obama, dominated respectively by System 1 and System 2, each suffering a particular disadvantage.

“When national leaders act upon their impulses or their gut, they are likely to make mistakes,” Kahneman says. “On the other hand, when you have leaders who are overly rational, the population does not like them.”

Ultimately, Kahneman seems to want us to judge our biases on their outcome rather than their truthfulness, an important distinction to make. Our beliefs might sometimes be biased, but we should not assume that unbiased beliefs will always bear the greater rewards.

For instance, optimism produces an unrealistic view of the future, but it also prompts people to take risks that, if they are lucky, will lead to success. Furthermore, irrespective of success, optimistic people generally have stronger immunity, lower chances of depression, and a longer life expectancy. Sometimes it’s good to be not quite right.

Born into a Jewish family in 1934, Kahneman spent World War II living precariously in France. Once, while sneaking home after the six o’clock curfew, he was called over by an SS officer, and was terrified when the officer picked him up and hugged him. The man opened his wallet, speaking emotionally in German, to reveal a picture of a boy, then gave him some money and sent him on his way. The episode solidified Kahneman’s childhood hunch that people are “endlessly complicated and interesting”.

In basic economic theory, however, people are neither complicated nor interesting. Homo economicus, the standard economic agent, is actually rather dull: rational, self-interested and unchanging in tastes. For many decades, this convenient fiction has been used by economists to develop policies that produce the best possible outcomes for creatures such as these. The question these days is: do these policies also produce favourable outcomes for creatures such as us?

As a matter of science and common sense, Homo sapiens is obviously very different. We frequently fall short of rationality; we often act (like the SS officer) on sentimental motives; and we change our preferences across time and situations. The aim of behavioural economists is to introduce – or re-introduce – these empirical realities into economic theory. With the authority of science on its side, it is tempting to think, as one reviewer recently put it, that rational-choice theory is “on its way out”. Yet this somewhat misses the point.

“[Rational-choice theory] certainly isn’t dead,” Kahneman agrees. “It’s fine as a normative model, as a model of how people should act. But as a descriptive model, a theory of how they do act, it’s good enough for some purposes, not very good for others.

“There are no true theories in this field,” he continues. “It is really a matter of how useful the approximation is. Behavioural economics does much better predicting political choices or how much people will save, but in other domains, such as auctions and some interactions between sophisticated players, rational-choice theory works just fine.”

Indeed, rational-choice theory is essential to Kahneman’s work: it is the yardstick against which actual human judgment is measured. After all, our cognitive biases are biases from what is rationally optimal or statistically accurate. For good reason, then, Kahneman complains when some see him as a “prophet of irrationality”.

“It irritates me but I can’t shake that label. To me, irrational means impulsive and emotional, and that is really not what we have described.”

Nevertheless, the fact that people are often less than rational raises problems for any policy that presumes otherwise. After all, if real people enter an economic system designed for perfectly rational actors, who knows what the outcome will be.

On this point, Kahneman disagrees with classic libertarians. The practical argument for minimal states and unregulated economies is that people are rational, so we can expect them – not politicians or bureaucrats – to make decisions that best serve their interests.

For Kahneman, though, this justification fails on point of empirical fact. People systematically make choices that do not promote their interests, and biases are often to blame. Consequently, Kahneman thinks that states have a role in protecting people from their mistakes. If friends and checklists cannot deter you from poor judgment, perhaps the government can.

Cass Sunstein and Richard Thaler have extended this line of thinking in their book, Nudge. In it they explore the idea of “libertarian paternalism”, libertarian insofar as people are left to make their own choices, paternalistic insofar as choices are arranged to induce certain outcomes.

For example, instead of banning unhealthy foods in school cafeterias, governments can regulate how food is laid out, placing healthy food near the front or at eye level where it is more likely to be picked. Just as marketers use our psychology to increase consumption, governments can “nudge” citizens into making choices that promote their long-term interests, including organ donation, retirement savings and energy efficiency.

These ideas have proven popular in policy circles; notably, British Prime Minister David Cameron created a “nudge unit” soon after taking office. A major appeal is that nudges are palatable across the political divide, enabling social reform while largely avoiding the heavy-handed tactics of the “nanny state”. On a more cynical reading, though, nudges are simply an excuse for political apathy, a way to avoid taking necessary but unpopular action, while still being seen to act.

Kahneman, at least, is enthusiastic about the project: “Sunstein and Thaler are close friends of mine; I support everything they do.” He visited Britain’s “nudge unit” only recently, part of an exhausting publicity tour that is drawing to a close. (For an example of how “nudging” has been used in New Zealand, see the November 26 Listener or go to our will power article.)

But on the wider implications of his research, Kahneman is unforthcoming. He confines such discussion mostly to the last 10 pages of his 418-page book, and directs all further queries there. Fair enough, perhaps, for an empirical researcher, but it leaves some thorny questions unanswered.

For instance, what about diversity? Kahneman writes and speaks – as psychologists generally do – in a tone of universalism, as if biases are spread evenly across humanity. Yet most psychological research is conducted on a global minority: on undergraduate students from nations that are distinctively WEIRD (that is, Western, educated, industrialised, rich, and democratic).

Cross-cultural research suggests that certain intuitions vary substantially between societies and, moreover, that Western intuitions are outliers from the global average. This is arguably the case for certain economic bargaining games, but also even for the Müller-Lyer illusion, to which San foragers of the Kalahari are all but immune and Americans abnormally susceptible.

This is not necessarily a major problem for Kahneman – cultures may simply acquire biases appropriate to their surroundings – but it could restrict the reach of some experimental anecdotes. Such are the limits of introspection as a source of knowledge about all of humanity.

And what about the possibility of impartiality? Consider, for instance, a recent experiment that tracked approval rates at parole hearings among Israeli judges. Over a random selection of hearings, parole approval rates ought to remain the same, yet the approvals trailed off three times each day, once for every meal. The effect was dramatic, from an average of 65% approval directly after a meal to almost zero before the next.

So, what should the approval rate be? Or, more pertinently, what level of appetite produces an impartial decision? Should we reproach tired and hungry judges for being too severe, or well-rested judges for being overly lenient? And how about us, we who judge the judges? Are we not perhaps a little peckish also, a little “hangry” even?

Here we might share Kant’s despair: “Out of the crooked timber of humanity, no straight thing was ever made.” If everyone is susceptible to this multitude of biases, who do we turn to for the final word?

Kant turned to God, as the embodiment of reason, but we could also take our “crookedness” into our stride. The recent financial crisis is a magnificent example of the hazards of overconfidence, and hopefully an equally magnificent opportunity for self-correction. By being more honest about our shortcomings and more humble about our capabilities, we could find systems and policies better suited to people like us.

David Hall is a NZ doctoral researcher in political theory at the University of Oxford.

The Linda Problem


In their most famous experiment, Kahneman and his distinguished collaborator Amos Tversky asked subjects about Linda. Here’s how they described her:

Linda is 31 years old, single, outspoken and very bright. She majored in philosophy. As a student, she was deeply concerned with issues of discrimination and social justice, and also participated in anti-nuclear demonstrations.

Then they asked this simple question:
Which alternative is more probable:
1. Linda is a bank teller; or
2. Linda is a bank teller and is active in the feminist movement?


About 85-90% of undergraduates at several major US universities chose the second option. But logically, two things both being true can never be more probable than one of them being true on its own: there must be at least as many bank tellers as feminist bank tellers, so “bank teller” is the right answer.
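The rule being violated here is the conjunction rule of elementary probability, stated formally below (this is standard probability theory, not a detail added by the experiment):

```latex
\[
  P(\text{bank teller} \wedge \text{feminist}) \;\le\; P(\text{bank teller})
\]
```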
The naturalist Stephen Jay Gould got the answer correct but still struggled with it, saying “a little homunculus in my head continues to jump up and down, shouting at me, ‘But she can’t just be a bank teller; read the description.’” Kahneman responded that the “little homunculus” is Gould’s System 1 (fast, intuitive) mode of human reasoning speaking to him in insistent tones, adding that as in the Müller-Lyer illusion (see opposite), “the fallacy remains attractive even when you recognise it for what it is”.

The Müller-Lyer illusion


The horizontal lines appear to be different in length, even after we have measured them and proven to ourselves they are identical. San foragers of the Kalahari are all but immune to this illusion, while Americans are abnormally susceptible.

The biases that drive us


To be a good diagnostician, a doctor needs to learn the language of medicine. To be a good thinker, with a deeper understanding of our own judgments and choices and those of others, likewise requires a richer vocabulary. There are distinct patterns, says Daniel Kahneman, in the errors people make, based on the predictable biases in human thinking.

“When the handsome and confident speaker bounds onto the stage, for example, you can anticipate that the audience will judge his comments more favourably than he deserves. The availability of a diagnostic label for this bias – the halo effect – makes it easier to anticipate, recognise and understand.”

Here are some of the most common biases:

The halo effect: We put too much weight on first impressions.

The anchoring effect: We adjust our estimates to accommodate arbitrary numbers.

The availability heuristic: We base our judgments on readily available memories.

The affect heuristic: We put too much weight on judgments that are emotionally laden.

Base-rate neglect: We accept what is causally possible over what is statistically probable.

Competition neglect: We expect outcomes to be determined by our efforts alone, not the influence of competitors.

Framing effects: We vary our judgments depending on how identical information is presented or framed.

Hindsight bias: We overestimate the accuracy of our past predictions, believing that we knew it all along.

The illusion of skill: We attribute success to talent rather than luck.

The illusion of validity: We hold on to our beliefs in the face of contradictory evidence.

The planning fallacy: We plan around best-case scenarios rather than what is statistically likely.

Loss aversion: We are more averse to losses than we are attracted to equivalent gains.

Narrative fallacy: We create coherent causal stories to make sense of haphazard events.

Priming effects: We overemphasise a concept if we are “primed” with a related concept.

Representativeness bias: We lean heavily on stereotypes to compensate for partial information.

Substitution: We tackle a difficult question by answering a much simpler related question.

The sunk-cost fallacy: We keep investing in an established project because of what we have already put in, rather than judging it on its future prospects.

What-you-see-is-all-there-is (or WYSIATI): We draw strong conclusions from incomplete information.
