Why smart people are stupid

Here’s a simple arithmetic question: A bat and ball cost a dollar and ten cents. The bat costs a dollar more than the ball. How much does the ball cost?

The vast majority of people respond quickly and confidently, insisting the ball costs ten cents. This answer is both obvious and wrong. (The correct answer is five cents for the ball and a dollar and five cents for the bat.)
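The arithmetic can be verified with a short script (a sketch; the function name and the choice to work in cents are mine, to keep the numbers exact):

```python
def solve_bat_and_ball(total=110, difference=100):
    """Solve: bat + ball = total, and bat = ball + difference (all in cents)."""
    ball = (total - difference) // 2
    bat = ball + difference
    assert bat + ball == total  # sanity check on the constraint
    return bat, ball

bat, ball = solve_bat_and_ball()
print(bat, ball)  # 105 5, i.e. a dollar five for the bat, five cents for the ball

# The intuitive answer fails: a 10-cent ball forces a $1.10 bat,
# for a total of $1.20, not $1.10.
assert (10 + 100) + 10 != 110
```

Substituting the shortcut answer back into the problem's two conditions is what most respondents skip, which is precisely the point of the question.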

For more than five decades, Daniel Kahneman, a Nobel Laureate and professor of psychology at Princeton, has been asking questions like this and analyzing our answers. His disarmingly simple experiments have profoundly changed the way we think about thinking. While philosophers, economists, and social scientists had assumed for centuries that human beings are rational agents—reason was our Promethean gift—Kahneman, the late Amos Tversky, and others, including Shane Frederick (who developed the bat-and-ball question), demonstrated that we’re not nearly as rational as we like to believe.

When people face an uncertain situation, they don’t carefully evaluate the information or look up relevant statistics. Instead, their decisions depend on a long list of mental shortcuts, which often lead them to make foolish decisions. These shortcuts aren’t a faster way of doing the math; they’re a way of skipping the math altogether. Asked about the bat and the ball, we forget our arithmetic lessons and instead default to the answer that requires the least mental effort.

Although Kahneman is now widely recognized as one of the most influential psychologists of the twentieth century, his work was dismissed for years. Kahneman recounts how one eminent American philosopher, after hearing about his research, quickly turned away, saying, “I am not interested in the psychology of stupidity.”

The philosopher, it turns out, got it backward. A new study in the Journal of Personality and Social Psychology led by Richard West at James Madison University and Keith Stanovich at the University of Toronto suggests that, in many instances, smarter people are more vulnerable to these thinking errors. Although we assume that intelligence is a buffer against bias—that’s why those with higher S.A.T. scores think they are less prone to these universal thinking mistakes—it can actually be a subtle curse.

West and his colleagues began by giving four hundred and eighty-two undergraduates a questionnaire featuring a variety of classic bias problems. Here’s an example:

In a lake, there is a patch of lily pads. Every day, the patch doubles in size. If it takes 48 days for the patch to cover the entire lake, how long would it take for the patch to cover half of the lake?

Your first response is probably to take a shortcut and divide the final answer in half. That leads you to twenty-four days. But that’s wrong. The correct solution is forty-seven days: since the patch doubles every day, the lake must be half covered exactly one day before it is fully covered.
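The doubling logic can be checked by stepping backward from the final day (a sketch; the function name is mine):

```python
def days_to_cover_half(days_to_cover_full=48):
    """If coverage doubles daily, find the day the lake is half covered."""
    coverage = 1.0  # fraction of the lake covered on the final day
    day = days_to_cover_full
    while coverage > 0.5:
        coverage /= 2  # yesterday's patch was half of today's
        day -= 1
    return day

print(days_to_cover_half())  # 47
```

One backward step suffices: halving full coverage once lands exactly on half coverage, which is why the answer is the day before the last, not the midpoint of the schedule.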

West also gave a puzzle that measured subjects’ vulnerability to something called “anchoring bias,” which Kahneman and Tversky had demonstrated in the nineteen-seventies. Subjects were first asked if the tallest redwood tree in the world was more than X feet, with X ranging from eighty-five to a thousand feet. Then the students were asked to estimate the height of the tallest redwood tree in the world. Students exposed to a small “anchor”—like eighty-five feet—guessed, on average, that the tallest tree in the world was only a hundred and eighteen feet. Given an anchor of a thousand feet, their estimates increased seven-fold.

But West and colleagues weren’t simply interested in reconfirming the known biases of the human mind. Rather, they wanted to understand how these biases correlated with human intelligence. As a result, they interspersed their tests of bias with various cognitive measurements, including the S.A.T. and the Need for Cognition Scale, which measures “the tendency for an individual to engage in and enjoy thinking.”
