Sunk Costs and Expensive Beliefs

While it has come up once or twice, I rarely if ever see the Sunk Cost Fallacy invoked to explain aspects of believers' behaviour. This seems a shame, because it strikes me as doing a great deal of work in explaining whether or not a person will change their mind when confronted with new information.

Let's look at it from a biological point of view. What is the advantage, to a person's mind or to a gene coding for behaviour, of indulging the sunk cost fallacy in intellectual matters?

The first thing to admit is that finding and keeping truths is a tool that evolved for the benefit of genes. It would be nice if our neurological equipment were better designed for intellectual matters, but this same equipment was also selected for survival and reproduction, and whenever the two clash, survival and reproduction trump fact-finding.

Curiosity, for instance, quickly vanishes when danger threatens or security is undermined. This idea underpins Bowlby's attachment theory, which suggested that children switch between exploratory and comfort-seeking behaviours depending on factors such as whether the environment is unfamiliar, whether the mother is present, how long she stays (youngsters generally stopped seeking comfort and went back to exploring if mum stayed long enough), and the child's personality style. Ainsworth tested this idea with an experiment called the "Strange Situation", which in one study found that about two-thirds of the children tested behaved as predicted.

The second thing to admit is that a sunk cost is only a fallacy if the lost investment cannot yield a payoff contingent on further investment. In other words, if throwing good money after bad merely increases the loss or has no effect, it is a case of the fallacy. It is not a case of the fallacy if throwing in more money succeeds in turning a loss into a gain, or in avoiding a worse loss. What looks like the fallacy in one situation may be a rational move in another. If you cannot predict all the situations in advance, you may be better off throwing good money in case it is not after bad, in which case you are using a heuristic rather than committing a fallacy. Genetic strategies often rely on heuristics; this forms part of the basis for Optimal Foraging Theory.
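To make the distinction concrete, here is a minimal sketch of the decision rule in Python, with invented numbers. The point is simply that the sunk amount appears nowhere in the test; only the expected future payoff and the future cost do.

    # Toy decision rule: continue investing only if the expected future payoff
    # beats the future cost. The sunk amount deliberately appears nowhere in the test.
    # All numbers are invented for illustration.

    def should_continue(p_success, payoff, future_cost):
        """Further investment is rational (on expected value) if this returns True."""
        return p_success * payoff > future_cost

    # Fallacy case: the project has almost no remaining chance of paying off.
    print(should_continue(p_success=0.01, payoff=100.0, future_cost=10.0))  # False

    # Rational case: the same past losses, but further spending can still turn them around.
    print(should_continue(p_success=0.30, payoff=100.0, future_cost=10.0))  # True

A blanket "always keep going" heuristic amounts to a bet that situations of the second kind are common enough to outweigh the losses incurred in situations of the first kind.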

For example, if a predator expends a lot of energy chasing prey yet fails to catch it, its efforts to catch prey will continue, possibly becoming more urgent with each failure. In this case, chasing after a sunk cost by stalking more prey is not an example of the fallacy, because the increased investment of time and energy in procuring food pays off if the animal catches a meal just once, which is a feasible outcome of the next hunt. Big cats do indeed behave like this. Figures like one success in every ten hunting excursions are typical of such predators, with the odds varying from species to species (lions improve their chances by working as a team).
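As a rough illustration with made-up energy figures: at a one-in-ten success rate the expected number of chases per kill is ten, so persistence pays whenever one kill yields more energy than about ten chases cost, and that calculation looks the same however many chases have already failed.

    # Rough energy budget for a persistent predator; all figures are invented.
    success_rate = 0.10        # one kill per ten chases, on average
    cost_per_chase = 50.0      # energy burned per attempt (arbitrary units)
    energy_per_kill = 4000.0   # energy gained from one successful hunt

    expected_chases_per_kill = 1 / success_rate                         # 10 attempts on average
    expected_cost_per_kill = expected_chases_per_kill * cost_per_chase  # 500 units

    # Persistence is rational whenever a kill repays the expected cost of obtaining it,
    # regardless of how much energy earlier failed chases have already consumed.
    print(energy_per_kill > expected_cost_per_kill)  # True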

How do these two processes work in cases where people believe things that are contradicted by new information? Given the concession that intellectual matters can be trumped by survival or reproductive concerns, it's possible that believing certain things but not others will constitute a survival or reproductive advantage. This does not have to be content-dependent.

It might, for instance, pay an organism to believe the things its fellow organisms do, perhaps because this reduces disputes and eases cooperation in a social species. And being able to move up the social ladder, whatever that locally requires, would give one access to more desirable mates. In effect, knowing the right stuff makes you more attractive to potential partners, or at least more attractive than people who don't know the right stuff or who know the wrong stuff. The self-perpetuating nature of this system has more than a little in common with sexual selection, specifically Zahavi's handicap principle, by which a display of unfakeable genetic fitness (say, unimpaired mental faculties) spreads along with the ability to discern that fitness at face value. It is telling that having to negate a proposition is slightly less pleasant than reinforcing a positive one, suggesting that pleasure in belief is taken as a sign that one is doing something right. Indeed, this is a fair description of motivated reasoning, and support for the idea comes from neuroscientific studies like this one:

To investigate the process of adopting or rejecting religious beliefs and how it relates with religiosity, we performed a nonparametric analysis. Disagreeing (compared to agreeing) with religious statements among religious (compared to nonreligious) participants engaged bilateral anterior insulae and middle cingulate gyri. The anterior insulae are key areas for emotional-cognitive integration (45), and insular recruitment for rejecting religious beliefs implies a greater role of emotions in the process. Religious subjects may have experienced negative emotions triggered by religious disagreement, such as aversion, guilt, or fear of loss (23, 46, 47), perhaps because the stakes for detecting and rejecting religious statements inconsistent with their religious beliefs were higher in this group. For the same reason, this group may have experienced higher cognitive conflict manifested by middle cingulate gyri recruitment (48, 49).

It takes energy and time to reconfigure the brain to accept information, especially during childhood. That investment becomes a sunk cost if later evidence suggests it was a waste of resources, and this is where sunk-cost reasoning comes in.

Since believing something is an investment of cognitive and temporal resources, and moreover a social investment in the local culture, any sign that one is wrong constitutes not just a lost investment but a social threat. If what one believes is incorrect, then one has been wasting one's time and mental resources on it. A decision arises: switch, or stick?

There are many disadvantages to switching. For one thing, the new information has to be integrated into the mind, and that requires more mental effort than usual. Switching is also a risky social act; after all, you got your pre-existing beliefs from the people you have spent most of your life with (peers, mainly), who constitute your best local chances for reproduction in most situations. Nor can the new expenditure be rushed into hastily: energy might be wasted chasing an idea that turns out to be wrong, and the less information one has, the more the jump becomes a matter of guesswork and foolhardiness.

Sticking, on the other hand, is more likely to pay off. For one thing, you fit right in with your peers, and the group enjoys the advantages of internal cohesion that a more divided group's members miss out on. For another, it costs less energy, and the odds are that the belief is serviceable enough to get you through a lifetime. A belief about, say, cosmology or how the world works is not going to have much impact on your survival and reproductive strategy for the most part, except to enhance them among fellow believers. And this enhancement occurs within a social species that evolved language and, according to studies, spends most of its talking time discussing other people's good and bad social behaviour (i.e. gossiping). Social benefit is key to understanding the phenomenon.
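A toy comparison of the two options, with invented weights standing in for the pressures just described, might look like the sketch below; nothing hangs on the particular numbers, only on the shape of the trade-off.

    # Toy expected-cost comparison of sticking with a belief versus switching.
    # Every number is an assumption chosen only to illustrate the shape of the trade-off.
    recalibration_cost = 5.0     # mental effort of integrating the new information
    social_penalty = 20.0        # expected cost of breaking with one's peer group
    p_new_belief_wrong = 0.3     # risk that the replacement belief is also wrong
    cost_of_wrong_belief = 10.0  # cost of holding a wrong belief about a remote matter

    cost_of_switching = (recalibration_cost + social_penalty
                         + p_new_belief_wrong * cost_of_wrong_belief)
    cost_of_sticking = cost_of_wrong_belief  # the old belief stays wrong, but is socially cheap

    print(cost_of_sticking < cost_of_switching)  # True with these weights

With weights like these, sticking comes out cheaper; switching only starts to win when the cost of holding the wrong belief is large relative to the social penalty, which for most beliefs about remote matters it arguably is not.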

The sunk cost fallacy comes in when you take a more scientifically minded and broader view. Acquiring beliefs over time requires an expensive mental and neurological commitment. Changing beliefs is painful, as many here can (I guess) attest after long, gradual, and emotional journeys from religion to non-religion. Exploring and analyzing beliefs is not encouraged by intuitive thought processes, and we should by now be familiar with intuition's unconscious power to lead the way at the expense of more cautious kinds of thinking. Studies collected in the Wikipedia article on the subject tend to show that:

The idea that analytical thinking makes one less likely to be religious is an idea supported by other early studies on this issue,[11] including a report from Harvard University.[10] First of all, the Harvard researchers found evidence suggesting that all religious beliefs become more confident when participants are thinking intuitively (atheists and theists each become more convinced). Thus reflective thinking generally tends to create more qualified, doubted belief.

On the other hand, the Harvard study found that participants who tended to think more reflectively were less likely to believe in God.[10] Reflective thinking was further correlated with greater changes in beliefs since childhood: these changes were towards atheism for the most reflective participants, and towards greater belief in God for the most intuitive thinkers. The study controlled for personality differences and cognitive ability, suggesting the differences were due to thinking styles - not simply IQ or raw cognitive ability.[10] An experiment in the study found that participants moved towards greater belief in God after writing essays about how intuition yielded a right answer or reflection yielded a wrong answer (and conversely, towards atheism if primed to think about either a failure of intuition or success of reflection). The authors say it is all evidence that a relevant factor in religious belief is thinking style.[10] The authors add that, even if intuitive thinking tends to increase belief in God, "it does not follow that reliance on intuition is always irrational or unjustified."[10]

The energy and time invested in beliefs that turn out to be false require correspondingly more energy and time to undo. If our more recent prehistoric hominid ancestors had good reasons, as a heuristic, not to undo them, then even where their modern descendants would have been better off not inheriting the strategy, genes will simply be too slow and too uninterested to catch up.

I think this sunk-costs heuristic view should make us more sympathetic to people who are caught up, through traditions, fundamentalism, and the trap of their own cultures, in a mental rut which they might not even be aware exists. This apparently irrational behaviour becomes the unfortunate consequence of a rational strategy once you consider the evolutionary and neurological accounts of costs and decisions involved.

It also poses a chilling question even for those of us who like to think we are more rational than they are: perhaps we too fall prey to the same affliction somewhere in our own lives? This is one reason why being impartial and disinterested is so crucial to science, and yet why individual scientists can end up indulging a pet hypothesis far more than the case for it can justify.

TAGGED: PSYCHOLOGY, REASON, RELIGION

