
Monday 21 May 2012

More limitations of the human mind

1.  We believe what we want to believe

This is the big one.  Psychologists have argued convincingly for what many of us might already suspect intuitively: "People are more likely to arrive at those conclusions that they want to arrive at."[1]  There is, of course, a circularity here - the very fact that many of us already intuitively suspect that this is the case predisposes us to find the psychological evidence for it persuasive - but that doesn't mean that it's not true.  The phenomenon is sometimes called "motivated reasoning".  Its intellectual pedigree goes back at least to Freud, and arguably to David Hume.

To take one concrete example relating to politics, a neuroimaging study showed that supporters of Bush and Kerry in the 2004 American presidential election used regions of the brain associated with emotion rather than reason when confronted with material that reflected badly on their preferred candidate.[2]  Some particularly interesting work in this area has examined our moral attitudes, suggesting that we tend to decide moral issues intuitively rather than rationally - the reasoning that we then construct to justify our beliefs is ex post facto:
The bitterness, futility, and self-righteousness of most moral arguments can now be explicated....  [B]oth sides believe that their positions are based on reasoning about the facts and issues involved....  Both sides present what they take to be excellent arguments in support of their positions.  Both sides expect the other side to be responsive to such reasons....  When the other side fails to be affected by such good reasons, each side concludes that the other side must be closed-minded or insincere.[3]
We can't assume that intelligence alone is enough to guard against this.  Indeed, Michael Shermer argues in Why People Believe Weird Things that intelligent people have precisely the skills that are required for self-deception: "Smart people believe weird things because they are skilled at defending beliefs they arrived at for non-smart reasons."

Notes

1 - Kunda, "The Case for Motivated Reasoning", Psychological Bulletin 108 (1990), 480-498
2 - Westen, Blagov, Harenski, Kilts and Hamann, "Neural Bases of Motivated Reasoning", Journal of Cognitive Neuroscience 18 (2006), 1947-1958
3 - Haidt, "The Emotional Dog and its Rational Tail", Psychological Review 108 (2001), 814-834


2.  We don't change our minds

The evidence indicates that people would rather remain committed to their existing beliefs than be swayed by efforts to persuade them out of them.[1]  We assess new evidence in accordance with our existing attitudes, and often simply take it as confirming them.  In one classic experiment, groups of subjects with opposing views on capital punishment were shown evidence supporting both sides of the question.  Each group readily accepted the evidence supporting their existing beliefs, doubted the evidence on the other side, and ended up more convinced than ever of their own rightness.[2]  Something similar happened in a later study exploring attitudes towards homosexuality.[3]  Appealing to expert opinion doesn't help either, because our pre-existing beliefs affect our willingness to trust particular experts.[4]  These phenomena are often classified under the headings of "confirmation bias" and the "backfire effect".

The classic enquiry into our unwillingness to change our minds was undertaken by Leon Festinger and his colleagues in the 1956 book When Prophecy Fails.  Festinger found, when investigating an apocalyptic religious cult, that the failure of a crucial prophecy was quickly explained away, and that the failure increased rather than decreased the faith of the cult members.

This isn't to say that we're completely impervious to persuasive evidence - otherwise no-one would ever change their mind about anything.  But it does mean - and, again, most of us have always known this from experience - that it takes more to change people's minds than presenting seemingly credible evidence that they are wrong.

Notes

1 - Chambliss and Garner, "Do Adults Change Their Minds after Reading Persuasive Text?", Written Communication 13 (1996), 291-313
2 - Lord, Ross and Lepper, "Biased assimilation and attitude polarization: The effects of prior theories on subsequently considered evidence", Journal of Personality & Social Psychology 37 (1979), 2098-2109
3 - Munro and Ditto, "Biased Assimilation, Attitude Polarization, and Affect in Reactions to Stereotype-Relevant Scientific Information", Personality and Social Psychology Bulletin 23 (1997), 636-653
4 - Kahan, Jenkins-Smith and Braman, "Cultural Cognition of Scientific Consensus", Yale Law School Research Paper #205 

See also - Prasad et al., "'There Must Be a Reason': Osama, Saddam, and Inferred Justification", Sociological Inquiry (2009)
Marks and Fraley, "Confirmation Bias and the Sexual Double Standard", Sex Roles 54 (2006), 19-26
LaMarre, Landreville and Beam, "The Irony of Satire", International Journal of Press/Politics 14 (2009), 212-231


3.  We can't predict stuff

Lacking psychic powers, human beings are poor at predicting the future - worse, in fact, than is generally realised.

It was in 1980 that the American academic J. Scott Armstrong advanced his "Seer-sucker theory": "No matter how much evidence exists that seers do not exist, suckers will pay for the existence of seers."[1]  Perhaps the most concerted evidence backing up this theory is presented in Philip Tetlock's book Expert Political Judgment, for which he obtained over 82,000 specific predictions about future political events from 284 experts.  Tetlock's rather disturbing conclusion was that experts enjoy no special ability to predict the future - which raises the question of how many untold millions of dollars, pounds and euros are being paid to such individuals for predictions that might just as well be fed into the office shredder.  (To be fair, Tetlock did find that experts' judgment could be appreciably improved if they adopted a relatively open- rather than closed-minded approach to forecasting, but even then their performance was not particularly impressive.)

These themes have also been explored by Nassim Taleb, author of Fooled by Randomness and The Black Swan.

Notes

1 - Armstrong, "The Seer-sucker Theory: The Value of Experts in Forecasting", Marketing Papers (1980)