Confirmation bias – a BIG problem

  (12 January 10)

How do we respond when confronted with information that agrees with what we already believe? Think about it: how do you? If you're like me, you take it in quickly and smile inwardly. "I've been confirmed in my thinking."


And what about when the information goes against what you already believe? Recall this happening: how did you respond? If you're like me, you reject it, forget it, or declare it wrong, and feel a bit dark about it. "That's not what I wanted to read."


The climate change debate is one area in which this phenomenon is almost epidemic. Most of us have a point of view, yet both types of argument – for and against – appear regularly before us.


Kevin Dunbar has been studying how scientists REALLY behave in the laboratory. His research has revealed some interesting things about a part of our brain called the dorsolateral prefrontal cortex, which has evolved to suppress – yes, suppress – incoming information that we don't want to hear. Wired magazine has an excellent and quite readable article about Dunbar's research.


Confirmation bias is a BIG problem for us.


This raises the question of how we ever manage to change our views on a subject. My own views, for example on the existence of 'God', have changed considerably over the years, but that was a slow change based on the gradual accumulation of ideas, facts and arguments. In the end, maybe it wasn't as rational a process as I like to think...

Posted by Greg Spearritt

I have to add, though, that the Wired article does reinforce the value of genuine science. Yes, scientists are people with their own biases and filters, but the method of peer review and the emphasis on repeatable observation and experiment mean that in the long run the chances are good that the (always provisional) truth will come out - as it finally did for the scientists mentioned in Wired.

As physicist Robert L. Park (Superstition: Belief in an Age of Reason) puts it: "Much of the work of science consists of refining the methods of observation to avoid being deceived, including self-deception."

When 'scientific' arguments and evidence (e.g. on climate change) are put online rather than published in peer-reviewed material, there's a real risk that deception, intentional or otherwise, won't be picked up.

Posted by Greg Spearritt
