Years ago, I got into a lengthy but pretty pointless e-mail argument about climate science. On paper, I had no chance. My opponent was a pre-eminent chemist with a long publication list and a reputation for debunking bad science in his field – including exposing a high-profile 'breakthrough' which made the mainstream media. But the debate soon settled into a pattern that was the inverse of what you would expect.
He, the scientist, would challenge me with unsupported 'evidence' copied and pasted from right-wing libertarian US websites (his own politics were firmly left of centre), and I, the layman, with much-needed signposting from SkepticalScience.com, would come back with peer-reviewed research which debunked his debunking. Eventually, he half-backed down with a heavily caveated admission that maybe, just maybe, carbon emissions were driving long-term climatic trends.
The question that has bugged me ever since is "How could someone so clever be so dumb?" Reading Nobel Prize-winning psychologist Daniel Kahneman's superlative book Thinking, Fast and Slow this summer has answered my question – and it's all about how our brains really work.
Sciences such as chemistry are carried out under very controlled laboratory conditions. After years of working in a particular field, practitioners build up a strong intuition for what is probable and improbable (the 'fast' thinking of the title). They can generally trust their intuition as long as they keep to what they know.
More unpredictable, real-world, highly complex issues like economics and politics (and, almost certainly, climate) cannot be judged by hunches – 'expert' pundits in these fields are wrong in their off-the-top-of-the-head predictions far more often than they are right. Kahneman argues that in such disciplines even the simplest mathematical model based on data from past experience (ie slow, analytical thinking) will comfortably outperform expert intuition.
My colleague had got himself into a vicious cycle of trying to back up his intuition by grasping at anything, no matter what its provenance, that supported it. This is classic 'confirmation bias' – where instead of the analytical part of our brain keeping the intuitive part in check, it tries to find evidence to justify the hunch. His biggest mistake was backing his scientific intuition over the knowledge of others (those climatologists I was quoting) in a field he knew little about – a little humility would have saved him a lot of embarrassment.
I've believed for a long time that psychology is the missing piece in the sustainability puzzle. Nothing will change unless people start making different decisions – whether that's choosing to recycle a cardboard box at home or setting ambitious national climate targets. And if you want to encourage people to make different decisions, you have got to learn more about how they make those decisions and what might change them.
I can't adequately summarise Kahneman's book here, except to say that it will change the way you think about how other people think. For example, if you try to force change on someone, their brain will exaggerate the downsides and ignore the benefits. However, if people reach the same conclusion by themselves, that flips around – they exaggerate the benefits and downplay the risks. I have made a career out of doing this – facilitating change rather than proposing it – and now I know why it works!