The future does not exist, at least not in the same way the past exists. From an evolutionary perspective, one might say there is no future in looking too far ahead. And perhaps not surprisingly, we are not very good at looking into our own distant future.
In human children, for example, the concept of the future takes much longer to crystallise than the concept of the past. Memory continues to develop well into the teenage years, but children already “get” the past at around two years old.
When we ask a four-year-old what they are going to do next week, let alone when they grow up, they lean on “script-based” knowledge (“it’s my birthday next week, so I’m going to get presents!”). Even as adults, we have no clear script for the future.
This is one of three fundamental problems that make it hard for us to sensibly think about nuclear power, whether we support or oppose it. Our natural difficulties thinking about the future, probabilities and risk all contribute to problematic attitudes towards nuclear power.
The triple crown of cognitive challenges
I had close contact with the mindset of both sides of the nuclear argument when I worked as a young journalist with a strong environmentalist leaning, and again later as a researcher on a British Nuclear Fuels-funded project at the University of St Andrews, looking at our willingness to pay to avoid the relatively tiny probability of a particular nuclear accident.
Not surprisingly, I began a PhD on self-deception at St Andrews, impressed by how invisible our cognitive shortcomings are to ourselves.
My research gave me an appreciation of the thick lenses through which we peer into the future, as well as of our deeply ingrained inability to understand probability, a concept central to understanding nuclear power. It’s a shortcoming common to experts and lay people alike, and one that can’t be completely eliminated by education.
Future tense
Nuclear power involves the future in ways that fossil fuels don’t. In the 1990s, four interdisciplinary teams of scientists were assembled to look at the possible risk of future human intrusions into nuclear waste in the “deep future”, as far as 10,000 years hence – less than half the 24,000-year half-life of plutonium.
Amongst the themes the four groups debated was how to avoid the future loss of memory about nuclear waste. Centuries after we bury nuclear waste, will we remember why we buried it, let alone where we buried it?
Amongst the scenarios they canvassed was a team of treasure hunters who, hearing myths of something valuable buried in New Mexico, delightedly interpret ancient “warnings” such as skull-and-crossbones markers as a sure sign they are on the path to buried treasure.
We find it hard to imagine the future or keep it in mind. Prospective memory – the ability to remember to do something in the future – is much less well understood than retrospective memory.
Remembering to put the rubbish out – nuclear or otherwise – is an ability that takes time to develop, and it appears to demand greater cognitive effort than remembering the past. Prospective remembering can, however, be sharply enhanced by attaching personally relevant rewards to remembering to remember. And that’s part of the problem.
The scripts we can develop about the future take more effort to build and are far weaker than the ones we develop about the past. Prior to an earthquake, it’s hard to sell earthquake insurance; after the disaster, sales boom, then taper away only gradually.
We do not treat our future selves as well as we treat our current selves, let alone future generations, simply because they are less real to us. Temporal discounting – the tendency to value rewards and punishments less the further away they lie in the future – is powerful even within the span of one’s own lifetime. Even teenagers wouldn’t smoke if the risk of lung cancer struck in their early twenties rather than decades later.
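To make the idea concrete, one standard formalisation from behavioural economics – hyperbolic discounting, offered here as an illustration rather than anything this article relies on – treats a reward of value $A$ delayed by time $D$ as worth only

$$V = \frac{A}{1 + kD},$$

where $k$ is an individually fitted discount rate: the larger $k$, the more steeply the future fades from view.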
Risky thinking
Another aspect of our thinking that can muddle our view of nuclear power is the human capacity for hope. Take a typical study of how we perceive threat: a small 1986 examination of American attitudes toward the threat of nuclear war found that less than a quarter of subjects were optimistic about the prospects for the world as a whole.
However, oddly, 50% predicted peace for North America, while an even more comforting 80% regarded their own chances within America as better than average.
Flip risk around to positive outcomes – the chance of winning a lottery prize, for example – and the very opposite pattern emerges. We stubbornly believe in magic even when we know it isn’t true.
The work of Daniel Kahneman and Amos Tversky supports the notion that risks and opportunities are perceived differently, and that experts and lay people alike suffer from these biases; indeed, much of their original research was based on pools of experts, not the general public.
Probably wrong
This is partly about motivation: we want to believe positive things about ourselves, and don’t want to believe negative things. But a more fundamental problem with risk is that we also do not understand probability well. Whether the experiments are run with people or with rats, probability registers on us simply as uncertainty.
It is not only a problem of humanity’s propensity for hope. Studies have found that even those with proficiency in probability theory consistently behave as if they misunderstand low probabilities.
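One sketch of how this misunderstanding has been modelled – the probability weighting function from Tversky and Kahneman’s later prospect theory work, given here as an illustration rather than anything cited in this article – replaces a true probability $p$ with a decision weight

$$w(p) = \frac{p^{\gamma}}{\left(p^{\gamma} + (1-p)^{\gamma}\right)^{1/\gamma}},$$

with fitted values of $\gamma$ around 0.6, so that tiny probabilities – a jackpot, or a reactor accident – loom far larger in our decisions than they should, while near-certainties are shaded down.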
“Erroneous intuitions resemble visual illusions in that they remain compellingly attractive even when the person is fully aware of their nature,” wrote Kahneman and Tversky on the stubbornness of expert fallacies. The reason even experts can’t use probability rationally when it comes to the future is that doing so would be psychologically counterproductive.
Depressed individuals are known to have more accurate self-views – unsurprising, perhaps, when some aspects of the future, such as the prospect of dying, are absolutely certain yet downright depressing.
So thinking about nuclear power evokes a perfect cognitive storm: it asks us to weigh probability and risk in the distant future – three things we do not intuitively understand well.
This does not mean we should leave decision making about nuclear power to the experts. It means that we should monitor even experts very carefully to ensure they are not making horrible, but very human, mistakes on our behalf.
Olav Muurlink is Senior Lecturer in Organisational Behaviour and Management at Central Queensland University.
This article was originally published on The Conversation. Read the original article.