Is uncertainty the new risk?

This paper, one of many on uncertainty in climate models, makes me wonder if scientists’ increasing willingness to talk about uncertainty is making it the new ‘risk’. According to Brian Wynne’s classic typology, there are four types of incomplete knowledge: risk, where we know the odds and the range of outcomes; uncertainty, where we don’t know the odds but we know what sorts of things might happen; ignorance, the Rumsfeldian world in which we don’t know what we don’t know; and indeterminacy, where we can’t see any links at all between knowledge and implications.

For more than twenty years, Wynne has tirelessly described how scientific cultures tend to rationalise uncertainty and ignorance in terms of risk. So issues that bring massive knowledge gaps, ethical conflicts and contested visions of the future are often presented as though they are narrowly about risk (think GM crops). The science policy world, prompted by people like Bob May, has begun to realise the limits of risk as a frame, and has gradually become more comfortable with the idea of uncertainty. But it seems as though this too is being scientised.

I watched much of this meeting last year at the Royal Society. It included people as diverse as Tim Palmer, Mervyn King, John Krebs and David Spiegelhalter. But throughout the meeting, there was a surprising consensus that these uncertainties were relatively easy to solve through more research, better characterisation and increased computer power. Mervyn King, for example, explained the credit crunch as an outlier event – but still within the range of probabilities (his talk is here – search for Royal Society). The fact that it almost fell off the edge of their probability fans – the familiar fan-chart pictures – was seen as nothing abnormal.

The scientisation of uncertainty presents huge problems for policy. It suggests that policy problems can be solved by throwing more science at them. And it assumes that current scientific trajectories are the right ones. People like Dan Sarewitz have argued that, in the climate debate, more science makes things worse, not better. If we take uncertainties seriously as policy issues, we need to change how we think about them: take action rather than just commissioning more research, and prompt new sorts of insight.


About Jack Stilgoe

Jack Stilgoe is a senior lecturer in science policy at the department of Science and Technology Studies, University College London.

15 Responses to Is uncertainty the new risk?

  1. Very interesting Jack, thanks. I think also that uncertainty is just a euphemism for ignorance a lot of the time and, as you say, carries with it the whiff of possibility that it may be solved, when it actually won’t be. It also allows people to duck the difficult issues by asking for more science until it is no longer their problem. But of course by then more people with more divergent opinions have piled in, making the decision harder. This will of course get much harder as we supposedly involve the public in decision-making processes and, as I have said before, we may not quite get what we bargained for when/if we do.

    I have come to the conclusion that more leadership and better mechanisms for flexibility, changes of course and downright U-turns are best. But of course that then looks like top-down authoritarianism, and we are back where we started.

  2. Spot on Jack – uncertainty is without doubt the new “risk” – especially at the frontiers of technology. And there is an increasing move for the science community to own this space. But there is also little doubt that a much more integrative approach – one involving all stakeholders – will be needed if responsive approaches to dealing with uncertainty are to emerge.

  3. STEM_Wonk says:

    I completely agree; there comes a point where wise decisions need to be made despite the state of asymmetric information. Here’s a podcast I found that does a great job of explaining this term from economics, and I can see how it relates to our actions on climate change: http://www.backsidesmack.com/2011/07/time-is-money-episode-6-adam-smith-did-not-invent-the-pin-factory/

  4. Another dimension that it might be useful for science policy and STS people to consider here involves the changing ways in which scientists think about probability, and in particular the recent growth in popularity of Bayesian approaches. The classical distinctions between risk, uncertainty and ignorance have a very different character if you are a Bayesian, to whom probability is a provisional and subjective quantity that is necessarily and easily modified by new information, than if you are a frequentist, for whom probability is an objective quantity with an operational, experimental definition. Bayesianism is an enormously powerful and seductive way of dealing with the problem of incomplete information, and its growth probably helps explain why scientists have become more comfortable talking about uncertainty, though I suspect it may be associated with some under-explored downsides.
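
    A minimal sketch of that contrast, in Python, using a coin-toss example invented purely for illustration (the 7-out-of-10 data and the Beta(1, 1) prior are assumptions, not anything from the meeting or the paper):

        from scipy import stats

        heads, tosses = 7, 10

        # Frequentist reading: probability is an objective long-run frequency,
        # so the natural estimate is simply the observed proportion.
        freq_estimate = heads / tosses

        # Bayesian reading: probability is a provisional degree of belief.
        # A Beta(1, 1) prior updated with the data gives a Beta(8, 4) posterior,
        # which is easily revised again as new information arrives.
        posterior = stats.beta(1 + heads, 1 + tosses - heads)

        print(freq_estimate)              # 0.7
        print(posterior.mean())           # ~0.67, pulled slightly toward the prior
        print(posterior.interval(0.95))   # a 95% credible interval for the bias

    The same data leave the frequentist with a point estimate and a sampling story, while the Bayesian carries a whole distribution of belief forward into the next update, which is part of what makes the approach so seductive for incomplete information.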

  5. Jack Stilgoe says:

    Thanks all. I am interested that this picture is recognisable from the scientific side as well. I hear terms like “Bayesian” thrown around, and I nod from time to time, but I’ve never got what it’s all about. I see a project that needs doing, for which you have all inadvertently volunteered.

  6. Adam Hyland says:

    Many thanks to STEM_Wonk for the link, but asymmetric information as such doesn’t apply here. What we are looking at is the intersection between modeling and risk/uncertainty. Most of the time we can only really distinguish risk from uncertainty (as defined above) ex post, and then only as “model risk” – the likelihood that the model itself doesn’t capture all pertinent sources of contingency. Most of the time the discussion hasn’t moved beyond projecting the last few years of activity with some permutations on relevant inputs. See this NYT article for some dour commentary:

    http://www.nytimes.com/2007/03/04/business/yourmoney/04view.html

    Models like these have trouble with inflection points *and* volatility shocks. If you look at the distribution of equity returns for the last 20 years you can pick out the crises from a mile off, because the distribution of returns spreads out (whereas in normal times, good and bad, it is more tightly distributed).
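
    As a rough sketch of that last point (assuming only an illustrative CSV of daily closing prices, here called "prices.csv" with a "close" column), a rolling standard deviation of returns is enough to make the crises stand out:

        import pandas as pd

        # Daily closing prices for some equity index; the file name and column
        # are placeholders for whatever data you have to hand.
        prices = pd.read_csv("prices.csv")["close"]
        returns = prices.pct_change().dropna()

        # A 60-day rolling standard deviation of daily returns. In normal times,
        # good and bad, it stays tight; around crises it jumps visibly.
        rolling_vol = returns.rolling(window=60).std()

        print(rolling_vol.describe())
        print(rolling_vol.nlargest(5))   # the spikes line up with crisis periods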

  7. I definitely agree. If nuclear risk characterized the most recent generation of science policy scholarship, then climate change is the paragon of uncertainty.

  8. Pingback: Is uncertainty the new risk? | Management of complex systems and projects | Scoop.it

  9. Tim Johnson says:

    Firstly, Wynne’s typology looks very much like Knight’s (Risk, Uncertainty and Profit, I.1.26, 1921) and Knight’s distinctions cause mathematicians a lot of problems. For a mathematician (and the OED) a risk is a negative consequence of an uncertain outcome. To equate risk with uncertainty seems to be a driver for more scientific research; we need more data because data reduces uncertainty and uncertainty is bad. Knight’s work is interesting in that he identifies uncertainty as the driver of innovation; for a mathematician (and an entrepreneur) the opposite of risk is opportunity, not certainty. Knight’s work is important because he re-introduced uncertainty into the economic analysis of entrepreneurs.

    A general problem is that there is no consensus on what is meant by seemingly familiar terms, such as risk, uncertainty, probability, and expectation. If an economist talks of uncertainty, they may not be understood by a statistician. When a Bayesian talks of probability they mean something different to a frequentist, and as a (mathematical) probabilist, I think both of those approaches are limited. An English mathematician talks of ‘expectation’, a French one of ‘espérance’ (hope).

    However, language is a symptom of a deeper malaise – does science really understand uncertainty? It is canonical that Descartes created absolute space and Newton absolute time, but what is less well appreciated is that the concept of “absolute chance” emerged simultaneously with mathematical probability. It has been noted (by Hald and others) that it is striking that the pioneers of probability – Pascal, Huygens, the Bernoullis, Montmort, de Moivre and Bayes – were all Augustinians of one degree or another; they believed in predestination (Newton and Leibniz were not, and did virtually no work in probability). A consequence of Augustinian predestination was that nothing was ‘random’; it was just that humans had no knowledge of God’s plan. This theistic view was secularised by Laplace’s replacement of God with his Demon. The message, however, was the same: “the truth is out there”.

    Knight’s great achievement was to identify that, in the economy at least, there was pure ‘chance’: events that were unpredictable. This point was understood by Cicero, who distinguished between the predictable (eclipses), the foreseeable (the weather) and the random (discovering a treasure). It is this insight that makes the financial markets so important to science: they provide a laboratory of ‘pure chance’, Cicero’s ‘random’ events, and the challenge to science is whether it can create the tools to come to terms with them. Personally, I don’t think resorting to Baconian/Laplacian models of “more data, more calculations” will be the answer; rather, some serious research on characterising the problems, and developing the mathematics to describe them, needs to be done.

  10. Jack Stilgoe says:

    Thanks Tim,

    There’s a huge amount in there, which will take me some time to digest. But I really appreciate the depth of your explanation. On your first point, about Frank Knight, you’re bang on. Wynne and Stirling’s typology was inspired by Frank Knight and Keynes, both of whom punctured classical economic models by describing the things that we don’t know, and may never know. These had been dismissed by economists as marginal, or as soluble through market action or research. Keynes and Knight showed that they were fundamental and wouldn’t go away.

  11. Pingback: Is uncertainty the new risk? | strategic learning | Scoop.it

  12. Pingback: Is uncertainty the new risk? | Teaching in the XXI century | Scoop.it

  13. Pingback: Climate science and the trial of the Danaids | Responsible Innovation

  14. Jack Stilgoe says:

    UPDATE: The papers from this discussion meeting have now been published at http://rsta.royalsocietypublishing.org/content/369/1956.toc. Some of them are even Open Access…

  15. Pingback: Last Night’s Cafe: The Future for Wales – Democracy » Cardiff Philosophy Cafe Blog
