Acknowledging AI’s dark side

This is the text of a recent letter published in Science, for those who can’t get past the paywall.
Science 4 September 2015:
Vol. 349 no. 6252 p. 1064
DOI: 10.1126/science.349.6252.1064-c

The 17 July special section on Artificial Intelligence (AI) (p. 248), although replete with solid information and ethical concern, was biased toward optimism about the technology.

The articles concentrated on the roles that the military and government play in “advancing” AI, but did not include the opinions of any political scientists or technology policy scholars trained to think about the unintended (and negative) consequences of governmental steering of technology. The interview with Stuart Russell touches on these concerns (“Fears of an AI pioneer,” J. Bohannon, News, p. 252), but, as a computer scientist, Russell focuses his solutions on improved training. Yet even the best training will not protect against market or military incentives to stay ahead of competitors.

Likewise double-edged was M. I. Jordan and T. M. Mitchell’s desire “that society begin now to consider how to maximize” the benefits of AI as a transformative technology (“Machine learning: Trends, perspectives, and prospects,” Reviews, p. 255). Given the grievous shortcomings of national governance and the even weaker capacities of the international system, it is dangerous to invest heavily in AI without political processes in place that allow those who support and oppose the technology to engage in a fair debate.

The section implied that we are all engaged in a common endeavor, when in fact AI is dominated by a relative handful of mostly male, mostly white and East Asian, mostly young, mostly affluent, highly educated technoscientists and entrepreneurs and their affluent customers. A majority of humanity is on the outside looking in, and it is past time for those working on AI to be frank about it.

The rhetoric was also loaded with positive terms. AI presents a risk of real harm, and any serious analysis of its potential future would do well to unflinchingly acknowledge that fact.

The question posed in the collection’s introduction—“How will we ensure that the rise of the machines is entirely under human control?” (“Rise of the machines,” J. Stajic et al., p. 248)—is the wrong question to ask. There are no institutions adequate to “ensure” it. There are no procedures by which all humans can take part in the decision process. The more important question is this: Should we slow the pace of AI research and applications until a majority of people, representing the world’s diversity, can play a meaningful role in the deliberations? Until that question is part of the debate, there is no debate worth having.

Christelle Didier (1), Weiwen Duan (2), Jean-Pierre Dupuy (3), David H. Guston (4), Yongmou Liu (5), José Antonio López Cerezo (6), Diane Michelfelder (7), Carl Mitcham (8), Daniel Sarewitz (9), Jack Stilgoe (10), Andrew Stirling (11), Shannon Vallor (12), Guoyu Wang (13), James Wilsdon (11), Edward J. Woodhouse (14),*

  1. Lille University, Education, Lille, 59653, France.
  2. Institute of Philosophy, Chinese Academy of Social Sciences, Beijing, 100732, China.
  3. Department of Philosophy, Ecole Polytechnique, Paris, 75005, France.
  4. School for the Future of Innovation in Society, Arizona State University, Tempe, AZ 85287-5603, USA.
  5. Department of Philosophy, Renmin University of China, Beijing, 100872, China.
  6. Department of Philosophy, University of Oviedo, Oviedo, Asturias, 33003, Spain.
  7. Department of Philosophy, Macalester College, Saint Paul, MN 55105, USA.
  8. Liberal Arts and International Studies, Colorado School of Mines, Golden, CO 80401, USA.
  9. Consortium for Science, Policy, and Outcomes, Arizona State University, Washington, DC 20009, USA.
  10. Department of Science and Technology Studies, University College London, London, WC1E 6BT, UK.
  11. Science Policy Research Unit, University of Sussex, Falmer, Brighton, BN1 9SL, UK.
  12. Department of Philosophy, Santa Clara University, Santa Clara, CA 95053, USA.
  13. Department of Philosophy, Dalian University of Technology, Dalian, 116024, China.
  14. Department of Science and Technology Studies, Rensselaer Polytechnic Institute, Troy, NY 12180, USA.

  *Corresponding author. E-mail:

A tale of two trials

This story is, I think, an interesting example of Responsible Research and Innovation in action. It is a story of an institute and its researchers learning from controversy and from their own experiments in public dialogue. 

In the summer of 2012, a group of protesters assembled at Rothamsted Research, north of London, for what turned out to be an extremely polite protest. Earlier that year, a group had threatened to destroy a young crop of experimental wheat, genetically modified with the aim of repelling aphids. Rothamsted had, to their credit, seen the planting of this crop not as an experiment to be hidden from public view (they are in any case legally bound to reveal its location and timing), but rather as an opportunity to open up a dialogue about the pros and cons of genetic modification as a tool to help improve food security.

Rothamsted were keen to begin a formal public dialogue process focussed on the trial. But, as I and others argued at the time, to do so would have been disingenuous given that there was, understandably, no intention of changing direction or pulling up the crop. Instead, Rothamsted, in conversation with Sciencewise, sensibly chose a strategic, forward-looking dialogue exercise centred on the building of some principles for engagement with industry (a key condensation point for public concerns about GM crops). The exercise and its report informed a new approach, giving the institution and its researchers new confidence in public discussions about contentious agricultural research.

At the same time, Rothamsted found a new use for their high-security test field (the original GM wheat trial cost less than a million pounds, but security around the experiment cost more than two million). In the spring of 2014, a crop of Camelina, modified to produce omega-3 fatty acids for nutritional purposes, was planted. As with the GM wheat trial two summers earlier, approval had been granted by the UK’s Department for Environment, Food and Rural Affairs, but this time the institute had decided to go public before the crop was in the ground. Rothamsted launched a public consultation and invited various stakeholders, including local organic farmers and beekeepers, to meetings as soon as the application to run the trial was made. A concern from the beekeepers about the possible spread of pollen meant that Rothamsted went beyond the demands of regulators and put a net over the crop during its pollination.

This trial itself took place in the glare of public scrutiny. The BBC filmed the sowing, the flowering and the harvesting of the plant. This time around, however, there was relatively little antagonism. In summer this year, Rothamsted published the results of the wheat trial. As with many experiments, the results were negative. The crop didn’t repel its pests as hoped. Unusually for a negative result, the paper received huge coverage and allowed Rothamsted to communicate the message that not only were they doing genuine frontier research, but also that they were doing so in public, in the open.


New Paper: Geoengineering as collective experimentation

I’ve just published a paper in the journal Science and Engineering Ethics which summarises one of the ideas in the book – technology as a social experiment – and develops it to discuss how we might think about the politics of conducting experiments in controversial areas of science. The paper began life at a fascinating conference hosted by Ibo van de Poel, a philosopher at Delft running a large project looking at a range of technologies-as-experiments.

The paper is Open Access and it’s available here.


Geoengineering is defined as the ‘deliberate and large-scale intervention in the Earth’s climatic system with the aim of reducing global warming’. The technological proposals for doing this are highly speculative. Research is at an early stage, but there is a strong consensus that technologies would, if realisable, have profound and surprising ramifications. Geoengineering would seem to be an archetype of technology as social experiment, blurring lines that separate research from deployment and scientific knowledge from technological artefacts. Looking into the experimental systems of geoengineering, we can see the negotiation of what is known and unknown. The paper argues that, in renegotiating such systems, we can approach a new mode of governance—collective experimentation. This has important ramifications not just for how we imagine future geoengineering technologies, but also for how we govern geoengineering experiments currently under discussion.


Why so quiet?

This blog has entered a fallow period. I have migrated my blog writing to two other places. The first is the Guardian Political Science blog, which has been going better than I and the other editors feared, given how niche debates about science policy can be.

The second place is the blog of my new book, Experiment Earth. This book organises some of the ideas I have drafted here and goes deeper into the case study of geoengineering. I would, of course, love to know what readers think.

I may resuscitate this blog in due course but, for now, don’t expect much…


Responsible Research and Innovation in action

As policy interest in Responsible Research and Innovation grows, those who are new to the discussion rightly ask what it might mean in practice. How do we know it when we see it? What does irresponsible research and innovation look like? My response is perhaps a bit unsatisfying. RRI is a work-in-progress, as are science, politics and society more broadly. This means that RRI is necessarily experimental and open-ended. Responsible science and responsible technologies will not just show themselves. But we can look at experiments that are taking place at various levels in various places, to see how the rules of research and innovation may be rewritten in more responsible ways. Here is my starting list of 13 RRI things I have found interesting. This is not to say that all of these things are unequivocally good. An important part of RRI is the surfacing of differences and political clashes around all of these things. But they seem to me to be interesting developments.

  1. CAMBIA – Open source biotechnology
  2. The Bermuda Principles of the Human Genome Project
  3. Jonas Salk refusing to patent the Polio Vaccine
  4. Joseph Rotblat leaving the Manhattan Project and starting Pugwash
  5. The MHRA’s Yellow Card Scheme – now opened up to members of the public as what Sheila Jasanoff would call a ‘technology of humility’  
  6. Berkeley Earth – an attempt to address issues in climate science
  7. The UK Biobank Ethics and Governance Council
  8. The Sciencewise Expert Resource Centre – running public dialogue for the UK Government
  9. CSynBi and Flowers – multidisciplinary Synthetic Biology Research
  10. EPSRC’s framework for responsible innovation
  11. GSK’s involvement in Patent Pools for neglected diseases  and open innovation
  12. The Alzheimer’s Society QRD network – involving carers and patients in research funding and management
  13. The SPICE project – an early set of technical and social experiments in the world of geoengineering research (see my, ahem, book)

This list is inspired by a project in which I and colleagues at UCL are involved, called RRI TOOLS. It aims to develop a toolkit for responsible research and innovation that can be taken up by scientists, policymakers, companies and others. The inevitable imperfection with such a project is that it will tend to emphasise processes rather than outcomes. This is why, as with the SPICE project, the Polio Vaccine, Pugwash and the Bermuda Principles, we should also pay attention to situations in which people have responsibility thrust upon them. Systems of research and innovation are as likely to be responsibly shaped by accidents as by intentional efforts to increase public engagement and force disciplines to work together.

If you are reading this and have suggestions for more, please add them. I hope this unscientific sample also prompts questions about the criteria for selection, beyond my main one, which is ‘interestingness’.


Governing emerging technologies 2013 Blog Award Winners

(This post is reprinted from the Guardian Political Science blog)

Every year, I teach a course for UCL undergraduates on Governing Emerging Technologies. Students from our department – Science and Technology Studies – join students from science degrees around UCL to think about technologies before they are set in stone. The course is an exercise in navigating uncertainty. There are few definitive statements to rely upon, and I ask students to be sceptical of claims that scientists, inventors, ethicists, policymakers or anyone else make about the future. As well as doing the usual essays, I also get them to blog about whatever aspects of emerging technologies they like.

Some of the results were brilliant, blending difficult sociological ideas with cutting edge science and first-class writing. As universities go quiet for August, I thought now would be a good time to highlight, and link to, my three favourite examples.

First up, Brandon Gleken, who was visiting for a term from the University of Pennsylvania. Brandon began with an interest in venture capital and innovative start-ups. Over the term, he developed a critical angle and put some politics back into a debate that is often breathlessly enthusiastic. His post about “solipsistic startups” is a great case of using one strong idea to carry some important and difficult messages.

Secondly, Rosie Walters. Her blog did a brilliant job of retelling and updating some stories that are often repeated in STS. In particular, her take on feminism and technology, looking at the washing machine, is a far better introduction to that debate than you would find in most dry academic texts.

Finally, Philipp Boeing, who is already involved in the young science of synthetic biology, and came to the Governing Emerging Technologies course from a computer science degree. His blog is autobiographical, including reflections on social and ethical questions as part of his journey towards scientific research. His post on scientific and artistic freedom is an honest account of a perennial tension that a lot of practicing scientists feel.

These students have agreed for me to point people to their work. But, if you visit their blogs, remember that they are not experienced bloggers. They are blogging as part of a course requirement. Their work deserves a wider audience, and it deserves praise. Encouraging comments only, please.


Letter to Nature

(Led by Sam Evans from Berkeley, a few of us in the Science and Technology Studies community have written a letter to Nature in response to a recent comment piece on Synthetic Biology. As the paper is paywalled, I have pasted a version of it here). 


Synthetic biology: missing the point

Volker ter Meulen warns that if environmental groups and others exaggerate the risks of synthetic biology it could promote over-regulation, which he says happened for genetically modified organisms (See here). But the point of supporting synthetic biology is not about making sure that science can go wherever it wants: it is about making the type of society people want to live in.

In the United States, for example, the rapid and uncritical introduction of genetically modified organisms prevented debate on issues such as alternative innovation pathways, and the impact on biodiversity and pest resistance. Many believe that these issues would have been better addressed through earlier and broader public discussion of the uncertainties surrounding transgenic organisms (see, for example, S. Jasanoff, Designs on Nature; Princeton Univ. Press, 2005).

In our view, ter Meulen trivializes the role of social scientists in suggesting that they could help the synthetic-biology debate by finding better ways to communicate what scientists think. He also implies that public concern over such technologies and their governance reflects only a failure to understand the science of risk assessment — but this ‘deficit model’ of public concerns has long been discredited (see A. Irwin and B. Wynne, Misunderstanding Science? Cambridge Univ. Press, 1996).

It is not unknown for scientists themselves to foster exaggeration and uncritical acceptance of claims, or to focus on anticipated benefits rather than on risks. This practice may be at the heart of wider public concerns about responsible innovation (see the report of the Synthetic Biology dialogue (pdf), for instance).


Sam Weiss Evans University of California, Berkeley, USA.
Sheila Jasanoff Harvard Kennedy School, Cambridge, Massachusetts, USA.
Jane Calvert University of Edinburgh, UK.
Jason Delborne North Carolina State University, Raleigh, USA.
Robert Doubleday University of Cambridge, UK.
Emma Frow University of Edinburgh, UK.
Silvio Funtowicz University of Bergen, Norway.
Brian Green Santa Clara University, California, USA.
Dave H. Guston Arizona State University, Phoenix, USA.
Ben Hurlbut Arizona State University, Phoenix, USA.
Alan Irwin Copenhagen Business School, Denmark.
Pierre-Benoit Joly INRA, IFRIS, Paris, France.
Jennifer Kuzma North Carolina State University, Raleigh, USA.
Megan Palmer Stanford University, California, USA.
Margaret Race SETI Institute, Mountain View, California, USA.
Jack Stilgoe University College London, UK.
Andy Stirling University of Sussex, UK.
James Wilsdon University of Sussex, UK.
David Winickoff University of California, Berkeley, USA.
Brian Wynne Lancaster University, UK.
Laurie Zoloth Northwestern University, Evanston, Illinois, USA.


UK-Brazil workshop on responsible innovation, 19-21 March 2014

I and some colleagues (Phil Macnaghten from Durham/Unicamp and Brian Wynne from Lancaster) have been given a grant by the British Council and Fapesp to run a workshop on ‘Responsible Innovation and the Governance of Socially Controversial Technologies’ in Brazil on 19-21 March 2014.

If you are based in the UK or Brazil, are an early-career researcher (less than 10 years since PhD), have something interesting to say about responsibility and technology, and fancy a trip to Brazil, email me for an application form.

But you’ll have to be quick. The deadline is 6th December.

Here’s a recent paper that explains some of our thinking. And there’s more info on the workshop below…



Governing Emerging Technologies, Autumn 2012 blog winners

I’ve been meaning to do this for a while. Last term I taught a course called ‘Governing emerging technologies’ for UCL 3rd year undergraduates. It had 24 students, half from my own department, Science and Technology Studies, and half from other parts of UCL. As well as the usual essay, I asked them all to put together a course blog, in which they would explore issues to do with emerging technologies. We talked about case studies and literatures in class, but the idea with the blog was that the students would dig into their own examples. I asked the creators of the best ones if they would agree to have theirs aired publicly. In no particular order… 

  1. Beilinda Li’s blog – Beilinda stylishly and brilliantly discusses issues such as transhumanism (from scientific, social scientific and artistic viewpoints), the demise of technologies and what that tells us about innovation, and the Unabomber’s trouble with technological optimism.
  2. Kane Shenton’s blog – Kane tackles, first, the implications for education of advances in computation; second, the unintended consequences of innovation in financial markets; and third, the debate about ‘technological unemployment’. All of these, as well as being academically fascinating, are also cutting-edge policy debates.
  3. Bella Eacott’s blog – Bella’s focus is more on the ideas that might inform better governance of technology. She looks at the trouble with technological fixes, from artificial hearts to geoengineering; screening and over-diagnosis; and technological hype in biomedical research. 

Huge congratulations to them all. Given that this course was brand new, I had no idea what to expect. But I was delighted by these three blogs. Needless to say, there were other highly-commended ones elsewhere in the class.



A year (and a bit) in responsible innovation

I know it’s too late for one of those retrospective/prospective new year pieces, but here’s mine, prompted by Andrew Maynard’s recent mention of Responsible Innovation. (Apologies that this appears solipsistic. It is as much a diary entry as a blog post).

I’ve spent the last 18 months working on the idea of Responsible Innovation – what it might mean, where it might come from, how to know it when we see it and how to put it into practice. There is a tendency for researchers in any area to see their topic growing in importance as they give it more of their attention. But I am convinced that, over the last year or so, the small world of research policy has started talking about Responsible Innovation. The capital letters are important. As one of the people implicated in developing and selling the (capitalised) idea of Responsible Innovation, this excites and troubles me. I want my research to be used. I want my ideas to travel. But I wonder whether people are using a TED-talk version of the idea. I wonder whether, when people move the idea of Responsible Innovation into their world, they leave behind much of its necessary conceptual baggage and instead just use the banner. After all, who’s in favour of irresponsible stagnation?

I began in the summer of 2011, hired as a senior research fellow to work with the wonderful duo of Richard Owen at Exeter (a professor of Responsible Innovation, no less) and Phil Macnaghten at Durham, with whom I had worked on nanotechnology for a few years. This was initially a holiday from my day job at the Royal Society, which I reflected on (again, solipsistically) in this paper. They generously kept my job open for me, but I got too attached to the freedoms of my academic holiday. So I never went back. I immediately spent a month on a holiday from my holiday as a visiting fellow at the Edinburgh Genomics Network, where I gave this talk, among other things, looking at the links between Responsible Innovation and my previous work on upstream engagement.

The project that Richard and Phil had been funded for was a short six-month burst (which stretched to nine months, then fifteen) to develop a framework for Responsible Innovation that the Research Councils could use in their decision-making. It had been prompted by the EPSRC’s involvement in a public dialogue exercise on Synthetic Biology. Rather than just shelve the report, EPSRC had admitted openly that, in the light of public concerns about the direction of Syn Bio, they needed to consider their own responsibilities as funders. Syn Bio had already been a sort of Responsible Innovation test case. The Centre for Synthetic Biology and Innovation had been set up as an experimental collaboration between scientists, engineers and social scientists. And while opinions differed about exactly why it was happening, there was a shared interest among all of the participants in ideas of responsible development, whatever that might mean, and a shared willingness to engage constructively, rather than follow their US equivalent’s journey into acrimony.

I began talking to Syn Bio researchers and others funded by EPSRC who were all interested in making sense of responsible innovation in their own domains: roboticists like Alan Winfield, who is rewriting the laws of robotics to reflect them back on researchers themselves; the FRRIICT team, who are looking at responsible innovation in ICTs, bringing together socially-minded technologists with technologically-minded others; and Richard Jones at Sheffield, a rare Fellow of the Royal Society who is given to quoting sociologists of science and occasionally delivering brilliant talks on responsibility and nanotechnology.

In May 2011, I chaired a session at this conference, at which the European Commission announced their intention to redirect their science and society efforts to something called ‘responsible research and innovation’. The concept had been elucidated by René von Schomberg and quickly moved into policy reality. For many of us who had been involved in arguments for public engagement with science for years without much policy purchase, the conference seemed to provide a clarity of purpose. I was asked to join a group at the Commission to develop a policy for Responsible Research and Innovation. We’ve done our bit. Let’s see how that goes.

(The book that began with conversations at this conference, containing thoughts from René, Arie Rip, us and others, will be released in 2013).

In January 2012, courtesy of the UK Foreign Office, we visited colleagues from Arizona State University at their DC office. ASU people such as Dave Guston, Dan Sarewitz, Erik Fisher and Jamey Wetmore have been thinking about responsible innovation in a US context for almost a decade. And as part of a massive National Science Foundation Center for Nanotechnology in Society, they have been experimenting with making it happen by injecting social scientists into research labs, holding deliberative exercises and more.

In April 2012, the Danish Government took up the term for the conference on science and dialogue that they set up as part of their EU presidency. I was asked to be the rapporteur. Here’s the report. I was delighted to hear the Danish Science Minister talk about the need to move from creating the ‘best science in the world’ to the ‘best science for the world’. It will be a while before a UK Science Minister is brave enough to agree.

At around the same time, a particular issue arrived at the door of another Research Council. Anti-GM protesters had threatened to tear up an imminent trial of GM wheat at Rothamsted Research, a large BBSRC facility in Hertfordshire. I was asked to advise them on whether a public dialogue exercise would be a good idea. My conclusion was that it would have been disingenuous. In the end, the protest was a damp squib, but questions of responsibility in innovation remained in the air (for longer, we were assured, than GM wheat pollen).

Another major science governance story of late 2011/2012 became the centrepiece for our Responsible Innovation thinking.  The SPICE project – one of the world’s first big research projects in the controversial area of geoengineering – became a test case for Responsible Innovation. The SPICE team have been not just receptive but hugely proactive in helping me work out what a responsible approach to geoengineering research might look like. And the project has become a really important example of why even apparently harmless research projects can raise deep questions about ethics and responsibility. This has no doubt helped our work to get some sort of traction within the Research Councils. The report that we wrote for EPSRC, which we will publish soon, was taken to EPSRC council and will hopefully manifest in some important procedural changes there.

Looking ahead to 2013, I have a new job as a lecturer at UCL. I’ve always thought that you don’t really understand something until you’ve taught it, so I’m seeing whether Responsible Innovation makes sense to students. We’re one of the only places in the country that teaches undergraduate Science and Technology Studies, and we’re creating new MSc programmes from September. Exciting times.

I’m delighted to say that ESRC have agreed to fund more work with SPICE, so I’ll continue to work on geoengineering. Meanwhile, Responsible Innovation is flourishing elsewhere too. The European projects funded from the first wave of proposals will be gearing up. The Centre for Synthetic Biology and Innovation has become the even larger Flowers consortium, and the social scientists have been asked to “embed the principles of responsible innovation in translating the research into impact”. Good luck to us all. 
