Portrait of Félix Fénéon by Paul Signac, 1890, Museum of Modern Art, New York City

“Systems thinking, interdisciplinary working, uncertainty… that’s our life!” was how Clare Moriarty, Permanent Secretary at Defra, summarised the complex context for using science in policymaking during a two-day conference on the subject hosted by the Royal Society.

One response to this complexity is good evidence synthesis, with many government speakers calling for scientific advice that draws on a wide range of evidence.

So what did the conference reveal about what policymakers want from scientists, and what were the implications for synthesising evidence in ways that meet policy needs?

  1. The role of evidence in policymaking
    One of the more challenging messages for a conference about the role of science in policy was the reminder that scientific evidence is just one of several considerations that politicians base policy decisions on. The others were summarised as ‘deliverability’ and the values, both personal and political, of whoever is making the decision. Deliverability was never precisely defined but related to whether a particular policy is affordable, enforceable, and proportionate to the problem it is addressing. Personal and political values are at the heart of deciding trade-offs, such as the extent to which environmental impact should be prioritised over immediate economic impact.
  2. The role of scientists in policymaking
    The discussion of values highlighted a particular challenge for scientific advisers – how to separate their own views from their analysis of the evidence. Throughout the conference, various standards for experts were suggested, with consistent themes of independence, impartiality and rigour. Rigour was summarised by Sir Mark Walport, Government Chief Scientific Adviser, as reflecting the totality of available evidence and acknowledging areas of uncertainty. Experts should not offer opinions on competing value sets. But reflecting the totality of evidence raises the question of what counts as evidence.
  3. Types of evidence
    Many of the policymakers were keen to incorporate a wider variety of perspectives and types of evidence in policymaking. This goes beyond the different perspectives offered by the natural and social sciences (interestingly categorised by Gemma Harper, Chief Social Scientist and Deputy Director for Marine Policy and Evidence at Defra, as “evidence” and “analysis” respectively), to include information on public attitudes, the experience of practitioners and even anecdotes. In a discussion on the future of land use, for example, Sarah Church, Director of Food and Farming at Defra, urged greater engagement with land managers. Similarly, Professor Corinna Hawkes, Director of the Centre for Food Policy and Co-Chair of the Global Nutrition Report, argued that public attitudes towards food should have a greater bearing on decisions on food policy.

Considering these three themes together, a number of questions for evidence synthesis emerge.

The first is whether evidence needs are the same at every stage of the policymaking process. The first step towards any policy is the rationale for why government action is needed. It’s conceivable that at this initial stage the case could be made on the basis of a single type of evidence – epidemiological evidence on the health effects of poor air quality, for example. Integrating other sources of evidence, such as the economic effects of banning diesel cars from city centres, would then become more important as the policy process moves from deciding that something should be done to agreeing what should be done.

This then raises the question of who should carry out this integration. There’s an apparent tension between asking experts to restrict their advice to their area of expertise whilst also asking for a greater range of perspectives and types of evidence to be included in evidence synthesis. One suggestion made at the conference was that research councils should play a greater role in delivering evidence synthesis by identifying gaps and funding work to fill them. Whilst this would help provide the means, it doesn’t resolve the questions of what skills are required to integrate evidence and analysis from a range of different disciplines, and how to weigh inputs developed using very different methodologies.

As part of the Royal Society’s policy work, we are beginning to think seriously about evidence synthesis, which should help to answer some of these questions. If you would like to be kept informed of this work, please contact Sarah Giles.

This is one in a short series of blog posts summarising the recent ‘Science for Defra: excellence in the application of evidence’ conference, held at the Royal Society on 29 and 30 March. For more conference outputs see the conference event page and keep your eye on In Verba for summary blogs of the other sessions by our Fellows.