Taking the risk of getting it wrong in a pandemic

8 October 2020

By Rachel Tuffin (Director of Knowledge and Innovation at the College of Policing, the What Works Centre for Crime Reduction)

During the pandemic, key questions about liberty and responsibility have been contested internationally. How much should be left to individuals’ discretion to do the right thing? How much should the state do to control people’s behaviour? Social scientists can help by sharing the evidence, but may sometimes be cautious about drawing out the implications for decision-making.

Over the past six months at the College of Policing, the What Works Centre for Crime Reduction, we have been involved in discussions on how to implement new legislation as it was being developed, debating the best way to ensure compliance – just how much enforcement does there need to be? In a situation where you don’t have clear existing evidence on how a lockdown is best policed, how do you balance individual liberty with the need to protect the majority and prevent the spread of disease? Or to put it another way, how do you balance the right to life with the right to liberty and security?

In this context, we’ve been asked to produce guidance to very tight timescales and, as an evidence-based organisation, we’ve had a recurring issue to consider: how far should we take risks in interpretation and judgement? We’ve needed to balance the risk of causing harm by recommending action against the risk of failing to reduce harm because we were not prepared to give a clear steer.

The first evidence challenge, given the speed at which everyone needed to act, is obvious: there was no time for new research or even thorough reviews. The usual timetables for the development of policy and legislation for policing and crime were blown away. But when there is no obvious specific pre-existing evidence, perhaps we have to be pragmatic, asking what can be taken from the existing literature, robust theoretical frameworks, and evidence in similar contexts.

With a clear end point in mind – keeping people safe and reducing demand on the NHS by ensuring people would voluntarily go along with the lockdown – the evidence on compliance with the law was the obvious place to look. Some of the most compelling work is based on a theoretical framework known as procedural justice, developed by social psychologist Tom Tyler, and adopted and tested by criminologists in the UK, US and Australia.

Procedural justice suggests that if the law is applied fairly and people are treated with respect, they are more likely to comply in future because they see the exercise of the law as legitimate, no matter what the outcome of their personal encounter. A broad underpinning principle like this had the potential to make guidance helpful, even when there was no time to explore the potential for ambiguity and error in interpretation of the legislation, which is often where guidance is most useful.

This framework had been applied in previous research on legal compliance, with perhaps the most closely applicable context being police stops. When people were told why they were being stopped, given a right of reply, and treated with respect, they reported being more likely to comply with the law in future. This finding has also been borne out in other policing research – overall, people were more likely to be satisfied with a police encounter if a procedural justice approach was followed.

Procedural justice findings certainly seemed applicable to the COVID-19 situation, and their relevance was supported by rapid social science research on compliance by the LSE and UCL, based on online panel surveys carried out within a few weeks of the start of lockdown.

The other evidence challenge was accessibility. What we know can be detailed and nuanced, but in policing, as in many other operational contexts, if guidance is to hit the spot it needs to be succinct and as clear as possible. Working closely with operational staff is key to getting the material into the right format.

In this case, operational police came up with an easy-to-remember summary, the 4Es – engage, explain, encourage, and only if necessary, enforce. This model clearly fitted with the evidence base on building legitimacy through procedural justice – giving people a voice, explaining the reasons for action, and encouraging them to comply should mean they would be more likely to see police action as legitimate, follow police directions without the need for enforcement, and comply in future.

Overall, the use of this procedural justice-based ‘4Es’ approach to underpin the guidance was well received by commentators. We carried out cohort surveys to track how the guidance was being received and used by officers, as well as checking on wellbeing issues. Data analysis is still ongoing to track variations in practice across forces and to understand the extent of enforcement against people from different backgrounds.

Some of this experience has been uncomfortable. Our What Works toolkit was painstakingly developed, and our standard evidence-based guidance processes, including evidence reviews, can take over 12 months and include detailed stakeholder engagement and formal public consultation. This new role meant interpreting legislation in the context of existing evidence and practice within hours and days, and therefore taking risks by cutting the time we had to consult on where guidance would help most and to check how it was understood.

Without the time to reach consensus on content, and with very different contexts across counties and countries, there were instances where some felt we went too far and others felt we had not gone far enough. There were unavoidable ambiguities and errors in how the legislation was interpreted. We used these experiences to develop and agree principles to narrow down when and how we would issue guidance in future, for example by sticking as closely as possible to the operational need, the law and the evidence base.

All of our usual painstaking effort and care is of course in vain if evidence doesn’t get used in practice. In this case, the urgent need meant that every new piece of work was being snatched out of our hands, in a digital sense, as soon as it was finished. We could be confident it was getting used, despite one or two highs and lows in how it was received.

We’ve seen many examples of scientists sticking their necks out during the COVID-19 crisis, being prepared to suggest a course of action alongside the risks involved. In doing so, they have not confined themselves to statements mired in caveats. Those of us working in the social sciences need to stay willing to share judgements based on our interpretation of what will always be incomplete evidence. If we don’t, interpretation is left wholly to policy and practice decision-makers, who may be less likely to engage with the evidence if its implications and resulting risks are not clear. When scientists don’t bear some of the risk in this way, the evidence may simply be left to one side when the decision is made whether or not to take action.

Even when it might seem as if we have no directly relevant evidence, we can find ways to apply principles and evidence from what we do know to new contexts. If we work with policy and operational colleagues to make sure we all understand the implications, we can take risks and communicate those risks in our advice, ensuring that the available evidence is taken into account.

Rachel Tuffin OBE is Director of Knowledge and Innovation at the College of Policing. The College of Policing is the What Works Centre for Crime Reduction, established in 2013. The College collates and shares research evidence on crime reduction and supports its use in practice. It is part of the national network of What Works Centres created to provide robust and comprehensive evidence to guide decision-making on public spending.

The perspectives expressed in these commentary pieces represent the independent views of the authors, and as such they do not represent the views of the Academy or its Campaign for Social Science.

This article may be republished provided you place the following statement and link at the top of the article: This article was originally commissioned and published by the Campaign for Social Science as part of its COVID-19 programme.