
Open up evidence reviews

A technique from design gives communities more say in research, write Tania Carregha and Samanthi Theminimulle

Social research is a well-guarded ivory tower. Access is largely restricted to those from certain backgrounds, who have had certain opportunities. But this is changing, thanks to participatory methods.

Such methods aim to more equitably include the people otherwise subject to research. They have made social science more accessible to, and more focused on, people and communities that have previously been excluded from its world. 

But while more researchers are turning their attention to methods that put participation and equity in the foreground, one type of study central to the discipline, the evidence review, has largely remained business as usual.

Evidence reviews investigate and summarise what we know about a subject, what we do not know and what is difficult to establish. Their results inform outcomes such as programme design and policymaking, as well as the design of primary research. The UK-wide What Works network—which has 14 centres working to ensure policies in areas including health, housing, education and policing are based on the best evidence—is an example of how influential the world of evidence can be.

However, the robustness of evidence reviews is often founded on a narrow idea of what counts as evidence. Researchers typically set the criteria for what is included, introducing bias. The work also takes a lot of time to complete, and paywalls to academic content can be a barrier. 

All this makes evidence reviews inaccessible to non-professionals and often means their results reproduce existing inequalities in knowledge. By reviewing and responding to existing evidence, usually produced by a limited few, evidence reviews create a self-reinforcing cycle of knowledge production that speaks to what already exists, rather than to what people and communities themselves think is important.

Diamond in the rough

What would a more participatory approach to evidence reviews look like? How might it deliver a more open and equitable process and outcomes, while acknowledging that not everyone has the time or resources to take part in an evidence review? 

To try to answer these questions, the Institute for Community Studies (ICS) has borrowed an approach first devised by the UK Design Council to guide the design and innovation process, known as the double diamond. In this, we alternate an exploratory process, working with communities to understand an issue widely and deeply, with a process that creates space for communities to define the challenge and shape the outcomes of the research so that they are most useful to them.

The ICS is putting this forward as a useful way to articulate and structure a more participatory approach to evidence reviews across a range of complex issues. It is also a method for overcoming communication barriers in settings with high social complexity.

The ICS is using this approach in its Civic Journey programme, a two-year research and social action project exploring young people’s transition from adolescence to active citizenship. This was originally planned as a typical review, but engagement with more than 1,800 young people in the early phase of the programme challenged the researchers to think differently. 

The reformulated evidence review, which is now a key strand of the project, will pilot our participatory approach, demonstrating the transformative potential of research methods focused on young people. We hope the findings and lessons from this pilot will inspire other practitioners to make social research processes more permeable and community-centred.

Widening evidence

Layering the process of an evidence review onto the double diamond makes it more accessible to participants from outside research, and allows those most affected by projects to guide them. It helps to identify and prioritise what people and communities see as important, without placing an undue burden on them to engage with a lengthy process. What counts as evidence can be viewed more expansively, and the review process reaches a wider range of people, including those who haven’t engaged with formal evidence in the past. 

Participatory approaches also prompt researchers to take a more critical view of existing evidence, helping to reveal what’s working and what’s not from practice and experience, as well as the more usual data and evidence. This should lead to outcomes that are more meaningful and useful. It has the potential to bridge the gap between knowledge, discourse and communities, bringing evidence closer to the people it affects. 

Tania Carregha and Samanthi Theminimulle are at the Institute for Community Studies

This article also appeared in Research Fortnight