Arma 2021: conference hears call for fixing research culture by making the system more open
Universities and researchers must publish their evaluation and research data openly if they are to improve the research culture within academia, the 2021 conference of the Association of Research Managers and Administrators has heard.
Meeting to discuss the role of evaluation practices in research culture, a panel of experts agreed that traditional metrics—determining individual and institutional value by the number of papers published, for example—did little to cultivate a healthy working environment.
A tendency to rely on these traditional metrics, the impact of Covid and increased pressure on researchers to perform well meant the sector had seen a lack of progress in recent months, warned Elizabeth Gadd, head of research operations at the University of Glasgow.
“It’s not a surprise that, as a sector, we have a significantly higher mental health problem than the general population, something only exacerbated by the pandemic and mitigating the differential impact of the pandemic in our evaluations,” she said.
Saskia Walcott, director of the higher education consultancy firm Walcott Communications, said that a greater emphasis on qualitative data by research councils was a positive step forward in driving fairness and diversity in evaluation. But when it came to how qualitative data should be used to measure and assess quality, there was “no simple answer”, she said.
From her own research on the subject, Sophie Collet, director of strategy, information and operations at the University of Bristol, said she had found “openness and transparency” to be “critical [in reducing] anxieties about the way information is being used” within academic workplaces. Taking an open approach to all aspects of research, including evaluation, would therefore help to promote “positive environments”, she added.
Reproducibility crisis and open access
On the same theme, Malcolm MacLeod, professor of neurology at the University of Edinburgh, raised the well-documented reproducibility crisis in research, a consequence of the pressure on researchers to produce headline results. In his view, the “crisis has grown because… research hasn’t been evaluated enough, because if it had been evaluated it would have been improved”, he argued.
If universities “can move towards measuring the quality of research processes, as much as the research itself… if we can evaluate that researchers have systems in place by which they ensure the quality of the research themselves”, then the culture will improve, he added.
One simple way of measuring this, he suggested, might be to monitor whether or not researchers make their work available through open-access publishing methods. “I know that measures of research output are things which can have many influences, but certainly measuring the proportion of an institution’s research outputs, which is available open access at the push of a button by anyone anywhere in the world, is a fair measure of what [they’re] interested in.”
Asked what one thing might improve research culture going forward, Gadd later suggested that evaluators should require “all outputs to be published on a national publication platform”, rather than in certain journals, which could “level the playing field”.
Research Professional News is the official media partner for Arma 2021.