“This collaboration gives Wellcome the information it needs to push forward on its commitment to more robustly report research outcomes”

David Carr, Programme Manager for Open Research at Wellcome

Over the last 10-15 years, many funding agencies in both the federal and private sectors have adopted public access and data sharing policies for funded research. Funders worldwide understand the importance of broad public access to research results as a mechanism to advance scientific knowledge, increase research transparency, and support research integrity. While the policies themselves are strong, the infrastructure needed to check compliance and track research outputs requires significant investment in people and/or technology development. As a result, the impact these policies have on the frequency of open access and output sharing is not easily determined.

Dimensions Research Integrity can provide funders with a report on how frequently well-established reporting guidelines are followed in published articles and research outputs. In September 2020, Ripeta, the algorithm and data source behind Dimensions Research Integrity, partnered with Wellcome on a retrospective analysis of changes in public access and data sharing between 2016 and 2019. David Carr, Programme Manager for Open Research at Wellcome, shared his feedback on the usefulness of the collaboration.

Maximizing the value of research outputs through openness

Wellcome is committed to promoting rigour and integrity in the research that it supports. To that end, making the data and software underlying research findings accessible for scrutiny and replication is fundamental.

As part of its policy on managing and sharing data, software and materials, Wellcome has an explicit requirement that the data, software and materials underlying publications should be accessible to other researchers. Its new Open Access policy, which came into force in 2021, requires that all funded publications include a statement describing how underlying data and code can be accessed.

To spearhead its work to promote openness in research outputs, Wellcome established a dedicated Open Research team to find ways of monitoring changes in practices and attitudes to openness over time.

Prior to the collaboration, the team had not been able to track the extent to which researchers were meeting requirements to provide access to the data and code underlying research findings. Now they have an opportunity to gain insight into this, and to look at whether the picture has changed over time.

“Specifically, given the increasing focus and momentum of open science over the last few years (including Wellcome’s own policies and activities), we were curious to see how far the situation was changing in practice.

The general message for us was that whilst there was evidence that the situation had improved (particularly in terms of data availability statements), there was still a very long way to go (still less than half of the papers we funded have such a statement).

The fact that the proportion of data availability statements indicating that data was deposited in a repository was relatively low, and unchanged at around 35%, was a really valuable insight.”

The team were also surprised to learn that the proportion of papers indicating software availability had increased but remained very low (8%). The relatively low proportion of papers identifying the analysis software used was also a useful finding, since this is a key requirement for reproducibility.

Carr explained: “I was surprised that inclusion of data and software availability hadn’t risen slightly higher, and that the proportion of data availability statements including repositories hadn’t risen at all. Also, I had assumed inclusion of the identity of analysis software would be standard good practice – that it’s absent in over half of papers was a bit of an eye-opener.”

The team are now able to conduct an annual progress stocktake and provide more detailed guidance on data and software availability statements, including good practice examples. This includes making clear that a generic “data available on request” statement is not good practice.

Build trust in research

Would you like to learn how Dimensions Research Integrity can support research integrity within your organization? Get in touch, and one of our experts will be happy to speak with you.