Exploring aspects of Research Culture: The Open Research Agenda

I've been asked to join a panel discussion about Open Research at the University of Bath, part of a week commencing June 20th to celebrate and explore aspects of Research Culture at the University. Ahead of that, here is an overview of my views on the open research agenda and what it means for research. This is roughly what I will say:

* * *

Open and transparent practices benefit individual researchers, groups and institutions. This is beyond doubt. While there is no one-size-fits-all approach, any attempt to improve the accessibility of scientific papers, data, code or materials is a step in the right direction.

Even small changes appear to have significant positive impacts in the long term. For example, simply putting papers on pre-print servers appears to reduce the likelihood of retraction. Open research practices, including the pre-registration of analysis plans, can also act as a counterweight to the ways in which the scientific enterprise, in its current form, can allow a culture of misinformation to take hold.

Despite these obvious benefits, I think there are three related challenges that prevent academics from fully capitalising on the open science agenda.

Privacy vs Open Science

Although practices like sharing data are intended to improve the quality and ethical conduct of research, they are often in direct conflict with the need to preserve people’s privacy. For example, collecting large volumes of social media data is usually done without obtaining consent. 

The social sciences are also having to think carefully about issues of responsible innovation as many new methods can be weaponised. Smartphone applications designed to track and trace the spread of COVID-19 have, for instance, incited fears over the potential for mass surveillance, and many people have questioned how these data will be safeguarded against misuse or hacking, and what these applications could mean for human rights and freedom of movement.

Researchers therefore have to remain mindful when collecting, analysing, and sharing digital data in an ethically appropriate manner, as each of these activities generates opportunities for misuse.

Beyond anonymisation, other solutions involve generating synthetic data that mimic real data sets by preserving their statistical properties (e.g., the relationships between variables). This avoids the risk of accidental disclosure because no record in the synthetic data set represents a real person. Differential privacy is another computational approach: it describes patterns across groups within a data set while withholding information about specific individuals.
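To make the differential privacy idea concrete, here is a minimal sketch in Python (standard library only) of the classic Laplace mechanism: a count query over a data set is released with calibrated noise, so the presence or absence of any single individual has a provably bounded effect on the output. The function names, the anxiety-score data, and the epsilon value are illustrative assumptions, not taken from any particular study or toolkit.

```python
import math
import random

def laplace_noise(scale: float) -> float:
    """Sample from a Laplace(0, scale) distribution via inverse-CDF sampling."""
    u = random.random() - 0.5  # uniform on [-0.5, 0.5)
    return -scale * math.copysign(1.0, u) * math.log(1.0 - 2.0 * abs(u))

def private_count(records, predicate, epsilon: float = 0.5) -> float:
    """Release a count with epsilon-differential privacy.

    A counting query has sensitivity 1 (adding or removing one person
    changes the true count by at most 1), so adding Laplace(1/epsilon)
    noise is enough to satisfy epsilon-differential privacy.
    """
    true_count = sum(1 for r in records if predicate(r))
    return true_count + laplace_noise(1.0 / epsilon)

# Hypothetical data: anxiety scores from seven participants.
participants = [{"anxiety": s} for s in (12, 45, 33, 60, 58, 21, 49)]

# "How many participants scored above 40?" is released with noise, so no
# single participant's inclusion can be confidently inferred from the answer.
print(private_count(participants, lambda p: p["anxiety"] > 40, epsilon=0.5))
```

Smaller epsilon values mean more noise and stronger privacy; in practice, an analyst also has to budget epsilon across all the queries they plan to release.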

Interdisciplinary Disparities

Disciplinary differences make it difficult to reach universal agreement on what level of transparency is acceptable.

Put simply, some disciplines are further along the open research journey than others. Psychology, for example, has made significant improvements following a well-documented replication crisis. This has now reached the stage where several journals, including The Journal of Social Psychology and Behavior Research Methods, have mandated open science practices. Dozens of other journals emphasise the importance of such practices as part of any submission, but progress regarding data sharing remains slow even within psychology.

However, this remarkable document from the UK Reproducibility Network showcases publications from multiple disciplines where scholars are thinking deeply about the open research agenda or providing resources to support related activities. It is easier than ever to engage with open and transparent practices.

Indeed, it is worth noting that editors and funders have had more of an impact than institutions in terms of what is expected of researchers. That's not to say institutions have not fed into some of these decisions, but this is where most of the challenges for researchers now lie when it comes to ensuring that all science is open science.

Institutional Expectations

As with interdisciplinary research, open research practices can often come at a cost for individuals. For example, while funders and institutions believe that interdisciplinary research is important for tackling complex problems, high-performing interdisciplinary researchers tend to be punished. They are often seen as challenging the distinctiveness of particular scientific fields. 

The same can also be true for those who threaten a model of reward that values the academic paper and little else. While most academics on Twitter are very much in favour of open science, a lack of awareness elsewhere should not come as a surprise given the hierarchical nature of academia.

Several universities have joined the UK Reproducibility Network, suggesting a strong commitment to open research. However, even with this commitment, reforms to improve reproducibility and quality must be coordinated across the research ecosystem.

For example, any commitment to open research should map onto hiring and promotion criteria. While engaging with open practices naturally feeds into other areas that benefit an individual's research (e.g., citations and impact), this might also involve looking at:

  1. How many publications can be read by members of the public?
  2. How many data sets are accessible to other researchers?
  3. Should researchers who are contributing to the development of open source software or tools that have already been published be rewarded for their efforts? 
  4. Should a reduction in the number of outputs be expected where open science practices have been adhered to (e.g., pre-registration)? While such practices generate higher-quality work, they almost always extend the duration of the research cycle. Combine this with work that cuts across disciplines and the resulting outputs will take even longer to materialise.
  5. Is engaging with or leading larger open collaborations acknowledged or encouraged (e.g., the Psychological Science Accelerator)?

Beyond hiring and promotion, what might adequate support for these activities look like? Any commitment will have to be matched by related institutional activities that reduce misinformation and highlight the importance of transparency and integrity.

For example, press releases might explicitly flag where the paper, data, materials, and code are freely available. If any of these are not available, is a press release still advisable given an institutional commitment to an open research agenda?

Most of the work that challenges broken research cultures or drives open research practices has come from early-career academics who have since established themselves as leaders. Of course, support has sometimes come from higher up - events like this would not be happening otherwise - but there is currently little incentive for individuals in many institutions to engage with these activities, which will often (rightly) slow down the publication machine.

Summary

The push toward open science continues to gather momentum. The UK government has just published evidence following a consultation on transparent research practices, and the argument for open research cultures appears to have been won.

But progress can be fragile. Many journals, for example, appear to have zero interest in fostering increased transparency as part of the research process.

With few exceptions, institutions have been sluggish to respond. Researchers may struggle to meet the expectations of journals, funders and institutions. I suspect this is reflected in the number of people leaving academia: open research becomes yet another set of barriers to navigate in order to demonstrate individual over collective 'value'.

Institutions remain fixated on the former. 

That said, I remain optimistic. Bath, for example, is full of academics who are developing strands of work that speak both to general issues of research transparency and to their own individual areas of research.

Perhaps they all need to shout a bit louder.
