As biotechnology advances, the risk of accidental or deliberate misuse of biological research, such as viral engineering, is increasing. At the same time, “open science” practices like the public sharing of research data and protocols are becoming widespread. An article in the open access journal PLOS Biology by James Smith and Jonas Sandbrink at the University of Oxford, UK, examines how open science practices interact with the risks of misuse and proposes solutions to the problems identified.
The authors grapple with a critically important issue that emerged with the advent of nuclear physics: how the scientific community should react when two values – security and transparency – are in conflict. They argue that, in the context of viral engineering, open code, data, and materials may increase the risk that enhanced pathogens are released. Openly available machine learning models, for example, could reduce the time needed in the laboratory and make pathogen engineering easier.
To mitigate such catastrophic misuse, mechanisms that ensure responsible access to dangerous research materials need to be explored. In particular, controlling access to software and data may be necessary to prevent the misuse of computational tools.
Preprints, which became widely used during the pandemic, make it difficult to prevent the spread of risky information at the publication stage. In response, the authors argue that oversight needs to take place earlier in the research lifecycle. Lastly, Smith and Sandbrink highlight that research preregistration, a practice promoted by the open science community to increase research quality, may offer an opportunity to review and mitigate research risks.
“In the face of increasingly accessible methods for the creation of possible pandemic pathogens, the scientific community needs to take steps to mitigate catastrophic misuse,” say Smith and Sandbrink. “Risk mitigation measures need to be fused into practices developed to ensure open, high-quality, and reproducible scientific research. To make progress on this important issue, open science and biosecurity experts need to work together to develop mechanisms to ensure responsible research with maximal societal benefit.”
The authors propose several such mechanisms and hope that their work will spur innovation in this critically important yet neglected area. They show that science cannot simply be open or closed: there are intermediate states that need to be explored, and difficult trade-offs touching on core scientific values may be needed. “In contrast to the strong narrative towards open science that has emerged in recent years, maximizing societal benefit of scientific work may sometimes mean preventing, rather than encouraging, its spread,” they conclude.