The Department of Health and Human Services’ proposed rules would needlessly stifle essential U.S. research.
The Department of Health and Human Services (HHS) is currently considering proposed rules for scientific research that, if adopted in their current form, would severely handicap America’s economic competitiveness and its ability to prevent and treat infectious diseases. The proposed federal rules were suggested by civilian advisers to address certain types of research conducted on viruses and bacteria that cause human disease. Safety reviews of proposed research experiments are both essential and routine, and most scientists in U.S. laboratories would welcome a serious, thoughtful, comprehensive review of biological experiments involving potentially harmful microbes. The problem with the proposed rules, though, is that they are vague about which pathogens and what experiments should be targeted. This uncertainty is likely to cause scientists and their institutions to shun too wide an array of research out of an abundance of caution about running afoul of the new regulations. Some of the projects likely to be avoided could be crucial to making advances in biomanufacturing, engineered biology, environmental restoration, and even promising biotechnology applications like rare-earth mineral mining. In other words, the new rules threaten to throw the baby out with the bathwater.
Under the proposed rules, even vaccine development activities would require additional oversight beyond any necessary safety or scientific benchmark. This would add expense and slow the development of vaccines, without benefits to either security or safety. Researchers studying diseases like COVID-19 and influenza, whose work has already been evaluated for merit and safety, would be required to assess whether their work may be “reasonably anticipated” to result in high or moderate transmissibility or virulence in humans—qualities that usually cannot be predicted in the laboratory. The Government Accountability Office (GAO) has already flagged this language as too vague. The proposed framework would also apply to research on non-pathogens that may theoretically pose risks of misuse, but it is written so broadly that it would also target much of the bioengineering research and development that is essential to U.S. economic competitiveness.
It is imperative to take enhanced safety precautions when working to scientifically understand dangerous viruses like those that cause influenza. However, opposition to so-called gain-of-function (GOF) research—which is often characterized very broadly—has become a rallying cry for conspiracy theorists who believe that the novel coronavirus came from GOF research performed in a Chinese laboratory, despite scientific evidence that points to a natural emergence. The controversy over the pandemic’s origin has led to the introduction of laws banning GOF research in Florida and Wisconsin. Those bans put at risk groundbreaking influenza and coronavirus research performed in U.S. laboratories. Attempts to regulate GOF research are not straightforward, because GOF cannot be defined precisely. Without clear definitions of what work is and is not allowed, virologists and their research institutions cannot know which specific laboratory activities might be deemed impermissible GOF research requiring additional oversight, and they could step away from critical work that the United States needs. Modification of biological organisms useful for bioindustrial manufacturing, or engineering biology to develop new drugs and materials—areas where the United States is competing against other nations, particularly China, for technical dominance—could also be affected by such far-reaching bans and regulations.
The proposed HHS rules would isolate U.S.-based research internationally. Unsurprisingly, many nations see research on pathogens that cause human disease as a priority in the aftermath of the devastation caused by the coronavirus. Rather than leading the world to maintain and develop safe, secure practices and biosafety norms, which U.S. researchers are well positioned to do, these proposed rules would hamstring U.S. scientists, curtailing their ability to lead and influence safety practices at the international level. Other nations will certainly take the lead in understanding important characteristics of emerging pathogens that could lead to better surveillance and better vaccines. The international norms for research safety, as well as research priorities, will be set by them—not by U.S. scientists. This will have serious negative implications for the U.S. bioeconomy and U.S. influence on international biosafety norms.
The White House called for comments on the HHS advisory group’s proposed rules in September. The review should reach a clear conclusion: The White House should proceed with caution, and the proposed rules should be set aside. In their place, a more thoughtful and more serious process, with greater representation from virologists, should be initiated to review biosafety and research oversight and to provide standards and clarity for the people doing scientific work. That process would necessarily include a broader examination of the consequences of the proposed regulations, which is sorely lacking in the current proposal.
Ideally, oversight for virology research should target research activities that incur the most risk, allow beneficial work to continue safely, and not impose unnecessary burdens that either slow needed advances or encourage workarounds. There should be clear regulatory lines, so virologists know which research activities do or do not require additional oversight or reporting. Importantly, the development of the oversight requires the expertise of virologists. Donald Fredrickson, director of the National Institutes of Health during the debates in the 1970s about research on recombinant DNA (rDNA), wrote in his memoir that “[o]ne of the most important lessons to be learned about controversy over use of high technologies … is the absolute requirement for expert opinion.” Yet the expertise of virologists has not been adequately tapped to determine what oversight would be useful, effective, and durable.
The landmark 1975 Asilomar Conference, which produced an oversight framework that allowed beneficial rDNA work to be performed safely, should be emulated now to produce guidelines for research into viruses that pose a significant threat to humans, livestock, and plants. The guidance should be developed in collaboration with international virology researchers and be understandable and workable for virologists around the world. Involving people with expertise not only in virology but also in safety, security, bioethics, and regulatory oversight will result in an oversight framework for scientifically meritorious and critically important virology research.
Virology research has produced vaccines that have prevented hospitalizations and saved millions of lives. Beyond vaccines, the U.S. bioeconomy represents at least 5 percent of the U.S. economy as a whole, and the coming century will see much more growth—and peer competition. There is a right way and a wrong way to oversee U.S. research responsibly, and ensuring that the United States cannot compete internationally or deliver new biotechnology products is the wrong way.
– Gigi Kwik Gronvall is an associate professor in the Department of Environmental Health and Engineering at the Johns Hopkins Bloomberg School of Public Health and senior scholar at the Johns Hopkins Center for Health Security. An immunologist by training, she has written extensively about synthetic biology, preparations for bioterrorism, and the contested origin of SARS-CoV-2. She is also a member of the U.S. State Department’s International Security Advisory Board.
Published courtesy of Lawfare.