For decades, scientists have enjoyed reasonable freedom and control over their research and have been able to publish and share their work without much inconvenience. Creative freedom in science is much like that of an artist: both are often fueled by inspiration from other sources, a passion for a unique craft, and a natural curiosity. Within reasonable limits, artists and scientists had the world at their fingertips; as long as they were not causing societal disruption or engaging in illegal activity, their work was unregulated and free from state interference. Much of the 20th century was filled with astounding scientific advances and rapidly developing technology, as well as concerns about misuse. These concerns persist in the 21st century as science and technology continue to advance. With the continued growth of scientific knowledge and technological development, awareness of the risks associated with the misuse of that knowledge has increased significantly – especially in microbiological research.
Microbiological research threats emerged on the public radar when the anthrax strains used in the 2001 mailings to several United States government officials and private citizens were found to have originated from the United States Army Medical Research Institute of Infectious Diseases (USAMRIID) in Fort Detrick, Maryland. While senior biodefense researcher Dr. Bruce Ivins was the primary suspect in the anthrax mailings, mainly because of his unauthorized decontamination of several areas of USAMRIID, the extent of his involvement remains unknown today. Since then, scientists have been scrutinized for working on certain research topics and for publishing literature labeled as “sensitive.” Ron Fouchier, a scientist at Erasmus Medical Center in Rotterdam, the Netherlands, completed research and wrote a paper in 2011 on laboratory-created strains of H5N1 avian influenza. During the course of his research he faced pressure from the Dutch government because the paper contained potentially dangerous information that might, in effect, teach someone how to create a synthetic H5N1 strain. In 2012 the U.S. journal Science was set to publish the paper until the U.S. government stepped in to block publication. Eventually Fouchier and the National Science Advisory Board for Biosecurity (NSABB), an advisory committee to the United States government, reached a compromise: the paper could be published only if the sensitive information were removed. That decision prompted proposals to create a system, accessible only to “responsible scientists,” where the removed sensitive information could be viewed. But who is responsible for deciding which scientists are responsible? And what makes one scientist more responsible than another? Which qualities would one use to measure a scientist’s reliability: Credentials? Previous research? Educational background? The absence of a criminal record?
While it is an interesting point to consider, society cannot make these decisions based on arbitrary methods of identification. There is no way to know whether the Harvard-educated, award-winning, highly skilled professor with a spotless criminal and driving record will be more trustworthy than the man who has not published any major papers, committed a misdemeanor in his freshman year of college, and has yet to contribute anything to the scientific community. In a radio interview for Science Friday, Dr. D.A. Henderson, a distinguished scientist and epidemiologist at the University of Pittsburgh Medical Center’s Center for Biosecurity, pondered what such a system would mean for the scientific community. Scientists might be turned down for grants or jobs arbitrarily, which would disrupt the fundamental tenets of scientific inquiry as well as the basic rights of those individuals, who would not understand why they were not chosen for access to the exclusive system.
Upon recognizing the possible dangers of publishing certain components of scientific research, the United States government assembled the NSABB in 2004: a panel of voting members with expertise in medicine, the life sciences, national security, and other related fields, charged with addressing issues related to biosecurity and dual-use research. Decisions made by the NSABB carry no legal authority; its findings are strictly advisory. However, the majority of scientific work in the United States is funded by government entities, and refusal to comply with the NSABB’s advice could result in the reduction or loss of funding. An NSABB decision, while in the best interest of national security and the safety of citizens, could have a chilling effect on research and advancement. Knowing that one’s research may be abridged to omit sensitive details, or blocked from publication altogether, could discourage scientists from publishing – or even attempting – certain types of research. History has shown that open access to published scientific research contributes to many advancements and breakthroughs; science is a field in which breakthroughs are built upon past innovations and discoveries. Restricting the publication of research could therefore hamper scientific progress in the long run.
There is no question that sensitive scientific information needs to be watched closely, but there does not seem to be a plausible solution to the problem at this time. The new restrictions and regulations on scientific research are intended to protect national security, but at what point does national security encroach on the right of free speech? At what point do we allow national security concerns to impede the scientific process upon which so many societal advancements are based? This debate not only has technical implications, but poses an ethical quandary as well.
As is the case with many ethical debates, there is no perfect solution. A sound strategy begins with the heavy involvement of the scientific community in the discussion, and fortunately, its members are engaging on this topic. A 2007 study analyzed literature on the ethics of biodefense and dual-use research of concern drawn from the Medical Literature Analysis and Retrieval System (MEDLINE) database, which holds bibliographic information for academic science journals. Ten articles met the inclusion criteria, and the study concluded that self-regulation within the scientific community, international cooperation, and increased security were the top three suggestions for minimizing the risks posed by dual-use research. Conscientious self-regulation would allow scientists to oversee their own research and associated literature without compromising the quality of their publications. International cooperation would unite a larger group of scientists who share these concerns. Finally, such cooperation establishes stronger safety and security measures through focused peer review. Combined, these three measures would make the misuse of sensitive scientific information more difficult for those with access to it; paired with increased safety education and clearer definitions of dual-use research, they could further reduce the risks of misused science.
Herfst, S., Schrauwen, E. J., Linster, M., Chutinimitkul, S., de Wit, E., Munster, V. J., & Fouchier, R. A. (2012). Airborne transmission of influenza A/H5N1 virus between ferrets. Science, 336(6088), 1534–1541.
Anand, N. S. (Producer). (2012, January 6). Debate persists over publishing bird flu studies [Audio podcast]. Retrieved from http://www.sciencefriday.com/segment/01/06/2012/debate-persists-over-publishing-bird-flu-studies.html
Dolgitser, M. (2007). Minimization of the risks posed by dual-use research: A structured. Journal of the American Biological Safety Association, 12(3), 175. Retrieved from http://www.absa.org/abj/abj/071203dolgitser.pdf