Two teams of researchers, nearly 40 scientists in all, have announced they will "pause" work on easily transmissible avian influenza viruses while they address concerns that publication of their studies might enable bioterrorists to create superlethal, highly contagious viruses of their own.
Even if publication of the studies is cancelled, hackers could still obtain the information by targeting the researchers directly, through spear phishing or some form of advanced persistent threat. Given that, holding back the details needed to create the virus may produce only a false sense of security.
Should we fear for the copies of the research that might already have been distributed prior to its planned publication? Will withholding the information only disadvantage "hobby" bioengineers while terrorists and state actors gain access through active means?
The U.S. National Science Advisory Board for Biosecurity (NSABB) recently recommended that the journals "Nature" and "Science" remove certain details from controversial studies on the avian influenza virus (H5N1) to minimize the risk that the findings would be misused by would-be bioterrorists. The World Health Organization plans to hold a meeting next month to discuss issues related to publication of the two papers.
Here's the issue: Scientists working with avian flu have created strains that are highly contagious in ferrets and might be, or become, transmissible to humans through droplets produced by coughing and through person-to-person contact. H5N1 is generally not so easily transmitted, and because of the virus's pandemic potential such a mutation is cause for great concern.
Public health and security officials fear that information about the experiments might enable bioterrorists or rogue states to create their own versions of a deadly flu that could be passed easily from human to human. Such a novel virus could spread to pandemic proportions before a vaccine could be released. Multiple strains could be developed and released simultaneously, increasing the potential morbidity and mortality.
One of the scientists working on the engineered virus, Ron Fouchier of the Erasmus Medical Centre in Rotterdam, the Netherlands, described it as "probably one of the most dangerous viruses you can make."
The announcement, made Monday in a letter published in the journals "Science" and "Nature," said the scientists would pause their research for 60 days while the security concerns are addressed.
While this is not yet an issue for the average IT department, it does show how research data that could be dangerous in the wrong hands is coming under increasing scrutiny. My guess is that many companies hold such information -- often without realizing it. Companies may want to start quietly looking for it -- things such as engineering plans, product formulations and hazardous-materials data -- to see whether it is protected well enough to withstand a well-motivated attack.
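A first pass at that kind of quiet inventory can be as simple as a keyword sweep over file shares. The sketch below is a minimal, hypothetical example in Python -- the term list and the plain-text scan are assumptions for illustration, and a real audit would use organization-specific terms (project code names, chemical names, part numbers) and a proper data-discovery tool.

```python
import os
import re

# Hypothetical keyword list for illustration; substitute terms that are
# actually sensitive in your organization.
SENSITIVE_TERMS = [
    r"\bconfidential\b",
    r"\btrade\s+secret\b",
    r"\bformulation\b",
    r"\bhazardous\s+material",
]
PATTERN = re.compile("|".join(SENSITIVE_TERMS), re.IGNORECASE)

def scan_tree(root):
    """Walk a directory tree and return paths of files whose text
    mentions any of the sensitive terms."""
    hits = []
    for dirpath, _dirnames, filenames in os.walk(root):
        for name in filenames:
            path = os.path.join(dirpath, name)
            try:
                # errors="ignore" lets us skim binary files without crashing
                with open(path, encoding="utf-8", errors="ignore") as fh:
                    if PATTERN.search(fh.read()):
                        hits.append(path)
            except OSError:
                continue  # unreadable file; skip it
    return hits
```

The point of such a sweep is not protection itself but visibility: knowing where the dangerous data lives is the prerequisite for deciding how strongly to defend it.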
Poor security practices threaten not just competitive information and trade secrets; they can present public safety hazards as well.