Tuesday, July 15, 2014

More on IRBs and restraint on science

Stories over the last couple of days lead us to interrupt our series on natural selection with this brief post on research and ethical conundrums. It's a continuation of a couple of posts from last week (here and here) about IRBs (ethical review committees), bioethics and the idea of societal restraints on what scientists do or are permitted to be funded to do.

Nobody likes restraint, but science is supported by the public and science also has important public implications.  Nothing human is perfect or without potential down sides, including risks.  If society expects to benefit from new knowledge, it will have to pay for things that go nowhere and will also have to assume some risk.  The problem is how to identify work that shouldn't be done, or paid for, in the first place, and how to assess the risks of the work that is.
Our posts last week were about the indomitable scientist whose controversial work on engineering viruses to make them more dangerous--gain-of-function experiments--went on despite disagreement about the public health consequences of the work.  Is it more important to protect public health by exploring the viruses in greater detail, and thus enable the manufacture of better vaccines, or to do absolutely everything possible to prevent the public health disaster that could ensue if the viruses were to escape the lab--that is, not to make the organisms in the first place?

Again, scientists are basically no more or less honorable than others in our society, and our society isn't exactly famous for its collective unity.  To the contrary, in a selfish society like ours, when scientists have an idea and there may be money to be made (commercially or in grant funds), or potentially great public benefit, they are going to do whatever they can think up to get around rules that might stymie the objects of their desires.  That may include shading the truth, not being as clear or forthcoming as they might, or outright obfuscating.  Whatever works.  We're great at that--as can be seen in research papers (often, buried in the massive 'Supplemental' material!).  If you think that's not how things work, what planet do you live on?

But by coincidence, just since our posts about IRBs, infectious disease and science ethics, several significant and relevant events have come to light.  Six vials of smallpox virus were found forgotten in a lab storage room at the NIH campus outside Washington, having been there since 1954 (as reported by infectious disease writer Maryn McKenna in one of her fine series on this issue), when research into vaccines against the disease was underway, and before smallpox was eradicated; the last case on Earth was seen in 1978.  The vials were sent to the Centers for Disease Control and Prevention (CDC) in Atlanta, where it was discovered that 2 of them contained viable virus; McKenna reports that the samples will be destroyed after they've been thoroughly analyzed.

Officially, only 2 labs in the world still have smallpox samples: the CDC and a lab in Siberia.  The rationale for maintaining these stockpiles is that they would quickly enable whatever research would be necessary if the disease were to reappear -- presumably through biological warfare or terrorism rather than accidental release, though the NIH discovery does now put the latter possibility on the table.

But then, in a widely reported story, the CDC disclosed a lax safety culture in its own infectious disease laboratories, with lapses that potentially exposed workers to anthrax and led to the shipment of dangerous flu virus to another lab.  The labs have been closed at least temporarily and external review has been requested.  No one was hurt--this time.

Researchers do need to send potentially dangerous samples to collaborators, and to work on them in their own labs (where, of course, employees do the actual work, not the investigators).  The problem is that if or when an accident does occur, it could be of massively awful proportions.  The problem isn't new -- indeed, McKenna links to a 2007 piece in USA Today reporting a long list of accidents in US labs handling "deadly germs".  Where is the line, and how do we draw it, to balance the self- or selfish interest of scientists, the proper concern of government for public health, the potential for personal or corporate gain, the potentially major benefit to society, and the risks due to culpable avoidance of ethical standards (such as getting around the IRBs) or to ordinary human fallibility?

Life involves risks as well as gains.  But unfortunately, risky research requires regulatory decisions by insiders -- those with the knowledge, but also with a conflict of interest, because it means regulating themselves.  This is an area in which the public doesn't seem to have adequate means to be the guardian of its own interests.  No obvious solution comes to mind.
