A recent popular book called Bad Science, by British op-ed columnist and physician Ben Goldacre, presents a clear and readable treatment of the way in which poor-quality science, intentional misrepresentation, or poor understanding of science on the part of consumers, journalists, companies, and even some scientists leads to, well, bad science.
He takes on those who claim that homeopathy works, he discusses controversies like the idea that MMR vaccines are dangerous, and he shows how Pharma manipulates and maneuvers with its bottom line, rather than fidelity to facts (or to society at large), in mind. He points out ways in which study designs and statistical interpretations can represent, or misrepresent, a causal situation. It's a good read, and it would be particularly valuable for students in any field of science (social science as well as genetics, medicine, or public health).
Cost of Opportunity Patrols (COPs)
One of the issues Goldacre raises has to do not with whether a given bit of science is bad, intentionally or not, but with the consequences. After all, scientists tend to dismiss much of the criticism leveled at them, and at over-claims and the like, by saying that science is a self-correcting process. Incorrect results, whether dishonest or inadvertent, fail to have influence, or mistakes are caught because people try to replicate the results, fail, and then have to figure out why.
But that is only a partly good argument; it probably does more to defend the status quo against stronger oversight and stricter standards than to justify tolerance of bad science. Of course, we can't catch all bad science when it's done, because much of science is far too specialized, costly, and complex for an experiment to be checked or replicated. Bad results do eventually lose any luster they might have had, but history shows that that can take decades or (in the case of things like alchemy and bloodletting) centuries.
The issue Goldacre raises is one that we have raised as well, here on MT and elsewhere. It is this: for every direction science takes, there is a cost. That cost can be seen in the form of the lost opportunity to do something else, that might have panned out better. The lost opportunity isn't just, or perhaps isn't mainly, due to 'bad science', but to the consequences of decisions to invest here rather than there, no matter how honorable and legitimate the science itself is.
We have very little in the way of Cost of Opportunity Patrols to look out for and identify investments that should be diverted elsewhere, either because they are too costly, too unlikely to have the payoff the funders expect (e.g., real health improvements), too marginally incremental, or simply because there are more urgent issues that need solving, or that are more likely to actually be solved in the foreseeable future.
In reality, such COPs would either not be tolerated, or would be manned (and womanned) by the same people who do the work that needs COPing. Science can be judged best by specialists, but they have to be disinterested, and these days such people are in limited supply.
Further, suppose the COPs really tried to stop a major direction of science and redirect the funds elsewhere. The same kind of waste would arise in the new area (because all the same people would flock to it, as happens now whenever something potentially new and fundable arises), often providing rationales that give their interests apparent 'fit' to the new area. It's a hydra phenomenon. But can we be blamed? The problem is that our livelihoods are at stake, and we've become so specialized that we can hardly even open a door with a new kind of handle.
Worse, the political opposition to loss of funds by current recipients would make bureaucrats shake in their boots, lest Senators quietly threaten to curtail their budgets if the change were made. History shows that bureaucrats back away from real change, or rationalize inaction and delay. So rather than doing something different, the pressure is to hyper-hype what you're doing now.
We are investing large amounts of money in genetically related projects like biobanks and GWAS that we and others think ought to be outed by the COPs, because they use a lot of funds for little payoff relative to what could be gained by focusing on diseases that really are genetic, at least until we show that genetic knowledge really can lead to the elimination of such diseases. Right now the record is pretty weak in that department, but it should be, and could be, much better.
The problem is that while peer review is, to a great extent, vested-interest review through and through, finding better directions to pursue involves both new sets of similar vested interests and, from a scientific viewpoint, true uncertainties. As everybody knows very well, the riskiest kind of grant to apply for is the one proposing the riskiest ideas. How can we really know that this or that specific new direction would pay off better, per dollar spent, than GWAS or whatever else will? After all, none are complete busts.
Scientists love to rationalize their low-payoff work by saying that the next century will show that our fine insights led to a worldwide transformation. One example we recently heard is a defense of the expense of the Large Hadron Collider (physicists are already hyping mega-projects beyond the discovery of the shy Higgs boson expected this year, to keep those bosons spinning). After all, they say, Maxwell worked out the theory of electromagnetism, and that led to the transformation of the world by electric power and electronics a century later. Think of your iPod and microwave oven. The argument that the money could feed large numbers of children and give them longer, productive lives is simply dismissed as if it were irrelevant (is it?).
Of course, there are such examples, and it's understandable that physicists would make use of them. It's the same way NASA justifies its budget for searching for life on Alpha Centauri (or Mars): because the moon project gave us Tang and Teflon and the profits from Apollo 13, The Movie.
Maybe a solid historiographical analysis of the past would give us criteria by which to set the COPs in motion: to know when to stop or cut back on projects that have been around for a while, and shift the investment elsewhere. Maybe even just shaking up the system so it cannot become entrenched would do that to a useful degree. Not, of course, as long as faculty salaries depend on grants, and universities on grant overhead money. But this kind of reform would be no different from letting industries thrive or shrivel depending on whether they generate good products that somebody wants, which seems perfectly wonderful in our society today. Science, by contrast, really shouldn't be driven just by a short-term bottom line, but we clearly see the need for COPs where fields really do become stodgy, despite the excited reviews written to lobby for increased support.
It's easy to complain about limited successes in science when serious science attacks tough problems that are truly hard to solve, as those in genetics and evolution are. But the more the hype, the more grandiose the promise, and the more the funds, the greater the accountability should be.