The four-part In Our Time series with Melvyn Bragg on the history of Britain's Royal Society, which aired last week on BBC Radio 4, was a fascinating reprise of the history of science as a reflection of society. The Royal Society was founded under King Charles II in 1660, right after the English Civil War, in part, as Bragg said, "to interrogate Nature herself, for the improvement of man's lot". The King also explicitly hoped that the new organization of learned men would help enrich the nation.
Science has always needed patrons, and in the past, especially in the US, these have included philanthropists. But the twentieth century saw a "great shift [in science] away from the academy towards industry and war," such that science has become so expensive that only the richest of patrons can fund it now. Generally, that means the government--the military, or agencies with explicit health- or science-related mandates--but private companies are also in on the game. Pharmaceutical companies are big players, but chemical and food manufacturers and computer engineering firms are up there, too.
One of the speakers on the In Our Time episode aired on January 7 described the typical scientist these days as not a professor or a grad student or a post-doc, but someone managing a medical trial run by a private contract research organization, probably a multinational, with the trial itself probably happening somewhere in the Third World. Science now follows the money. And it's planned, directed, and applied.
But, as pointed out on the program, many of the best scientists would say that their own great work was unpredictable, and couldn't have been preconceived and proposed in advance to a funding agency. If science is only responsive to immediate needs, then the kind of high-risk, high-payoff work that has been so successful in the past isn't going to get done.
Many scientists working away in the incremental, business-as-usual-with-a-minor-tweak world--and that means most of us--often argue that continued funding will lead to serendipitous discoveries, as though this justified the current system. But such discoveries mainly come from the quiet brilliance of (generally young) investigators concentrating hard on focused questions, not from bureaucratic cogs. Our system doesn't particularly favor the former, but it does favor the latter, who learn how to 'game' it for resources.
But does this matter after all? Driven in part by military research, technology has been advancing as never before in history. And public health and pharmaceuticals have made major contributions to improving quality and length of life, at least in populations that can afford them. Perhaps the idea of the eccentric genius, toiling away alone in the lab, thinking big thoughts and making accidental discoveries, is now an anachronism. Maybe the Large Hadron Collider, if the powers-that-be finally allow it to crank up to speed, as hugely costly and well-planned as it has been (except for things like bits of baguette dropping in to gum up the works), will actually make unexpected discoveries, giving the lie to the idea that the best discoveries can't be predicted. Perhaps major advances in quality of life now require political action, in terms of resource distribution and so on, rather than scientific work.
But we don't think so. The view from within makes it hard to see alternative landscapes -- in the same way that a short-sighted view can seduce us into thinking we're at The End of History. There are enough rewards in the grant system for enough people that it's hard to rock the boat. But, as we've written before (e.g., here), it wouldn't take huge changes to improve things greatly. Basic changes to the grant system and to the system of rewards in academia could go a long way toward encouraging innovative thinking.