Thomas Kuhn's 1962 book The Structure of Scientific Revolutions suggested that most of the time we practice 'normal' science, in which we take our current working theory--he called it a 'paradigm'--and try to learn as much as we can. We spend our time at the frontiers of knowledge, and at some point we have to work harder and harder to make facts fit the theory. Something is missing, we don't know what, but we insist on forcing the facts to fit.
Then, for reasons hard to account for, but regularly enough that Kuhn could outline it as a pattern (even if a rare one), someone has a major insight and shows how a totally unexpected new way of viewing things can account for the facts that had heretofore been so problematic. Everyone excitedly jumps onto the new bandwagon, and a 'paradigm shift' has occurred. Even then, some old facts may not be as well accounted for, or the new paradigm may only explain issues of contemporary concern, leaving older questions behind. But the herd follows rapidly, and an era of new 'normal science' begins.
The most famous paradigm shifts involve people like Newton and Galileo in classical physics, Darwin in biology, Einstein and relativity, and the discovery of continental drift. Because historians and philosophers of science have in a sense glamorized the rare genius who leads such changes, the term 'paradigm shift' has become almost pedestrian: we all naturally want to be living--and participating--in an important time in history, and far, far too many people declare paradigm shifts far too frequently (often humbly referring to their own work). It's become a kind of label to justify whatever one is doing, a lobbying tactic, or a bit of wishful thinking.
Is 'omics' a paradigm shift?
The idea, which grew out of the Enlightenment period in Europe starting about 400 years ago, was that empiricism (observation), rather than just thinking, was the secret to understanding the world. But pure empiricism--just gathering data--was rejected, in the sense that the facts were supposed to lead to theoretical generalizations, the discovery of the 'laws' of Nature, which is what science is all about. This led to the formation of the 'scientific method': forming hypotheses based on current theory, setting up studies specifically to test those hypotheses, and adjusting the theory according to the results.
The 17th-19th centuries were largely spent gathering data from around the world, a first rather extensive kind of exploration. But by the 20th century such 'Victorian beetle collection' was sneered at, and the view was that to do real science you must be constrained by orderly, hypothesis-driven research. Data alone would not reveal the theory.
With advances in molecular and computing technology, and with the complexity of life being documented, things changed. In the 'omic' era, which began with genomics, we are again enamored of massive data collection unburdened by the necessity to specify what we think is going on in any but the most generic terms. The first omics effort, sequencing the human genome, led to copy-cat omics of all sorts (microbiomics, nutrigenomics, proteomics, and so on) in which expensive and extensive technology is thrown at a problem in the hope that fundamental patterns will be revealed.
We now openly aver, if not brag, that we are not doing 'hypothesis-driven' research, as if there is now something wrong with having focused ideas! Indeed, we now often treat 'targeted' research as a kind of after-omics specialty activity. Whether this is good or not, I recently heard a speaker refer to the omics approach as a 'paradigm shift'. Is that justified?
We must acknowledge that, before we could even dream of genomic-scale DNA sequencing and the like, genetic function and the complexity of the genome had perplexed us in many ways. If we had no 'candidate' genes in mind--no specific genetic hypothesis--for some purpose, such as understanding a complex disease, but were convinced for some reason that genetic variation must be involved, what was the best way to find the gene(s)? The answer was to go back to 'Victorian beetle collection': just grab everything you can and hope the pieces fall into place. Given the new technology, there was a feeling of hope that this might help (even though we had many reasons to believe that we would find what we did indeed find, as some of us were writing even then).
The era of Big Science
Omics approaches are not just naked confessions of ignorance. If that were the case, one might say that we should not fund such largely purposeless research. No, more is involved. Since the Manhattan Project and a few others, it has not escaped scientists' attention that big, long, too-large-to-be-canceled projects can sequester huge amounts of funding. We shouldn't have to belabor the point here: the way universities and investigators, their salaries and careers, became dependent on, if not addicted to, external grants; the politics of getting started down a costly path, so that one can argue that to stop now would throw away the money invested so far (e.g., the current Higgs boson/Large Hadron Collider arguments?). Professors are not dummies, and they know how to strategize to secure funds!
It is fair to ask two questions here:
First, could something more beneficial have been done, perhaps for less cost, in some other way? Omics-scale research does of course lead to discoveries, at least some of which might otherwise not happen or might take much longer to occur. After the money's been spent and the hundred-author papers published in prestige journals, one can always look back, identify what's been found, and argue that it justifies the cost.
Second, is this approach likely to generate importantly transformative understanding of Nature? This is debatable, but many have said, and we generally agree, that the System created by Big Science is almost guaranteed to generate incremental rather than conceptually innovative results. (E.g., economist Tim Harford talked about this last week on BBC Radio 4, comparing the risk-taking science of the Howard Hughes Medical Institute with the safe and incremental science of the NIH.) Propose what's big in scale (to impress reviewers or reporters), but safe--you know you'll get some results! If you compare 250,000 diabetics to 500,000 non-diabetic controls and search for genetic differences across a genome of 3.1 billion nucleotides, you are bound to get some result (even if that result is that no gene stands out as a major causal factor). It is safe.
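To make that point concrete, here is a minimal sketch of what such a case-control genome scan amounts to. It is purely illustrative: the data are simulated with no true genetic effect at all, the number of variants is far smaller than a real scan, and the counts and thresholds are assumptions, not any actual study design. The point is simply that whether hits appear or not, there is always a 'result' to report.

```python
# Hypothetical illustration: a case-control scan over simulated SNPs with
# NO true disease association anywhere. Either some SNPs cross the
# significance threshold by chance, or none do -- and "no major gene stands
# out" is itself a reportable result.
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)

n_cases, n_controls = 250_000, 500_000   # scale taken from the text
n_snps = 10_000                          # stand-in for the millions actually tested

# Simulate allele frequencies; cases and controls are drawn from the SAME
# distribution, i.e., no variant truly affects disease risk.
freqs = rng.uniform(0.05, 0.5, n_snps)
case_counts = rng.binomial(2 * n_cases, freqs)       # minor-allele counts in cases
control_counts = rng.binomial(2 * n_controls, freqs) # and in controls

p_values = []
for ca, co in zip(case_counts, control_counts):
    # 2x2 allele-count table: minor vs. major allele, cases vs. controls
    table = [[ca, 2 * n_cases - ca],
             [co, 2 * n_controls - co]]
    _, p, _, _ = stats.chi2_contingency(table)
    p_values.append(p)

p_values = np.array(p_values)
threshold = 0.05 / n_snps  # Bonferroni correction for the number of tests
hits = (p_values < threshold).sum()

print(f"'Genome-wide significant' SNPs: {hits}")
print(f"Smallest p-value observed: {p_values.min():.2e}")
# With no real effects, hits is usually zero -- and that null finding is
# still a publishable outcome of the project.
```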
This is not a daring return on society's largesse, but it is largely the way things work these days. We post about this regularly, of course. The idea of permanent, factory-like, incremental, over-claimed, budget-inflated activity as the way to do science has become what too many feel is necessary in order to protect careers. Rarely do they admit this openly, of course, as it would be self-defeating. But it is well known, and almost universally acknowledged off the record, that this strategy of convenience seriously under-performs; it is simply the way business is done.
Hypothesis-free?
This sort of Big Science is often said to be 'hypothesis free'. That is a big turn away from classical Enlightenment science in which you had to state your theory and then test it. Indeed, this change itself has been called a 'paradigm shift'.
In fact, even the omics approach is not really theory- or hypothesis-free. It assumes, though this is often not stated in so many words, that genes do cause the trait and that the omics data will find them. It is hypothesis-free only in the sense that we don't have to say in advance which gene(s) we think are involved. Pleading ignorance has become accepted as a kind of insight.
For better or worse, this is certainly a change in how we do business, and it is also a change in our 'gestalt' or worldview about science. But it does not constitute a new paradigm about the nature of Nature! Nothing theoretical changes just because we now have factories that can systematically churn out reams of data. Indeed, the theories of life that we had decades ago, even a century ago, have not fundamentally changed, even though they remain incomplete and imperfect and we have enormous amounts of new understanding of genes and what they do.
The shift to 'omics' has generated masses of data we didn't have before. What good that will do remains to be seen, as does whether it is the right way to build a science Establishment that generates good for society. However that turns out, Big Science is certainly a strategy shift, but it has so far generated no sort of paradigm shift.
I happen to have been planning on discussing Kuhn and paradigms in one of my classes tomorrow - THIS IS LIKE THE MILLIONTH TIME YOUR BLOG HAS SERENDIPITOUSLY SAVED MY LIFE! Thanks!
This is good to hear!
Actually, the ideas were largely anticipated, in terms of how science works to use or overlook facts depending on how they fit a current working model ('paradigm' in Kuhn's term), by Ludwik Fleck in the 1930s (http://en.wikipedia.org/wiki/Ludwik_Fleck) in his book The Genesis and Development of a Scientific Fact.
Kuhn in some ways cribbed from Fleck without full (or, initially, any) acknowledgment.
Fleck dealt with medical issues such as how anatomy is presented and, most importantly, how facts were filtered in the search for understanding and diagnosis of syphilis. Fleck is worth reading, and sociologists and historians of science now point to him rather than Kuhn for these ideas. There has also been a lot of subsequent analysis of how science works.
But certainly Fleck (via Kuhn, and through Kuhn's own new contributions) has influenced not just the sociologists of science, but also the kind of self-importance scientists give themselves by referring to 'paradigm shift' far too often, and too often in regard to their own work!
Real paradigm shifts, to the extent that this is an appropriate term, are few and far between...
This is a paradigm changing post! Cheers!
Wishing to be living in (and a contributor to!) an important time in science is no different, at all, from Christians hungering to be living in an important time--the End Time, Armageddon!
Jesus told his disciples that they'd soon see him when he returned. I guess he exaggerated. Francis Collins and others seem to be promising similar Final success, and a lot of people are going along with him in that, and they're the ones proclaiming their involvement in a paradigm shift!
Nobody wants to be living in a Dark Age....
Your reply reminds me of a comparison between NIH and a cult. See here:
http://www.michaeleisen.org/blog/?p=1217
Very clever! Of course, it's a commentary on our sheepish, tribal human society generally, that we behave this way. Scientists are in this respect just like everyone else.