Mega-omics!
We're in the marketing age, make no mistake. In life science it's the Age of Omni-omics. Instead of innovation, which both capitalism and science are supposed to exemplify, we are in the age of relentless aping. Ever since genetics became genomics with the largesse of the Human Genome Project, we've been awash in 'omics': proteomics, exomics, nutriomics, and the like. The Omicists knew a good thing when they saw it: huge mega-science budgets justified with omic-scale rhetoric. But you ain't seen nothing yet!
Now, according to a story in the NY Times, we have the Human Connectome Project. This is the audacious, or is it bodacious, and certainly grandiose grab for funds that will attempt to visualize and hence computerize the entire wiring system of the brain. Well, of some aspect of some brains, that is: of a set of lab mouse brains. The idea is to use high-resolution microscopy to record every brain connection.
This is technophilia beyond anything seen in the literature of mythical love, more than Paris for Helen by far. The work is being done by a consortium, so different mice will be scanned in different labs, and these will be inbred lab mice, with all that goes with their at least partial artificiality. The idea that this, orders of magnitude more complex than genomes, will be of use is doubted even by some of the scientists involved... though of course they tout their megabucks project highly--who wouldn't?!
Eat your heart out, li'l mouse!
One might make satire of the cute coarseness of the scientists who, having opened up a living (but hopefully anesthetized) mouse to perfuse its heart with chemicals that prepare the brain for later sectioning and imaging, occasionally munch on mouse chow as they work. Murine Doritos! Apparently, as long as the mouse is knocked out, you can do what you want with it (I wonder if anyone argues about whether mice feel pain, as we are now forced to acknowledge that fish do?).
This project is easy to criticize in an era of high unemployment, people being tossed out of their homes, and the undermining of welfare for those who need it; and in the health field itself... well, you already know the state of health care in this country. But no matter: this fundamental science will some day, perhaps, help out some well-off patrons who develop neurological disease.
On the other hand, it's going to happen, and you're going to pay for it, so could there be something deeper afoot, something with significant implications beyond the welfare of a few university labs?
But what more than Baloney-omics might this mean?
The Enlightenment period that began in Europe in the 18th century, building on international shipping and trade, on various practical inventions, and on the scientific transformations due to people like Galileo and Newton, Descartes and Bacon, and others, ushered in the idea that empiricism rather than Deep Thought was the way to understand the world. Deep Thought had been, in a sense, the modus operandi of science since classical Greek thought had established itself in our Western tradition.
The Enlightenment changed that: to know the world you had to make empirical observations, and some criteria for doing so were established. There were, indeed, natural laws of the physical universe, but they had to be understood not in ideal terms but through the messiness of observational and experimental data. A major criterion for grasping a law of nature was to isolate variables and repeatedly observe them under controlled conditions. Empirical induction of this kind would lead to generalization, but it required highly specific hypotheses to be tested--what has since come to be called 'the scientific method'. It has been de rigueur for science, including life science, ever since. But is that changing as a result of technology, the industrialization of science, and the Megabucks Megamethod?
If complexity on the scale of what we are now addressing is where our culture's focus has turned, then perhaps a switch to this kind of science reflects a recognition that reductionism is not working the way it did for the couple of centuries after its Enlightenment launching. Integrating many factors that can each vary into coherent or 'emergent' wholes is not straightforward, and simply enumerating the factors may not yield a satisfactory understanding. Something more synthetic is needed: something that retains the reductionistic idea that the world is assembled from fundamental entities--atoms, functional genomic units, neural connections--but recognizes that to understand it we must somehow view it from 'above' the level of those units. This certainly seems to be the case, as many of our posts (rants?) on MT have tried to show. Perhaps the Omics Age is the de facto response, even a kind of conceptual shift that will profoundly change the human approach to knowledge.
The Connectome project has, naturally, a flashy web site, and is named 'human' presumably because that is how you hype it, make it seem like irresistible Disney entertainment, and get NIH to pay for it. But the vague ultimate goal and the necessity of making it a mega-Project may be yet another canary in the mine: an indicator that, informally and even sometimes formally, we are walking away from the scientific method, away from specific hypotheses, to a different kind of agnostic methodology. We acknowledge that we don't know what's going on but, because we can now aim to study everything-at-once, the preferred approach is to let the truth--whatever form it takes, and whether or not we can call it 'laws'--emerge on its own.
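To make the contrast concrete, here is a minimal, purely hypothetical sketch (in Python, with a random matrix standing in for real connectome data; none of this comes from the actual project): instead of testing a stated hypothesis, one hands the data to an algorithm and asks what structure, if any, emerges.

```python
# A toy illustration of 'agnostic', hypothesis-free analysis.
# The data are fabricated: a random matrix standing in for a connectivity
# table (100 'neurons', each with 50 connection strengths).
import numpy as np
from sklearn.cluster import KMeans

rng = np.random.default_rng(0)
connectivity = rng.random((100, 50))  # hypothetical stand-in data

# No prior hypothesis: simply ask for whatever grouping the data suggest.
labels = KMeans(n_clusters=3, n_init=10, random_state=0).fit_predict(connectivity)

print("Emergent group sizes:", np.bincount(labels))
```

Whether what 'emerges' from an exercise like this deserves to be called a law of nature is, of course, exactly the question.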
If that's what's happening, it will be a profound change in the culture of human knowledge, one that has crept subtly into Western thought.
2 comments:
I think that deep thinking and the ability to filter information are endangered skills. It will be interesting (and scary) to watch as the public begins to realize that each individual has many genomes, not just one... I am hopeful there is a strong minority that resists technology's silent [hostile] takeover of our brains and sees technology as a supplementary tool, not a substitute for critical thinking.
This post made me think about a book that's been on my list but which I haven't read yet: Nicholas Carr's "The Shallows: What the Internet Is Doing to Our Brains".
I'm not convinced by the endangered-skills view, if only because one wonders whether things were ever very different. Hasn't the ability to see a broader picture always been rather rare?
Anyway, Enlightenment science, and who knows what else, has programmed us to expect simple answers, but we've always known about the differences. Meanwhile, our society is very technophilic for various reasons, not least being that technology delivers many things we take as good--like Novocaine and (perhaps?) cell phones and the internet and huge-screen televisions to mesmerize us.
I share your hope about a resisting minority, but I'm not convinced it's real... yet. However, if technology doesn't succeed in predicting and fixing everything we don't like, we may get bored with it, or at least put it into some sort of better-balanced perspective.