Adam's sin
What else could we expect? While not part of Aristotelian or classical Greek scholarship, the Judeo-Christian understanding of the cosmos was that when Eve fell to temptation, ate the forbidden fruit of the tree of knowledge of good and evil and, in turn, tempted Adam, mankind's vision was blurred: in our sin, we lost the ability to see Nature clearly. Forever benighted, then, we had to turn to the religious authorities, and they (that is, mainly the Catholic authorities) had accepted the Aristotelian worldview as basically correct--and hence unchallengeable.
Rubens' Adam and Eve
Telescopes, microscopes, and clocks opened huge new frontiers, and their major impact was on what became modern science as we know it. The iconic instances are Galileo's use of the telescope to observe the moon and planets, and Newton's use of mathematics and more accurate astronomical data, together showing us that the accepted Aristotelian wisdom about the nature of the universe was inaccurate (or, sometimes, just plain wrong). Meanwhile, as astronomers gazed at and measured things big, the likes of Robert Hooke looked at things small. Devices like the vacuum pump (Hooke helped build one for Robert Boyle) contributed to a new understanding of chemistry and physiology.
Hooke's microscope, 1665
The Enlightenment period that these activities helped usher in was one in which empirical observation of Nature, rather than mere thought and dogma, became the watchword. We could understand--and control--the world through what we would now call technology. The idea was extended to the belief that by acting rationally, by being reasonable, we would come to understand. This became the belief about what science should be, and of course we still share that view today.
At the time, there was no substitute for technology: so much that our biological senses could not detect quickly became discernible to us. Chemistry, physics, astronomy, and biology all became formalized disciplines through various uses of technology; geology and evolutionary biology owed their modern origins in part to better navigation and global economies, and so on.
In those times, technology drove science because, in a sense, we were seeing things for the very first time, and there was so much to see. But in the process, technology became part of the scientific method: if you did not document something with appropriate detection and measurement technology, your 'theory' about Nature would be viewed as mere speculation. Even Darwin used microscopes, global travel technology, and his era's version of 'Big Data', along with globally expanded measures of geological formation and change, to develop his theory of evolution.
Instrumentation became more elaborate and expensive, and science gradually moved from private rich men's basement labs to universities and other institutionalized research centers. In a cultural sense, instrumentation became what was needed for cachet, as well as the vital tool for understanding the problems of the day, from physics to medicine. Generally, this was a correct understanding of the state of things, though one could argue that investigators thought up the ideas they did because of the instrumentation they had.
The clockwork universe
The Enlightenment idea was to find, through such instrumentation, that God's creation was a clockwork universe that, like the instruments being developed, worked universally, according to specific--and discoverable!--principles that gave certainty to cause-and-effect patterns, if only we knew the laws of Nature that drove them.
But the laws of the creation had been kept from our vision by the serpent who tricked Eve into her forbidden taste test. All that the fruit showed Adam and Eve was that they were naked and needed to find the nearest fig leaf. Now, thousands of years later, we flawed humans had developed ways to take the scales off our eyes and see what, but for that original sin, we would have been able to see all along.
The astounding realization, one might say, was that the universe was a clockwork place, a place of laws that applied always and everywhere and thus were ordained by God when the universe was created. What else could it mean? The success of the many great and lesser luminaries who avidly pursued instrument-based observations, and the exciting new ethos of empiricism before theory, enlivened an essentially new area of human endeavor: organized science and technology.
Next-generation sequencing as an icon of current thinking
This system has evolved rapidly and has integrated well with commercially based economies, so that multiple interdependencies have grown. We not only depend on technology for finding things not findable before; we use technology to drive what we study, even as a rationale. That is the system today. We drop various terms, knowingly, to justify or lend gravitas to our work. Thus 'next-generation' DNA sequencing becomes what you have to propose if you expect to get grant funding, even if the old technology could have answered your question. Indeed, that's how you get more DNA than the next fellow, so you can study what s/he cannot 'see', and then getting more becomes part of the point.
Older-generation Sanger sequencing; Wikipedia
The problem, for those of us who see it as a problem, is not the value of the technology where it is appropriate, but that it first of all determines the very questions we ask (or the questions the science tribe will let you get resources for), and secondly leads to repetitive, me-too, ever-larger but often not better-thought-out work. This costly imitative activity will reach, if it hasn't already, a point where the incremental new knowledge is as trivial as collecting yet another tropical flower or beetle would have been to the Victorians.
When the point of diminishing returns is reached, so is a kind of conceptual stagnation. Whether we're at such a point, or how close to it we are, is certainly widely debated. Is another, bigger GWAS, with sequence rather than marker data, going to deliver a body blow to, say, cancer or heart disease? Or will it just add some pin-striping to what we already know? Will identifying every neural connection in a brain be worth billions? Or will it be mainly another form of very costly beetle collection?
Where and how can we focus the incredibly powerful tools that we now have on things that we truly have never been able to see before? Are we pressing Enlightenment science to its useful limits, in need of better kinds of insight than we can get from ever more Big Data? Or does sexy instrumentation still deserve the central or even driving role it plays in today's science?
It is a cultural fact that we are so assured of, believe in, depend on, or hide behind (?) technology in our industrial-scale science system that technology has pride of place these days. That cultural fact gives a lot of us jobs and a feeling of importance. In many areas it also brings a feeling of profound discovery. Some of that really is profound. But just as profoundly, much of it is a waste of creative energy. Finding the right balance is something that will have to happen on its own, and will probably involve the chaos of the modern disputation sphere (such as blogs), since it can't be ordered up by some sort of administrative decision.
3 comments:
Leaving a link to our follow-up comment and a summary for the convenience of your readers -
http://www.homolog.us/blogs/blog/2013/08/20/enlightenment-technology-and-rise-of-democracy/
"In our opinion, the discussion of Ken Weiss missed a third dimension that played important role in the transformation of science. It is the change of society to full-blown democracy and complete acceptance of the democracy God. A reading of Oswald Spengler’s book, where he presented four stages of civilizations, would be helpful.
When we worked in theoretical physics, we saw a different process for the birth of new ideas than today's. Many new fields started with some unknown young physicist writing one paper that went unnoticed. Then he wrote another one, which was barely followed by his friends. Three or four papers later (and if he was lucky), an outsider took notice and contributed his own paper. If everything went well, after almost a decade, the original author could write a ‘Reviews of Modern Physics’ paper summarizing his achievements and those of others. Only then did the commoners find out that something exciting was going on. Compare that with democratic science, where we try to judge papers based on their immediate ‘success’ on Twitter.
Democracy has become so ingrained in our scientific fabric that we do not realize in how many ways it affects us. As an example, the reward system for scientists is based on government funding, but the government is becoming more and more democratic, or crowd-pleasing. The crowd is never pleased with esoteric science and likes to see something of ‘immediate importance’. As a result, scientists either lie in their grant applications (the death of Feynman’s rules for scientists), saying that their work will cure cancer, or move away from spending their lives on solving esoteric problems. What is wrong with scientists overstating their case in grant applications, as long as the work eventually helps society? It is that the culture of lying allows the entry of ‘social scientists’, who manage to redirect science elsewhere.
Democracy, which leads to short-termism, is not consistent with Enlightenment science."
Needless to add, the USA's decline is not reversible, but new NGS technologies will surely find use in proper science in a different part of the world. We strongly believe East Asia is that region.
To a great extent, the newcomer has the advantage of not having established infrastructure and cultural inertia to overcome. So if Asia is doing that, they have an edge.
On the other hand, the commercial system is such that Asia might just be copying us, only with all-new equipment. There are signs that China will just do more, and bigger--and I know some who say let them do that, while we innovate.
Time will tell. But right now we certainly have a system that reinforces a do-the-same-but-pretend-it's-different mentality.