Well, not entirely. There was a curmudgeonly economist named Hyman Minsky (1919-1996). We're not economists and have learned about him only second-hand, after the fact, when what he said before the fact was borne out by the facts. One source we recently listened to was the BBC Radio program Analysis (listen to or download the March 24 program).
Hyman Minsky; Levy Economics Institute
While fancy economists were building their mathematical 'models' of economic behavior, which were very intricate and detailed, ordinary people and the bankers who misled them were venturing hither and thither for the quick kills. Minsky, basically out of the mainstream, was warning in less technical but actually far more relevant and correct ways that stability builds instability. As the Levy Economics Institute described his ideas in brief,
Minsky held that, over a prolonged period of prosperity, investors take on more and more risk, until lending exceeds what borrowers can pay off from their incoming revenues. When overindebted investors are forced to sell even their less-speculative positions to make good on their loans, markets spiral lower and create a severe demand for cash—an event that has come to be known as a "Minsky moment."

In the recent crisis, confidence in quick-profit investments was so great that people became careless and built their hopes and McMansions of sand. When what amounted to a grand, expanding Ponzi scheme finally collapsed, disaster struck for many (except those who could use the legal system to basically buy their way out of going to jail).
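Minsky never reduced this to a formal model (that was rather his point), but purely as an illustrative caricature the dynamic can be sketched in a few lines of code; every number and parameter below is invented for illustration, not drawn from Minsky or from any real data.

```python
# A toy caricature of "stability breeds instability": leverage creeps up while
# debt stays serviceable, until debt service outruns income and forced selling
# crashes asset prices. Every parameter is invented purely for illustration;
# this is NOT Minsky's own model (he never wrote one down).

income = 100.0       # borrowers' yearly revenue (arbitrary units)
debt = 50.0          # outstanding loans
asset_price = 1.0    # price of the speculative asset
rate = 0.08          # interest rate on the debt

for year in range(1, 31):
    calm = debt * rate <= income          # can borrowers still service the debt?
    if calm:
        debt *= 1.25                      # stability encourages more borrowing
        asset_price *= 1.10               # cheap credit bids prices up
    else:
        debt *= 0.70                      # forced deleveraging
        asset_price *= 0.60               # fire sales crash the market
    status = "calm" if calm else "Minsky moment"
    print(f"year {year:2d}  debt={debt:8.1f}  price={asset_price:6.2f}  {status}")
```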
Minsky was just independent-thinking enough to be definitely outside what policy and university circles generally tolerate, and he died before the 2008 crash, so he never saw his ideas vindicated. They were subsequently adopted with post hoc enthusiasm, of course, by the very same prophets whose wisdom had led us to what actually happened (that is, they didn't lose their university, bank, or think-tank jobs). Minsky is now apparently appearing with some prominence in new editions of economics textbooks (that such books keep being published is perhaps a sign of total professional shamelessness, but that's another story).
On the radio discussion, the point was made that the Professionals, Those Who Know, have become ever more enamored of computer modeling, mathematical theory, simulations, and all the paraphernalia of technical 'science'. In our highly risk-averse, technophilic, bureaucratized world, this passes for wisdom, while mainly verbal arguments (like Minsky's) are dismissed as soft-headed. If you want to be published, get tenure, or reach the next step on the think-tank or Wall Street status ladder, you had better be very technical, and do things very narrowly and with elegant mathematics. That this doesn't work, and that it's known not to work, doesn't seem to matter ("well, it will work this time!").
This is a characteristic of our culture in our scientific age. Reduction to technicality is what our institutions, reporters, governments, funders, advisors, and the like admire. And that viewpoint has its tentacles elsewhere, too.
The same in evolution and genetics
Like 19th century economists, Charles Darwin gave biologists their version of the truth. It was a very broad theory, based on the traits of organisms. The traits were what counted, not their underlying biological mechanism. The argument was conceptual, with an implied quantitative basis. Darwin actually viewed it as a mathematical theory, much like Newton's theory of universal gravitation, but the mathematical details were unimportant.
Many scientists want to formalize such a theory to give it support and the elegance of mathematics, but in fact Darwin's own idea about the underlying basis ('gemmules' and 'pangenesis') was basically wrong. Evolutionary theory proceeded well without any such basis and, indeed, today most biologists don't know or understand the mathematical claimant for the theory (called population genetics).
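To give a flavor of what that formal machinery looks like, here is the standard one-locus, two-allele textbook recursion for allele-frequency change under selection; it is a generic illustration of population genetics, not anything Darwin himself wrote down:

\[
p' \;=\; \frac{p^{2} w_{AA} + p q\, w_{Aa}}{\bar{w}},
\qquad
\bar{w} \;=\; p^{2} w_{AA} + 2 p q\, w_{Aa} + q^{2} w_{aa},
\]

where \(p\) and \(q = 1 - p\) are the frequencies of alleles \(A\) and \(a\), and the \(w\)'s are the relative fitnesses of the three genotypes. The recursion is exact, but only under assumptions (random mating, constant fitnesses, effectively infinite population size) that real populations routinely violate.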
What the last 50 years have done is to attempt to reduce evolution to molecular and mathematical precision. In particular, as genomic technologies have themselves evolved as dramatically as anything that ever happened to life, there has been a love affair, or infatuation, with technology, as if it were answering the basic questions about life. Genetics does, indeed, illuminate many fundamentals about some aspects of life, but as we and many others have written extensively, it does not provide the global or precise kind of prediction that physics-envy would suggest. Still, an enormous superstructure based on molecular and computer technology is being built on countless studies of minute details, while many facts are ignored or dismissed, such as the often poor predictive power from genotype to trait, which runs contrary to the unstated causal assumption of genes as the fundamental 'instructions' of life. Again, what we are seeing is reduction to technicality.
Hiding behind minutiae
Both areas shared the same sort of retreat to the depths of minutiae to establish their apparent profundity of understanding, wisdom, and influence. Over-arching, larger-scale understanding, rather unrelated to much of the minutiae, gets no attention: it's not technical and hence not glamorous enough. The minutiae sound deeply important, and so the professions themselves, those who report their activities to the general public, and those who provide the funds for those activities are impressed, buffaloed, intimidated, or otherwise persuaded. But the diving into technical minutiae is a kind of bathos, one that often does not seem to be much constrained by, and often simply bypasses, what we already know and what may even be obvious (as in economics).
These are just two areas in which one can draw such parallels; they are undoubtedly widespread across many areas of our society, in science, semi-science, the arts, and so on. It does seem to be true that every culture has its traits, or themes, or belief systems. In ours, it's a belief in technology, and in particular computing technology. Technology changes our lives, mainly for the better. But that it can solve many technical problems does not mean it leads to greater understanding. Mathematics, whatever the truth of Galileo's claim that it is the language in which God wrote the universe, is fantastically useful and precise when you can write equations whose assumptions are sufficiently accurate for your needs. It can lead to outcomes that can be tested specifically.
But when the assumptions and structures (e.g., equations) one constructs yield exact outcomes, those outcomes are really nothing more than a rewording of the assumptions. That is, the deductions are contained within the assumptions and structures one chose to begin with. There is no guarantee that the deductions represent the real world, unless the assumptions do. Indeed, inaccuracies in the assumptions and in the choice of structures can easily lead to unconstrained inaccuracies in the deductions, relative to the actual world. The appearance of elegance and insight can be illusory even in theory. (We might note here that the current kerfuffle over attempts to reinstate scientific racism also exemplifies this kind of selective invocation of technical details or methods, while ignoring more general countervailing facts that are well-known or obvious.)
The formal testability of mathematical predictions is often equated to, or confused with, proof of the assumptions on which it is based. But they are assumptions, and if they are inaccurate your results will be precisely inaccurate. Even matching predictions under such circumstances can, but need not, imply underlying truth. What is assumed to be causal can merely be correlated with what is truly causal, for example.
Further, even when mathematical models and theories are thought to be precisely true (that is, assumed to be so), results from actual studies will rarely match predictions perfectly. There will be human measurement and other technical errors, for example. So how do we deal with these? We use statistical or other sorts of tests to judge whether the results match the predictions. As we've written about before, we must rely on subjectively chosen tests of adequacy, like a statistical significance level. Superficial aspects of truth may pass such tests in a convincing way, but that doesn't mean the deeper, broader truths are being understood.
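As a small, entirely hypothetical illustration of that last point, consider data generated by one process and fitted with a quite different, wrong model; the numbers and the choice of models below are invented only to show the logic:

```python
# Hypothetical illustration: the "true" process is exponential growth, but we fit
# a straight line. The linear fit sails through a conventional significance test,
# yet its core assumption about the data-generating process is simply false.
import numpy as np
from scipy.stats import linregress

rng = np.random.default_rng(0)
x = np.linspace(0, 4, 50)
y = np.exp(x) + rng.normal(scale=2.0, size=x.size)   # truth: exponential + noise

fit = linregress(x, y)                                # assumption: linear
print(f"slope = {fit.slope:.2f}, r^2 = {fit.rvalue**2:.3f}, p = {fit.pvalue:.2e}")
# The p-value is vanishingly small and r^2 is high, so the linear model "passes",
# but it misdescribes the underlying process and will extrapolate badly.
```

Passing the test says only that the fitted line tracks the data over the range observed; it says nothing about whether the linear assumption itself is true.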
Worse than assuming that deviations of results from predictions are just technical errors is the natural tendency to design studies and interpret results in obliviousness to, or willful ignorance of, countervailing knowledge or facts. We do this all the time in science, even though we shouldn't. Economists pretended everyone was a rational, perceptive, value-calculating machine, when it was manifestly obvious that we are not. Evolutionary geneticists assume Nature is a perfect screening machine, when it manifestly is not.
Verbal arguments can be global and true, but they are not so easy to turn into specific predictions, hence their lower status than high-level technology. Yet ultimately science rests on verbal, which is to say conceptual, understanding. Clearly in both economics and genetics (and who knows how many other fields?), we are in love with technology and use it for many reasons, delving deeper than our actual understanding allows. Often that will generate findings or surprising facts that stimulate broader thinking, but just as often even we scientists, enmeshed in the daily routine (rut?) of our careers, have a hard time telling the difference.
We're human and we need our self-respect, sense of importance, salaries and retirement benefits, ego-stroking, and just plain sense that we are doing something of value and importance to our fellow humans. We are all vulnerable to overlooking or circumventing deeper truths by hiding in minutiae that masquerade as truth, in order to attain those needs. It happens all the time. Usually, it doesn't matter very much. But when misplaced claims of insight are uttered too charismatically, become intercalated into too many societal vested interests, or are taken too seriously, then society can be in for a very rough ride to pay for its credulousness. None of this is new, but if we are creatures who learn from experience, why don't we, or can't we, learn from our long history?
We are products of our culture. One law of Nature may be that we cannot override it.
Comments

This reduction to technicality, exemplified by theoretical population genetics (the core of the Modern Synthesis), was inherited by evolutionary ecology, which gave rise not only to evolutionary psychology but also to evolutionary game theory, optimal foraging theory, life history theory, and a number of optimization models based on genic reductionism. Plasticity, when recognized, was assumed to be adaptive; the organism, a rational maximizer. While this approach has been fruitful for certain questions and certain species, it has its limits, especially for behaviorally flexible, socially learning species like humans.
One might be bold enough to assert that if humans evolved to be anything behaviorally, it was to assess circumstances and make decisions. How genomes enable this is an important question, but not being behaviorally hard-wired would seem to have manifest fitness advantages and to make a species much less vulnerable to changing circumstances.
Someone else also predicted the financial crisis :)
http://www.safehaven.com/article/2423/the-flation-debate
http://www.safehaven.com/article/2478/nonlinearity-memory-effect-and-the-markets
In fact, some of my predictions (massive deflation) are still in the future. The central bankers are destroying billions of dollars to fight what is inevitable.