Friday, November 1, 2013

A better theory for biology: time for our own Solvay conferences?

In the early 19th century, it was dawning on many investigators that life could be explained in material rather than religious, creationist terms.  All sorts of speculations, some wildly wrong, some partially right, filled the air.  This set the stage for the two greats of the century, Mendel and Darwin.  Mendel's theory showed how inheritance worked as the transmission of discontinuous causal particles.  Darwin's theory showed how life as a material history connected these particles in a largely continuous flow of change.  Remarkably, these two ideas also accounted in principle for the complex, highly organized traits that characterize organisms.

But upon reflection the two views had major gaps, and they seemed inconsistent with each other.  Biological change was very gradual, and seemed basically continuous, but particulate inheritance caused 'jumps' that were hard to reconcile with gradual evolution.  It wasn't until the 1930s that the two were reconciled in the 'modern evolutionary synthesis'.  The molecular part of the theory was genetic; the evolutionary part, population genetics, was statistical.  In a sense the two views were consistent, but they retained a kind of duality, or complementarity, between the particulate and the statistical--one that, ironically, may have paralleled the dualities arising in physics at the same time (did those have any conceptual influence on biologists?).  And the duality was also widely treated as a real separation: quantitative geneticists did their work, such as agricultural breeding, without regard to evolution or specific genes, while Mendelians probed genes with clear effects and, like Mendel, largely bypassed the quantitative traits that weren't as amenable to experimental study.

Still, the two aspects of the synthesis together provided a remarkable, unified and consistent theory for the nature of life, its diversity, history, origins and continuity.  Many genes with small effects could together account for complex traits, while each gene was separately inherited as, indeed, a particulate cause. A gene with major effects would 'segregate' as a separate cause, but this would not be so for complex traits.

These theories were powerfully generic, but, remarkably, they said nothing at all about what traits would evolve, how many species there would be or of what size, or how adapted--and to what!  And without yet knowing about DNA, or how DNA works to bring about development, the theory remained largely indirect, many levels above the actual molecules.  That changed with genetic and genomic technologies.  But the theory has not really kept up with much of what the new data have revealed over recent decades (some of which we listed in the first post in this series, four days ago).

Has 20th Century theory been exhausted?
Indeed, we argue that what this theory can do for us may, because of the very advances it enabled, have reached the point of diminishing returns.  Perhaps it is too 'meta' for the phenomena, too indirect about the nature of causation, too descriptive of generalities.  Much of what is done these days is largely about documenting more cases and more details, revealing the predictable complexities but not explaining them, or their generic nature, well enough.  It is not paying back commensurately with the increasing cost of adding each new fillip to our knowledge.  How many times do we have to hear the frustrated acknowledgment that life is 'complex', offered almost as if it were an explanation and usually followed by a plea for larger, longer, more, and more costly work of the same sort?  GWAS wasn't enough: we need whole sequence data on everybody!

It is in this sense that we believe it is time for our science to dig down deeper into the nature of biological causality, and that we should at least consider whether some basically new way of viewing life may be possible, and perhaps even discernible, if we struggle appropriately with the facts as we know them.

There is natural resistance to any such suggestion from the community of scientists plying the current waters, who are dependent on the institutionalized research system they know and that pays them.  More generally, in most areas of human endeavor, not just science, people are comfortable with their current worldview and fear or resist potentially threatening new ones.  In this case, such feelings may include the argument that we can't order up genius, and that openly declaring the importance of searching might threaten the comfortable way of life we know.  We personally think, however, that facing the limits of the now century-old worldview in biology could be productive even if, yes, it would threaten business as usual; indeed, it should threaten it.  Yes, your university wants you to keep the tap flowing, and that system is hard to resist.  But if momentum can shift, the community will adapt.

Solvay for biology?
In the early 1900s, a series of conferences, called Solvay Conferences after their benefactor, were held periodically to discuss the most fundamental advances in physics.  The greats of the 20th century regularly attended, and major fundamental discoveries were presented and debated.  That was in part a reflection of the profoundly original thinking and the truly 'paradigm shifting', not just incremental, advances that were being made.

There has been some considerable back-chatter triggered by our recent series of posts on this subject.  Tweets have reflected various reactions, from the enthusiastic to the irritated, dismissive, or defensive.  But there are now some correspondents who are suggesting that it may be time for biology to face up to the explanatory ceiling our theory of life has reached, to assess what ideas may be afoot to enable biologists to delve deeper or generalize more effectively.

Perhaps through public social media like blogs, tweets, and other kinds of interactions, a virtual 'Solvay' conference can be held: an ongoing, explicit attempt not to promote our own work but to assess where we are--to face up to the anomalies, the clear facts that do not fit the current, increasingly exhausted framework without a major shoe-horn.

Actual physical meetings are also a possibility, where these issues could be seriously discussed--not as a mass-meeting of thousands the way most scientific association meetings are, nor as routine symposia, but in some format that really knocks hard at the door of better understanding.

Any such effort will generate a plethora of new ideas of all sorts.  But if it's done right, we would have to realize that the challenge is serious, and that most of the ideas will be wrong.  None may be correct.  Perhaps current theory will survive all challenges, or perhaps the time is not yet ripe.  But if there's no effort to bang on the door, the door won't be opened.  Maybe, just maybe, something profound will emerge.

To the extent that we have been in on the wave of thinking about such possibilities, or could use our MT blog as a part of the process, the pot-stirring we have tried to do will have achieved its objectives.  But time will tell.  There is a black hole of gravity to be overcome: the gravitational pull of the grant and publication system, and the usual resistance to change.  We'll see how things fall out.


Ed Hollox said...

Very interesting post. I remain optimistic and think most academics, good ones, keep the "bread and butter" research going, plying the current waters as you put it, while being self-critical and clinging on to the flotsam of bigger, broader ideas. In the end, we just have to keep on thinking.

Ken Weiss said...

Yes, and I guess my view is that the system does not put pressure on us to think more deeply. It is too competitive, too driven by money and score-counting, and too driven by grant cycles. Reform is possible though it can't guarantee really new thinking...but it could provide proper soil for it.

Bread and butter is the excuse we use, or perhaps are forced by the system to use, to keep in business, but I believe it drains real creativity; I must say the bread-and-butter argument serves as an excuse for not being, or even trying to be, more creative in one's thinking. That is too risky in our current system. And students are taught that from Day 1. What we sow is what we reap.

Ed Hollox said...

Very relevant to this, an interesting interview with Peter Higgs in the Guardian ...

Ken Weiss said...

Thanks much for pointing this out. Higgs is supposed to be a modest, good person. In my earlier days, prominent members of my profession (bioanthropology) may have published only a book or a few papers, yet maintained (properly) their standing and influence (even with few or no grants!).

But we, at least in my generation, let this monster--grant-based hyper-careerism--grow over the past few decades, because it was a good, munificent system when it opened up.

Now, it may require a 'Solvay' approach, with those whose personalities or situations, and abilities, allow them to do it, to buck the gravitational pull of the business-as-usual black hole.