Tuesday, July 31, 2018

Thinking about science upon entering the field. IV. Finale

Here is the fourth and final post in a four-part series by Tristan Cofer, a graduate student in chemical ecology here at Penn State.  He has been thinking about the profession he is being trained for, the broader setting in which that training is taking place, and the place he will have in it:

For my final entry in this series, I would like to revisit some ideas from my earlier posts, as they pertain to a book that I recently finished, called ‘What is Real?’ (Basic Books, 2018) by Adam Becker. The book recounts quantum theory’s formative years during the early twentieth century, focusing as much on the personalities that were involved in the theory’s development as on the science itself.

Becker devotes much of the book to the 1927 Solvay Conference, which gathered twenty-nine of the world’s leading physicists to discuss the newly formulated theory. Attendees at the conference were divided into two ideologically distinct groups. In the majority were Werner Heisenberg, Max Born, and others who had adopted the Danish physicist Niels Bohr’s ‘Copenhagen interpretation’.

Influenced by Heisenberg’s ‘uncertainty principle’, Bohr claimed that subatomic entities had ‘complementary’ properties that could never be measured at the same time. Electrons, for example, behaved like ‘particles’ or ‘waves’ depending on the experiment. To Bohr, this implied that electrons, photons, and other subatomic entities had only probabilities, not definite properties, until they were measured. ‘Reality’ simply did not exist in the quantum world. It was therefore pointless to talk about what was happening at the quantum level, since quantum theory could not describe the way the world ‘is’.

On the other side of the aisle were Louis de Broglie, Erwin Schrödinger, and Albert Einstein, who were adamant that physical systems were ‘real’ whether we acknowledged them or not. Led by Einstein, this group argued that although considerable advances had been made in developing quantum theory, it was hardly complete. Rather than do away with reality at the quantum level, Einstein et al. suggested that hidden processes, such as de Broglie’s ‘pilot waves’, could explain apparent contradictions such as wave–particle duality.

In the end, Bohr’s instrumentalist view won the day over Einstein’s realist one. Quantum mechanics was deemed a closed theory, no longer susceptible to change. Einstein and his supporters were largely ignored, and Einstein himself was painted as an out-of-touch curmudgeon who simply would not accept the new theory. At least, that is how the story has been told over the past several decades. Becker, however, gives a slightly different account. He argues that the Copenhagen interpretation’s popularity had less to do with its epistemological value than with the cult of personality surrounding its architect, Niels Bohr.

Bohr was a ‘physicists’ physicist’ and the preeminent scientist of his time. In contrast to Einstein (who described himself as a ‘one-horse cart’), Bohr collaborated with other physicists throughout his career and mentored many others at his institute in Copenhagen, where he enjoyed considerable financial support from the Danish government. According to Becker, Bohr’s social influence, together with the convoluted and sometimes confusing way that he expressed himself, led many to revere him as a near mythical figure. Indeed, in one particularly telling passage, Becker quotes Bohr’s former student John Archibald Wheeler, who compared Bohr to ‘Confucius and Buddha, Jesus and Pericles, Erasmus and Lincoln’.

‘What is Real?’ serves as an important cautionary tale. While we want to believe that science advances only through its devotion to empirical fact, many ‘facts’ are decided upon not by what they say, but by who says them. We each belong to a ‘thought collective’ with fixed ideas that prevent us from seeing things objectively. Competing ideologies are quickly swept under the rug and forgotten. Indeed, in my experience, I have found that students are rarely exposed to the histories and philosophies that have shaped their respective disciplines. Do we all have our own ‘Copenhagen interpretation’, firmly enshrined in a scaffolding of tradition and convenience? I suspect that we do. To borrow a line from Daniel C. Dennett’s ‘Darwin’s Dangerous Idea’: ‘There is no such thing as philosophy-free science; there is only science whose philosophical baggage is taken on board without examination’.

Sunday, July 15, 2018

The problems are in physics, too!

We write in MT mainly about genetics and how it is used, misused, perceived, and applied these days.  That has been our own profession, and we've hoped to make cogent critiques that (if anybody paid any attention) might lead to improvement.  At least, we hope that changes could lead to far less greed, costly herd-like me-too research, and false public promises (e.g., 'precision genomic medicine')--and hence to much greater progress.

But if biology has problems, perhaps physics, with its solid mathematical foundation for testing theory, might help us see ways to a more adequate understanding.  Yes, we had physics envy!  Surely, unlike biology, the physical sciences are at least mathematically rigorous.  Unlike biology, things in the physical cosmos are, as Newton argued in his famous Principia Mathematica, replicable: make an observation in a local setting, like your lab, and it will apply everywhere.  So, if the cosmos has the Newtonian property of replicability, and the Galilean property of laws written in the language of mathematics--properties that were at the heart of the Enlightenment period's foundation of modern science--then of course biologists (including even the innumerate Darwin) have had implicit physics envy.  And for more than a century we have borrowed concepts and methods in the hope of regularizing and explaining biology in the same way that the physical world is described.  Not the least of the implications of this is a rather deterministic view of evolution (e.g., of force-like natural selection) and of genetic causation.

This history has, we think, often reflected a poverty of better fundamental ideas specific to biology.  Quarks, planets, and galaxies don't fight back against their conditions, the way organisms do!  Evolution, and hence life, are, after all, at the relevant level of resolution, fundamentally based on local variation and its non-replicability.  Even Darwin was far more deterministic, in a physics-influenced way, than a careful consideration of evolution and variation warrants--and the idea of 'precision genomic medicine', so widely parroted by people who should know better (or who are faddishly chasing funds), flies in the face of what we actually know about life and evolution, and of the fundamental differences between physics and biology.

Or so we thought!
Well, a fine new book by Sabine Hossenfelder, called Lost in Math, has given us a reality check if ever there was one.

In what is surely our culpable over-simplification, we would say that Hossenfelder shows that at the current level of frontier science, even physics is not so unambiguously mathematically rigorous as its reputation would have us believe.  Indeed, we'd say that she shows that physicists sometimes--often? routinely?--favor elegant mathematics over what is actually known.  That sounds rather similar to the way we favor simple, often deterministic ideas about life and disease and their evolution, based on statistical methods that assume away the messiness that is biology.  Maybe both sciences are too wedded to selling their trade to the public?  Or are there deeper issues about existence itself?

Hossenfelder eloquently makes many points about relevant ways to improve physics, and many are in the category of the sociology or 'political economics' of science--the money, hierarchies, power, vested interests, and so on.  These are points we have harped on here and elsewhere in regard to the biomedical research establishment.  Perhaps she doesn't even stress them enough in regard to physics.  But when careers, including faculty salaries themselves, depend on grants and publication counts, and when research costs (and the 'overhead' they generate) are large and feed the bureaucracy, one can't be surprised at the problems, nor that, as a result, science itself--the context for these socioeconomic factors--suffers.  Physics may require grand-scale expenses (huge colliders, etc.), but genetics has been playing copy-cat for decades now in that respect, entrenching open-ended Big Data projects.  One can debate--we do debate--whether this is paying off in actual progress.

Science is a human endeavor, of course, and we're all vain and needy.  Hossenfelder characterizes these aspects of the physics world, but we see strikingly similar issues in genomics and related 'omics areas.  We're sure, too, that physicists are like geneticists in the way that we behave like sheep relative to fads, while only some few are truly insightful.  Perhaps we can't entirely rid ourselves of the practical, often fiscal distractions from proper research.  But the problems have been getting systematically and palpably worse in recent decades, as we have directly experienced.  This has set the precedent and pattern for strategizing science, to grab long-term big-cost support, and so on.  Hossenfelder documents the same sorts of things in the physics world.

Adrift in Genetics
In genetics, we do not generally have deterministic forces or causation.  Genotypes are seen as determining probabilities of disease or other traits of interest.  It is not entirely clear why we have reached this state of affairs.  For example, in Mendel's foundational theory, alleles at genes (as we now call them) were transmitted with regular probabilities, but once inherited their causative effects were deterministic.  The discovery of the genetics of sexual reproduction--one chromosome set inherited from each parent, and one set transmitted to each offspring--showed why this could be the case.  The idea of independent, atomic units of causation made sense, and was consistent with the developing sciences of physics and chemistry in Mendel's time, as he knew from lectures he attended in Vienna.

However, Mendel carefully selected clearly segregating traits to study, and knew that not all traits behaved this way.  So an 'atomic' theory of biological causation was in a sense following 19th-century scientific advances (or fads), and was in that sense forced onto selective data.  It was later used by the 'modern evolutionary synthesis' of the early twentieth century to rationalize non-segregating traits.  But it was a theory that, in a sense, 'atomized' genetic causation in a physics-like way, with the number of causal alleles essentially determining the quantitative value of a trait in the organism.  This was very scientific in the sense of science at the time.

Today, by contrast, the GWAS approach treats even genetic causation itself, not just its transmission, as somehow probabilistic.  The reasons for this are badly under-studied and often rationalized, but might in reality be at the core of what would be a proper theory of genetic causation.  One can, after the fact, rationalize genotype-based trait 'probabilities', but this is in deep ways wrong: it borrows from physics the idea of replicability, and then equates retrospective induction (the results in a sample of individuals with or without a disease, for example) with prospective risks.  That is, it tacitly assumes a kind of gene-by-gene deterministic probability.  One deep fallacy in this is the assumption that a gene's effects can be isolated; but genes are in themselves inert: only by interacting do DNA segments 'do' anything.  Far worse--one may say epistemologically worse, if not fatal--is that we know that future conditions in life, unlike those in the cosmos, are not continuous, deterministic, or predictable.

That is, extending induction to deduction is tacitly assumed in genomics, but it is an unjustified convenience.  Indeed, we know that the prevalence of traits like stature or disease changes with time, along with literally unpredictable future lifestyle exposures and mutations.  So assuming a law-like extensibility from induction to deduction is neither theoretically nor practically justifiable.
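The point can be made concrete with a toy simulation (all risk numbers here are entirely invented for illustration, not real epidemiology): a 'risk' estimated inductively for a genotype under one exposure regime simply fails to carry forward when lifestyle exposures shift, even though the genotype itself is unchanged.

```python
import random

random.seed(42)

def has_disease(genotype, exposed):
    # Hypothetical model: risk depends jointly on genotype AND exposure.
    p = 0.05 + 0.10 * genotype + 0.30 * exposed
    return random.random() < p

def observed_risk(genotype, exposure_rate, n=100_000):
    # 'Inductive' estimate: the case fraction among carriers of this
    # genotype, under a given population exposure rate.
    cases = sum(has_disease(genotype, random.random() < exposure_rate)
                for _ in range(n))
    return cases / n

past = observed_risk(genotype=1, exposure_rate=0.2)    # estimated from past data
future = observed_risk(genotype=1, exposure_rate=0.6)  # same genotype, new lifestyle

# The inductive estimate was a fact about the old exposure distribution,
# not a parameter of the genotype itself, so the two fractions differ.
print(past, future)
```

In this sketch the 'risk of the genotype' goes from roughly 0.21 to roughly 0.33 solely because the environment moved, which is exactly why projecting a retrospective fraction forward as a law-like parameter is unjustified.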

But, to an extent we found quite surprising (being naive about physics), what we do in crude ways in genetics much resembles the way physics rationalizes its various post hoc models to explain the phenomena outlined in Hossenfelder's book.  Our behavior seems strikingly similar to what Lost in Math shows about physics, but perhaps with a profound difference.

Lost in statistics
Genetic risk is expressed statistically (see polygenic risk scores, for example).  Somehow, genotypes affect not the inevitability but the probability that the bearer will have a given trait or disease.  Those are not really probabilities, however, but retrospective averages estimated by induction (i.e., from present-day samples that reflect past experience).  Only by equating induction with deduction, and averages with inherent parameters that take the form of probabilities, can we turn mapping results into 'precision' genomic predictions (which seems to assume, rather nonsensically, that the probability is a parameter that can be measured with asymptotic precision).
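For readers unfamiliar with the mechanics, a polygenic risk score is, at bottom, just a weighted sum of allele counts.  The SNP names and effect sizes below are invented for illustration only, not real GWAS estimates:

```python
# Hypothetical per-allele log-odds weights, as a GWAS might report them.
gwas_effects = {
    "rsA": 0.12,
    "rsB": -0.05,
    "rsC": 0.30,
}

def polygenic_score(genotype):
    # genotype maps each SNP to a count (0, 1, or 2) of the risk allele.
    return sum(gwas_effects[snp] * count for snp, count in genotype.items())

person = {"rsA": 2, "rsB": 1, "rsC": 0}
print(round(polygenic_score(person), 2))  # 0.12*2 - 0.05*1 + 0.30*0 = 0.19
```

The score is then typically converted into a 'risk' by comparing it with the score distribution in some reference sample, which is precisely the induction-as-deduction step in question.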

For example, if a fraction p of people with a given genotype in our study have disease x, there is no reason to think that they were all at the same 'risk', much less that in some future sample the fraction will be the same.  So in what sense, in biology at least, is a probability an inherent parameter?  If it isn't, what is the basis of equating induction with deduction, even probabilistically?
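A toy simulation (again, purely invented numbers) makes the point: a cohort in which everyone truly has risk 0.2, and a cohort in which a quarter of people have risk 0.8 and the rest have risk 0, both yield an observed fraction p of about 0.2, so the sample fraction by itself cannot tell us what any individual's 'risk' is.

```python
import random

random.seed(1)

def case_fraction(risks):
    # Fraction affected when person i has true individual risk risks[i].
    return sum(random.random() < r for r in risks) / len(risks)

n = 100_000
uniform = [0.2] * n                              # everyone at the same risk
mixed = [0.8] * (n // 4) + [0.0] * (3 * n // 4)  # heterogeneous, same mean

# Both cohorts show p close to 0.2, yet individual 'risks' differ wildly.
print(case_fraction(uniform), case_fraction(mixed))
```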

There is, we think, an even deeper problem.  Statistics, the way we bandy the term about, is historically largely borrowed from the physical sciences, where sampling and measurement issues affect precision--and where, we think profoundly, phenomena are believed to be truly replicable.  We'd like to ask Dr Hossenfelder about this, but we, at least, think that statistics developed in physics largely to deal with measurement issues when rigorously deterministic parameters were being estimated.  Even in quantum physics, probabilities seem to be treated as true underlying parameters, at least in the sense of being observational aspects of measuring deterministic phenomena (well, don't quote us on this!).

But these properties are precisely what we do not have in biology.  Biology is based on evolution, which is inherently based on variation and its relation to local conditions over long time periods.  This does not even consider the vagaries of (sssh!) somatic mutation, which makes even 'constitutive' genotypes, the basic data of this field, an illusion of unknowable imprecision (they differ uniquely with individual, age, tissue, and environmental exposure).

In this sense, we're also Lost in Statistics.  Our borrowing of scientific notions from the history of the physical sciences, including statistics and probability, is a sign that we really have not yet developed an adequate, much less a mature, theory of biology.  Physics envy, even were physics not itself lost in math, has been a pied piper for the evolutionary and genetic sciences, a product of the course of science history.  It is made worse by the herd-like behavior of human activities, especially under the kinds of careerist pressures that have been built into the academic enterprise.  Yet the profession seems not even to recognize this, much less seriously to address it!

Taking what we know in biology seriously
The problems are real, and while they'll never be entirely fixed, because we're only human, they are deeply in need of reform.  We've been making these points for a long time in relation to genetics, but perhaps naively didn't realize that similar issues affect physics, a field that appears, at least to the outsider, much more rigorous.

Nonetheless, we do think that the replicability aspects of physics, even with its frontier uncertainties, make it more mathematically--more parametrically--tractable compared to evolution and genetics, because the latter depend on non-replication.  This is fundamental, and we think suggests the need for really new concepts and methods, rather than ones essentially borrowed from physics.

At a higher and more profound, if sociological, level, one can say that the research enterprise is lost in much more than math.  It will never be perfect; perhaps it can be improved, but that may require much deeper thinking than even physics requires.

This is just our view: take a serious look at Hossenfelder's assessment of physics, and think about it for yourself.