Thursday, November 8, 2018

The horseshoe crab and the barnacle: induction vs deduction in evolution

Charles Darwin had incredible patience.  After his years-long, global voyage on the HMS Beagle, he nestled in at Down House, where he was somehow able to stay calm and study mere barnacles at seemingly endless length (and to write 4--yes, four--books on these little creatures).  Who else would have had the obsessive patience (or independent wealth and time on their hands) to do such a thing?

      From Darwin's books on barnacles (web image capture)
Darwin's meticulous work, and its context in his life and thinking, are very well described in Rebecca Stott's compelling 2003 book, Darwin and the Barnacle, which I highly recommend, as well as in Desmond and Moore's 1991 biography, Darwin: The Life of a Tormented Evolutionist.  For seeing the points I will describe here, these are easier going than plowing through Darwin's own detailed tomes (which, I openly confess, I have only browsed).  His years of meticulous barnacle study raised many questions in Darwin's mind about how species acquire their variation, and his pondering of this eventually led to his recognition of 'evolution' as the answer, which he published only years later, in 1859, in his Origin of Species.

Darwin was, if anything, a careful and cautious person, and not much given to self-promotion.  His works are laden with appropriate caveats including, one might surmise, careful defenses lest he be found to have made interpretive or theoretical mistakes.  Yet he dared make generalizations of the broadest kind.  It was his genius to see, in the overwhelming variation in nature, the material for understanding how natural processes, rather than creation events, led to the formation of new species.  This was implicitly true of his struggle to understand the wide variation within and among species of barnacles, variation that enabled evolution, as he later came to see. Yet the same variation provided a subtle trap:  it allowed escape from accusations of undocumented theorizing, but was so generic that in a sense it made his version of a theory of evolution almost unfalsifiable in principle.

But, in a subtle way, Mr Darwin, like all geniuses, was also a product of his time.  I think he took an implicitly Newtonian, deterministic view of natural selection.  As he said, selection could detect the 'smallest grain in the balance' [scale] of differences among organisms, that is, could evaluate and screen the tiniest amount of variation.  He had, I think, only a rudimentary sense of probability; while he often used the word 'chance' in the Origin, it was in a very casual sense, and I think that he did not really think of chance or luck (what we call genetic drift) as important in evolution.  This view, I would assert, persists widely, if largely implicitly, today.

One important aspect of barnacles to which Darwin paid extensive attention was their sexual diversity.  In particular, many species were hermaphroditic.  Indeed, in some species he found small, rudimentary males literally embedded for life within the body of the female.  Other species were more sexually dichotomous.  These patterns caught Darwin's attention.  In particular, he viewed this transect in evolutionary time (our present day) as more than just a catalog of today, but also as a cross-section of tomorrow.  He clearly thought that what we see today among barnacle species represents the path that some species had already taken towards the fully sexually dichotomous state (independent males and females): the intermediates were on their way to these subsequent stages.

This is a deterministic view of selection and evolution: "an hermaphrodite species must pass into a bisexual species by insensibly small stages" from single organisms having both male and female sex organs to the dichotomous state of separate males and females (Desmond and Moore: 356-7).

But what does 'must pass' mean here?  Yes, Darwin could array his specimens to show these various types of sexual dimorphism, but what would justify thinking of them as progressive 'stages'?  What latent assumption is being made?  It is to think of the different lifestyles as stages along a path leading to some final inevitable endpoint.

If this doesn't raise all sorts of questions in your mind, why not?  Why, for example, are there any intermediate barnacle species here today?  Over the eons of evolutionary time why haven't all of them long ago reached their final, presumably ideal and stable state?  What justifies the idea that the species with 'intermediate' sexuality in Darwin's collections are not just doing fine, on their way to no other particular end?  Is something wrong with their reproduction?  If so, how did they get here in the first place?  Why are there so many barnacle species today with their various reproductive strategies (states)?

Darwin's view was implicitly one of deterministic selection--heading towards a goal that today's species display in their various progressive stages.  This implicit view can be related to another, current controversy about evolution.

Rewinding the tape
There has, for decades now, been an argument about the degree of directedness or, one might say, predictability in evolution.  If evolution is the selection, among randomly generated mutational variants, of those whose survival and reproduction are locally favored at a given time, then wouldn't each such favored path be unique, none really replicable or predictable?

Not so, some biologists have argued!  Their view is essentially that environments are what they are, and will systematically--and thus predictably--favor certain kinds of adaptation.  There is, one might quip, only one way to make a cake in a particular environment.  Different mutations may arise, but only those that lead to cake-making will persist.  Thus, if we could 'rewind the tape' of evolution and go back to way back when, and start again, we would end up with the same sorts of adaptations that we see with the single play of the tape of life that we actually have. There would, so to speak, always be horseshoe crabs, even if we started over.  Yes, yes, some details might differ, but nothing important (depending, of course, on how carefully you look--see my 'Plus ça ne change pas', Evol. Anthropol, 2013, a point others have made, too).


Others argue that evolution is so rooted in local chance and contingency, that there would be no way to predict the details of what would evolve, could we start over at some point.  Yes, there would be creatures in each local niche, and there would be similarities to the extent that what we would see today would have to have been built from what genetic options were there yesterday, but there the similarity would end.

Induction, deduction, and the subtle implications of the notion of 'intermediate' forms
Stott's book, Darwin and the Barnacle, discusses Darwin's work in terms of the presumed intermediate barnacle stages he found.  But the very use of such terms carries subtle implications.  It conflates induction with deduction; it assumes that what is past will be repeated.  It makes of evolution what Darwin also made of it: a deterministic, force-like phenomenon.  Indeed, it's not so different from a form of creationism.

This has deeper implications.  Among them is the repeatability of environments and genomes, at least to the extent that their combination in local areas--life, after all, operates strictly on local areas--will be repeated in other places and at other times.  Only by assuming the repeatability not just of environments but also of genomic variation can one see, in the current states of barnacle species, stages in a predictable evolutionary parade.  The inductive argument is the observation of what happened in the past; the deductive argument is that what we see now is intermediate, on its way to becoming what some present-day, more 'advanced' stage is like.

This kind of view, which is implicitly and (as with Darwin) sometimes explicitly invoked, is that we can use the past to predict the future.  And yet we routinely teach that evolution is by its essential nature locally ad hoc and contingent, based on random mutations and genetic drift--and not driven by any outside God or other built-in specific creative force.

And 'force' seems to be an apt word here.

The idea that a trait found in fossils, intermediate between some more primitive state and something seen today, implies that a similar trait today could be an 'intermediate stage' on the way to a knowable tomorrow conflates inductive observation with deductive prediction.  It may indeed be so, but we have no way to prove it and usually scant reason to believe it.  Instead, equating induction with deduction tacitly assumes, usually without any rigorous justification, that life is a deductive phenomenon like gravity or chemical reactions.

The problem is serious: the routine equating of induction with deduction gives a false idea about how life works, even in the short term.  Does a given genotype, say, predict a particular disease in someone who carries it, because we find that genotype associated with affected patients today?  This may indeed be so, especially if a true causal reason is known; but it cannot be assumed to be.  We know this from well-observed recent history: secular trends in environmental factors with disease consequences have been documented, meaning that the same genotype is not always associated with the same risk.  There is no guarantee of a future repetition, not even in principle.

Darwin's worldview
Darwin was, in my view, a Newtonian in his outlook.  That was the prevailing science ethos in his time.  He accepted 'laws' of Nature and their infinitesimally precise action.  That Nature was law-like was a prevailing, and one may say fashionable, view at the time.  It was also applied to social evolution, for example, as in Marx's and Engels' view of the political inevitability of socialism.  That barnacles can evolve various kinds of sexual identities and arrangements doesn't mean any of what Darwin observed in them was on the way to full hermaphroditism, or even later to fully distinct sexes...or, indeed, to any particular state of sexuality.  But if you have a view like his, then seeing the intermediate stages, even contemporaneously, would reinforce the inevitabilist aspect of a Newtonian perspective, and would seem to justify using induction to make deductions.

Even giants like Darwin are products of their times, as all we peons are.  We gain comfort from equating induction with deduction, from the idea that the past we can observe allows us to predict the future.  That makes it feel comfortingly safe to make assertions, and gives us the sense that we understand the complex environment through which we must wend our way in life.  But in science, at least, we should know the emptiness of equating the past with the future.  Too bad we can't seem to see further.

Friday, October 19, 2018

Nyah, nyah! My study's bigger than your study!!

It looks like a food-fight at the Precision Corral!  Maybe the Big Data era is over!  That's because what we really seem to need (of course) is even bigger GWAS or other sorts of enumerative (or EnumerOmic) studies, because then (and only then) will we really learn how complex traits are caused, so that we can produce 'precision' genomic medicine to cure all that ails us.  After all, there is no such thing as enough 'data' or a big (and open-ended) enough study.  Of course, because so much knowledge....er, money....is at stake, such a food-fight is not just children in a sandbox, but purported adults, scientists even, wanting more money from you, the taxpayer (what else?).  The contest will never end on its own.  It will have to be ended from the outside, in one way or another, because it is predatory: it takes resources away from what might be focused, limited, but actually successful problem-solving research.

The idea that we need larger and larger GWAS studies, not to mention almost any other kind of 'omics enumerative study, reflects the deeper truth that we have no idea what to do with what we've got.  The easiest word to say is "more", because that keeps the fiscal flood gates open.  Just as preachers keep the plate full by promising redemption in the future--a future that, like an oasis to desert trekkers, can be a mirage never reached--scientists are modern preachers who've learned the tricks of the trade.  And, of course, since each group wants its flood gates to stay wide open, it must resist even the faintest suggestion that somebody else's gates might open wider.

There is a kind of desperate defense, as well as food fight, over the situation.  This, at least, is one way to view a recent exchange.  Boyle et al. (Cell 169(7):1177-86, 2017**) assert that a few key genes, perhaps with rare alleles, are the 'core' genes responsible for complex diseases, while lesser, often indirect or incidental, genes scattered across the genome provide other pathways that affect the trait and are what GWAS detect.  If a focus on this model were to take hold, it might threaten the gravy train of more traditional, more mindless, Big Data chasing.  As a plea to avoid that, Wray et al.'s falsely polite spitball in return (Cell 173:1573-80, 2018**) urges that things really are spread all over the genome, differently so in everyone.  Thus, of course, the really true answer is some statistical prediction method, after we have more and even larger studies.

Could it be, possibly, that this is at its root merely a defense of large statistical databases and Big Data per se, expressed as if it were a legitimate debate about biological causation?  Could it be that, for vested interests, if you have a well-funded hammer, everything can be presented as if it were a nail (or, rather, a bucket's worth of nails, scattered all over the place)?

Am I being snide here? 
Yes, of course. I'm not the Ultimate Authority to adjudicate about who's right, or what metric to use, or how many genome sites, in which individuals, can dance on the head of the same 'omics trait.  But I'm not just being snide.  One reason is that both the Boyle and Wray papers are right, as I'll explain.

The arguments seem in essence to assert that complex traits are due either to many genetic variants strewn across the genome, or to a few rare, larger-effect alleles here and there, complemented by nearby variants that may involve indirect pathways to the 'main' genes, all of it scattered across the genome ('omnigenic').  Or, alternatively, that we can tinker with GWAS results and various technical measurements derived from them to get at the real truth.

We are chasing our tails these days in an endless-seeming circle to see who can do the biggest and most detailed enumerative study, to find the most and tiniest of effects, with the most open-ended largesse, while Rome burns.  Rome, here, means the victims of the many diseases that might be studied, with actual positive therapeutic results, by more focused, if smaller, studies.  Or, in many cases, by a real effort at revealing and ameliorating the lifestyle exposures that typically, one might say overwhelmingly, are responsible for common diseases.

If, sadly, it were to turn out that there is no more integrative way, other than add-'em-up, by which genetic variants cause or predispose to disease, then at least we should know that and spend our research resources elsewhere, where they might do good for someone other than universities.  I actually happen to think that life is more integratively orderly than a mere enumerative sum of additive effects, and that more thoughtful approaches, indeed reflecting the findings of decades of GWAS data, might lead to better understanding of complex traits.  But this seemingly can't be achieved just by sampling extensively enough to estimate 'interactions'.  The interactions may, and I think probably do, have higher-level structure that can be addressed in other ways.

But if not, if these traits are as they seem, and there is no such simplifying understanding to be had, then let's come clean to the public and invest our resources in other ways to improve our lives, before these additive trivia add up to our end--when those supporting the work tire of the exaggerated promises.

Our scientific system, which we collectively let grow like mushrooms because it was good for our self-interest, puts us in a situation where we must sing for our supper (often literally, when investigators' salaries depend on grants).  No one can be surprised at the cacophony of top-of-the-voice arias ("Me-me-meeeee!").  Human systems can't be perfect, but they can be perfected.  At some point, perhaps we'll start doing that.  If it happens, it will only partly reflect the particular scientific questions at issue, because it's mainly about the underlying system itself.


**NOTE: We provide links to sources, but, yep, they are paywalled--unless you just want to see the abstract or have access to an academic library.  If you have the loony idea that, as a taxpayer, you have already paid for this research, so private selling of its results should be illegal--sorry!--that's not our society.

Thursday, October 18, 2018

When is a consistent account in science good enough?

We often want our accounts in science to be consistent with the facts.  Even if we can't explain all the current facts, we can always hope to say, truthfully, that our knowledge is imperfect but our current theory is at least largely true....or something close to that....until some new 'paradigm' replaces it.

It is also only natural to sneer at our forebears' primitive ideas, about which we, naturally, now know much better.  Flat earth?  Garden of Eden?  Phlebotomy?  Phlogiston?  Four humors?  Prester John, the mysterious Eastern emperor who would come to our rescue?  I mean, really!  Who could ever have believed such nonsense?

Prester John to the rescue (from Br Library--see Wikipedia entry)
In fact, leaders among our forebears accepted these ideas and much else like them, took them as real, and sought them for solace from life's cares, not just as promises (as with religious figures) but as earthly answers.  Or, to seem impressively knowledgeable, they found arcane ways to say "I dunno" without admitting it.  And, similarly, many used ad hoc 'explanations' for personal gain--as self-proclaimed gurus, promisers of relief from life's sorrows or medical woes (usually only after you crossed their palms with silver).

Even in my lifetime in science, I've seen forced after-the-fact 'explanations' of facts, and the way a genuine new insight can show how wrong those explanations were, because the new insight accounts for them more naturally or in terms of some other new facts, forces, or ideas.  Continental drift was one that had just come along in my graduate school days.  Evolution, relativity, and quantum mechanics are archetypes of really new ideas that transformed how our forebears had explained what is now our field of endeavor.

Such lore, and our broader lionizing of leading political, artistic, or other similarly transformative figures, organizes how we think.  In many ways it gives us a mythology, or ethnology, that leads us to order success into a hierarchy of brilliant insights.  This, in turn, and in our careerist society, provides an image to yearn for, a paradigm to justify our jobs, indeed our lives--to make them meaningful, important in some cosmic sense, and really worth living.

Indeed, even ordinary figures, from our parents to the police, generals, teachers, and politicians, have various levels of aura as idols or savior figures, who provide comforting answers to life's discomfiting questions.  It is natural for those burdened by worrisome questions to seek soothing answers.

But of course, all is temporary (unless you believe in eternal heavenly bliss).  Even if we truly believe we've made transformative discoveries or something like that during our lives, we know all is eventually dust.  In the bluntest possible sense, we know that the Earth will some day destruct and all our atoms scatter to form other cosmic structures.

But we live here and now and perhaps because we know all is temporary, many want to get theirs now, and we all must get at least some now--a salary to put food on the table at the very least.  And in an imperfect and sometimes frightening world, we want the comfort of experts who promise relief from life's material ills as much as preachers promise ultimate relief.  This is the mystique often given to, or taken by, medical professionals and other authority figures.  This is what 'precision genomic medicine' was designed, consciously or possibly just otherwise, to serve.

And we are in the age of science, the one True field (we seem to claim) that delivers only objectively true goods; but are we really very different from those in similar positions of other sorts of lore?  Is 'omics any different from other omnibus beliefs-du-jour?  Or do today's various 'omical incantations and promises of perfection (called 'precision') reveal that we are, after all, even in the age of science, only human and not much different from our typically patronized benighted forebears?

Suppose we acknowledge that the latter is, at least to a considerable extent, part of our truth.  Is there a way that we can better use, or better allocate, resources to make them more objectively dedicated to solving the actually soluble problems of life--for the public everyday good, and perhaps less used, as from past to present, to gild the thrones of those making the promises of eternal bliss?

Or does sociology, of science or any other aspect of human life, tell us that this is, simply, the way things are?

Wednesday, October 17, 2018

The maelstrom of science publishing: once you've read it, when should you shred it?

There is so much being published in the science literature--a veritable tsunami of results.  New journals are being started almost monthly, it seems, and mainly or only by for-profit companies.  There seems to be a Malthusian growth of the number of scientists, which has certainly produced a genuine explosion of research and knowledge, but the intense pressure on scientists to publish has perhaps changed the relative value of every paper.

And as I look at the ancient papers (that is, ones from 2016-17) that I've saved in my Must-Read folder, I see all sorts of things that, if they had actually been widely read, much less heeded, would mean that many papers being published today might not seem so original.  At least, new work might better reflect what we already know--or should know if we cared about or read that ancient literature.

At least I think that, satire aside, in the rush to publish what's truly new, as well as for professional score-counting and so on, and with the proliferating plethora of journals, the past is no longer prologue (sorry, Shakespeare!) as it once was and, one can argue, should still be.  The past is just the past; it doesn't seem to pay to recognize it, much less to heed it, except for strategic citation-in-passing reasons and because bibliography software can be used to winnow out citable papers so that reviewers of papers or grant applications won't be negative because their work wasn't cited.  You can judge for yourself whether this is being realistic or too cynical (perhaps both)!

The flux of science publishing is enormous for many reasons.  Not least is the expansion in the number of scientists.  But this is exacerbated by careerist score-counting criteria that have been growing like the proverbial Topsy in recent decades: the drive to get grants, bigger and bigger, longer and longer.  Often in biomedical sciences, at least, grants must include investigator salaries, so there is massive self-interest in enumerable 'productivity'.  The journals proliferate to fill this market, and of course to fill the publishers' coffers.  Too cynical?

Over the years, in part to deflate Old Boy networks, 'objective' criteria have come to include, besides grants garnered, a faculty member's number of papers, the ranking of the journals they're in, citation counts, and other 'impact factor' measures.  This grew in some ways also to feed the growing marketeering by vendors, who even provide score-counting tools, and by university bureaucracies.  More generally, it reflects the way middle-class life, the life most of us now lead, has become--attempts to earn status, praise, wealth, and so on by something measurable and therefore ostensibly objective.  Too cynical?

Indeed, it is now common for graduate students--or even undergrads--to attend careerism seminars: instruction in how to get published, how to get funded, how to work the System.  This may be good in a sense, or at least realistic, even if it was not so when, long ago, I was a graduate student.  It does, however, put strategizing rather than science up front, as a first-year learning priority.  One wonders how much time is lost that, in those bad old days, was spent thinking and learning about the science itself.  We were, for example, to spend our 2-year Master's program learning our field, only then to get into a lab and do original work, which was what a PhD was about.  It is fair to ask whether this is just a change in our means of being and doing, without effect on the science itself, or whether careerism is displacing, or even replacing, really creative science.  When is objection to change nothing more than nostalgic cynicism?

Is science more seriously 'productive' than it used to be?
Science journals have always been characterized largely by the minutiae they publish, because (besides old boy-ism) real, meaty, important results are hard to come by.  Most observation in the past, and most experiment these days, yields little more than curios.  You can see this by browsing decades-old volumes even of the major science journals.  The reports may be factually correct, but of minimal import.  Even though science has become a big industry rather than the idle rich's curiosity, most science publishing now, as in the past, might more or less still be vanity publishing.  Yet, as science has become more of a profession, there have been important advances, so it is not clear whether science now has a higher ratio of splash to substance than it did in the past.

So, even if science has become an institutionalized, established, middle-class industry, and most of us will go down and out, basically unknown in the history of our fields, that has probably always been the case.  Any other view probably is mainly retrospective selective bias: we read biographies of our forebears, making them seem few and far between, and all substantial heroes; but what we are reading is about those forebears who really did make a difference.  The odd beetle collector is lost to history (except maybe to historians, who themselves may be making their livings on arcane minutiae).  So if that's just reality, there is no need to sneer cynically at it.

More time and energy are taken up playing today's game than was the case, or was necessary, in the past--at least I think that is pretty clear, if impossible to prove.  Even within the chaff-cloud, lasting knowledge does seem to accumulate faster per year than it used to.  That seems real, but it reveals another reality.  We can only deal with so much.  With countless papers published weekly, indeed many of them reviews (so we don't have to bother reading the primary papers), overload is quick and can be overwhelming.

That may be cynical, but it's also a reality.  My Must-Read folder on my computer is simply over-stuffed, with perhaps a hundred or more papers that I 'Saved' every year.  When I went to try to clean my directory this morning, I was overwhelmed: what papers before, say, 2015 are still trustworthy, as reports or even as reviews of then-recent work?  Can one even take reviews seriously, or cite them or past primary papers without revealing one's out-of-dateness?  New work obviously can obsolesce prior reviews. Yet reviews make the flood of prior work at least partially manageable.  But would it be safer just to Google the subject if it might affect one's work today?  It is, at least, not just cynicism to ask.

Maybe to be safe, given this situation, there are two solutions:
1.  Just Google the subject and get the most recent papers and reviews; 
2.  There should be software that detects, and automatically shreds, papers in one's Science Download directory that haven't had any measurable impact in, say, 5 or (to be generous) 10 years (a rough sketch of the idea follows below).  We already have sites like Reddit, whose contents may not have a doomsday eraser.  But in science, to have mercy on our minds and our hard discs, what we need is Shred-it!
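In that spirit, here is a tongue-in-cheek, minimal sketch of what a "Shred-it" sweep might look like, assuming a hypothetical MustRead folder of PDFs; the 'measurable impact' test is left as a stub, since a real version would have to query some citation index:

```python
# A playful sketch of "Shred-it": flag PDFs in a (hypothetical) MustRead
# folder that haven't been touched in N years and show no sign of impact.
# The impact test is a stub; a real tool would need a citation-index lookup.
from pathlib import Path
import time

YEARS = 5                                   # or 10, to be generous
CUTOFF_SECONDS = YEARS * 365.25 * 24 * 3600
FOLDER = Path.home() / "MustRead"           # hypothetical folder name

def had_measurable_impact(pdf: Path) -> bool:
    """Stub: in a real tool, query a citation index here."""
    return False

if FOLDER.is_dir():
    now = time.time()
    for pdf in FOLDER.glob("*.pdf"):
        age = now - pdf.stat().st_mtime     # seconds since last modification
        if age > CUTOFF_SECONDS and not had_measurable_impact(pdf):
            print(f"Candidate for shredding: {pdf.name}")
            # pdf.unlink()                  # uncomment to actually shred it
```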

Tuesday, October 16, 2018

Where has all the thinking gone....long time passing?

Where did we get the idea that our entire nature, not just our embryological development but everything else, was pre-programmed by our genome?  After all, the very essence of Homo sapiens, compared to all other species, is that we use culture--language, tools, etc.--to do our business, rather than just our physical biology.  In a serious sense, we evolved to be free of our bodies; our genes made us freer from our genes than most if not all other species are!  And we evolved to live long enough to learn--language, technology, etc.--in order to live our thus-long lives.

Yet isn't an assumption of pre-programming the only assumption by which anyone could legitimately promise 'precision' genomic medicine?  Of course, Mendel's work, adopted by human geneticists over a century ago, allowed great progress in understanding how genes lead at least to the simpler of our traits, those with discrete (yes/no) manifestations, traits that do include many diseases that really, perhaps surprisingly, do behave in Mendelian fashion, and for which concepts like dominance and recessiveness have been applied and, sometimes, at least approximately hold up to closer scrutiny.

Even 100 years ago, agricultural and other geneticists who could do experiments largely confirmed the extension of Mendel to continuously varying traits, like blood pressure or height.  They reasoned that many genes (whatever they were, which was unknown at the time) contributed individually small effects.  If each gene had two states in the usual Aa/AA/aa classroom-example sense, but there were countless such genes, their joint action could approximate continuously varying traits whose measure was, say, the number of A alleles in an individual.  This view was also consistent with the observed correlation of trait measures with degree of kinship among relatives.  This history has been thoroughly documented.  But there are some bits, important bits, missing, especially when it comes to the fervor for Big Data 'omics analysis of human diseases and other traits.  In essence, we are still, a century later, conceptual prisoners of Mendel.
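To make that classical reasoning concrete, here is a minimal sketch (my illustration, not from the era's literature) of the additive model: each of many loci contributes 0, 1, or 2 copies of an 'A' allele, and the trait is simply the total count.  With enough loci, the sum looks smooth and roughly normal--the classical reconciliation of Mendelian genes with continuous variation.  All numbers are purely illustrative:

```python
# A minimal sketch of the classical additive (polygenic) model:
# many loci, each with alleles 'A' and 'a', each copy of 'A' adding a tiny
# amount to the trait.  Summing over enough loci yields a smooth, roughly
# normal distribution of trait values across individuals.
import random

N_LOCI = 500          # number of contributing genes (illustrative)
FREQ_A = 0.5          # frequency of the 'A' allele at each locus
N_PEOPLE = 10_000

def trait_value() -> int:
    # Each locus contributes 0, 1, or 2 copies of 'A'; the trait is the total count.
    return sum(
        (random.random() < FREQ_A) + (random.random() < FREQ_A)
        for _ in range(N_LOCI)
    )

values = [trait_value() for _ in range(N_PEOPLE)]
mean = sum(values) / N_PEOPLE
print(f"mean allele count ~ {mean:.1f} (expected {2 * N_LOCI * FREQ_A:.1f})")
```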

'Omics over the top: key questions generally ignored
Let us take GWAS (genomewide association studies) at face value.  GWAS find countless 'hits', sites of whatever sort across the genome whose variation affects variation in WhateverTrait you choose to map (everything simply must be 'genomic' or some other 'omic, no?).  WhateverTrait varies because every subject in your study has a different combination of contributing alleles.  Somewhat resembling classical Mendelian recessiveness, contributing alleles are found in cases as well as controls (or across the measured range of quantitative traits like stature or blood pressure), and the measured trait reflects how many A's one has: WhateverTrait is essentially the sum of A's in 'cases', which may be interpreted as a risk--some sort of 'probability' rather than certainty--of having been affected or of having the measured trait value.

We usually treat risk as a 'probability,' a single value, p, that applies to everyone with the same genotype.  Here, of course, no two subjects have exactly the same genotype, so some sort of aggregate risk score, adding up each person's 'hits', is assigned a p.  This, however, tacitly assumes that each site contributes some fixed risk, or 'probability', of affection.  But that treats these values as if they were essential to the site, each thus acting as a parameter of risk.  That is, sites are treated as carrying a kind of fixed value or, one might say, 'force' relative to the trait measure in question.
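For illustration only, here is a minimal sketch, under the usual additive assumptions, of how such an aggregate score is typically built: per-site weights ('betas') estimated from some past sample are treated as fixed parameters, the weighted allele counts are summed, and the sum is squeezed through a logistic transform to yield the assigned 'probability'.  The numbers, names, and the logistic mapping here are my assumptions, not anyone's published method:

```python
# A sketch of an additive polygenic risk score that treats per-site effects
# as fixed parameters: weights 'estimated' from a past sample are applied to
# a new person's allele counts, and the sum becomes an assigned 'probability'.
import math
import random

N_SITES = 1000

# Hypothetical per-site effect sizes ('betas'), as if estimated from a past GWAS.
betas = [random.gauss(0.0, 0.02) for _ in range(N_SITES)]

def risk_probability(genotype, baseline=-2.0):
    """Logistic transform of baseline + sum(beta_i * allele_count_i)."""
    score = baseline + sum(b * g for b, g in zip(betas, genotype))
    return 1.0 / (1.0 + math.exp(-score))

# One person's genotype: 0, 1, or 2 copies of the scored allele at each site.
person = [random.choice([0, 1, 2]) for _ in range(N_SITES)]
print(f"assigned 'risk' = {risk_probability(person):.3f}")
```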

One obvious and serious issue is that these are necessarily estimated from past data, that is, by induction from samples.  Not only is there sampling variation that usually is only crudely estimated by some standard statistical variation-related measure, but we know that the picture will be at least somewhat different in any other sample we might have chosen, not to mention other populations; and those who are actually candid about what they are doing know very well that the same people living in a different place or time would have different risks for the same trait.

No study is perfect, so we use some conveniently assumed, well-behaved regression/correction adjustments to account for the statistical 'noise' due to factors like age, sex, and unmeasured environmental effects.  Much worse than these issues, there are clear sources of imprecision, the obvious major one--taboo even to think about, much less to mention--being that relevant future factors (mutations, environments, lifestyles) are unknowable, even in principle.  So what we really do, are forced to do, is extend what the past was like to the assumed future.  And besides this, we don't count somatic changes (mutations arising in body tissues during life, which were not inherited), because they'd mess up our assertions of 'precision', and we can't measure them well in any case (so just shut one's eyes and pretend the ghost isn't in the house!).

All of these together mean that we are estimating risks from imperfect existing samples and past life-experience, but treating them as underlying parameters so that we can extend them to future samples.  What that does is equate induction with deduction, assuming the past is rigorously parametric and will be the same in the future;  but this is simply scientifically and epistemologically wrong, no matter how inconvenient it is to acknowledge this.  Mutations, genotypes, and environments of the future are simply unpredictable, even in principle.

None of this is a secret, or a new discovery, in any way.  What it is, is inconvenient truth.  These things, by themselves, and without even badgering investigators about the environmental factors that we know very well typically predominate, should have been enough to show that all the NIH's precision promises cannot be accurate ('precise'), or even accurate to a knowable degree.  Yet this 'precision' sloganeering is being, sheepishly, aped all over the country by all sorts of groups who don't think for themselves and/or who go along lest they get left off the funding gravy train.  This is the 'omics fad.  If you think I am being too cynical, just look at what's being said, done, published, and claimed.

These are, to me, deep flaws in the way the GWAS and other 'omics industries, very well-heeled, are operating these days, to pick the public's pocket (pharma may, slowly, be awakening-- Lancet editorial, "UK life science research: time to burst the biomedical bubble," Lancet 392:187, 2018).  But scientists need jobs and salaries, and if we put people in a position where they have to sing in this way for their supper, what else can you expect of them?

Unfortunately, there are much more serious problems with the science, and they have to do with the point-cause thinking on which all of this is based.

Even a point-cause must act through some process
By far most of the traits, disease or otherwise, that are being GWAS'ed and 'omicked these days, at substantial public expense, are treated as if the mapped 'causes' are point causes.  If there are n causes, and a person carries an unlucky subset m of them out of many possible sets, one adds 'em up and predicts that that person will have the target trait.  And there is much that is ignored, assumed, or wishfully hidden in this 'will'.  It is not clear how many authors treat it, tacitly, as a probability vs a certainty, because no two people in a sample have the same genotype and all we know is that they are 'affected' or 'unaffected'.

The genomics industry promises, essentially, that from conception onward your DNA sequence will predict your diseases, even if only in the form of some 'risk'; the latter is usually a probability and, despite the guise of 'precision', it can, of course, be adjusted as we learn more.  For example, it must be adjusted for age, and usually for other variables.  Thus, we need ever larger, more numerous, and longer-lasting samples.  This alone should steer people away from being preyed on by profiteering DNA-testing companies.  But that snipe aside, what does this risk or 'probability' actually mean?

Among other things, those candid enough to admit it know that environmental and lifestyle factors have a role, interacting with the genotype if not, indeed, usually overwhelming it.  This means, for example, that the genotype confers only some, often modest, risk probability, the actual risk being much more affected by lifestyle factors, most of which are not measured, not measured with accuracy, or not even yet identified.  And usually there is some aspect that relates to age, or some assumption about what 'lifetime' risk means.  Whose lifetime?

Aspects of such a 'probability'
There are interesting, longstanding issues about these probabilities, even if we assume they have some kind of meaning.  Why do so many important diseases, like cancers, only arise at some advanced age?  How can a genomic 'risk' be so delayed and so different among people?  Why do mice, with genotypes very similar to ours (which is why we do experiments on them to learn about human disease), live only about 3 years while we live to our 70s and beyond?

Richard Peto raised some of these questions many decades ago.  But they were never really addressed, even in an era when NIH et al. were spending much money on 'aging' research, including studies of lifespan.  There were generic theories suggesting, on evolutionary grounds, why some diseases were deferred to later ages (so-called antagonistic pleiotropy), but nobody tried seriously to explain why that was so from a molecular/genetic point of view.  Why do mice live only about 3 years, anyway?  And so on.

These are old questions and very deep ones but they have not been answered and, generally, are conveniently forgotten--because, one might argue, they are inconvenient.

If a GWAS score increases the risk of a disease that has a long-delayed onset pattern, often striking late in life and highly variable among individuals or over time, what sort of 'cause' is that genotype?  What is it that takes decades for the genes to affect the person?  There are a number of plausible answers, but they get very little attention, at least in part because that stands in the way of the vested interests of entrenched, too-big-to-kill, faddish Big Data 'research' that demands instant promises to the public it is trephining for support.  If the major reason is lifestyle factors, then the very delayed onset should be taken as persuasive evidence that the genotype is, in fact, by itself not a very powerful predictor.

Why would the additive effects of some combination of GWAS hits lead to disease risk?  That is, in our complex nature, why would each gene's effects be independent of every other contributor's?  In fact, mapping studies usually show evidence that other things, such as interactions, are important--but these are at present almost impossibly complex to understand.

Does each combination of genome-wide variants have a separate age-onset pattern, and if not, why not?  And if so, how does the age effect work (especially if not due to person-years of exposure to the truly determining factors of lifestyle)?  If such factors are at play, how can we really know, since we never see the same genotype twice? How can we assume that the time-relationship with each suspect genetic variant will be similar among samples or in the future?  Is the disease due to post-natal somatic mutation, in which case why make predictions based on the purported constitutive genotypes of GWAS samples?

Obviously, if long-delayed onset patterns are due not to genetics but to lifestyle exposures interacting with genotypes, then perhaps lifestyle exposures should be the health-related target, not exotic genomic interventions.  Of course, the value of genome-based prediction clearly depends on environmental/lifestyle exposures, and the future of these exposures is obviously unknowable (as we clearly do know from seeing how unpredictable past exposures have affected today's disease patterns).

The point here is that our reliance on genotypes is a very convenient way of keeping busy and bringing in the salaries, but not of facing up to the much more challenging issues that the easy approach (running lots of data through DNA sequencers) can't address.  I did not invent these points, and it is hard to believe that at least the more capable and less me-too scientists don't clearly know them, if quietly.  Indeed, I know this from direct experience.  Yes, scientists are fallible and vain; we're only human.  But of all human endeavors, science should be based on honesty, because we have to rely on trust in each other's work.

The scientific problems are profound and not easily solved, and not soluble in a hurry.  But much of the problem comes from the funding and careerist system that shackles us.  This is the deeper explanation in many ways.  The  paint on the House of Science is the science itself, but it is the House that supports that paint that is the real problem.

A civically responsible science community, and its governmental supporters, should be freed from the iron chains of depending on relentless Big Data for their survival, and should start thinking, seriously, about the questions that their very efforts over the past 20 years, on trait after trait, in population after population, and yes, with Big Data, have clearly revealed.

Monday, October 8, 2018

Evolution, to Engels--and a kind of lesson for us all?

We tend to think of Friedrich Engels as Karl Marx's pal, co-author, supporter--and financial benefactor.  That's all true.  But he was also perhaps a better synthesizer of ideas, and certainly a more approachable author.  A core aspect of their economic idea was that, through historical processes, societies evolve, from the simple states of our early human forebears ultimately to come to rest in communism.  I am no expert, but I think that since there would then (they thought) no longer be opposition or competition, history would, so to speak, come to an end.  At least the uninterrupted history of unfairness, conflict, strife, and governments that oppressed their people.  A good discussion of Engels' life can be heard in a recent broadcast/podcast on BBC Radio.

That Beeb program went over many things about Engels that are familiar to anthropologists, among others.  But it ended by referring to a work I'd not known of, a partly unfinished book on science called The Dialectics of Nature, which is available online or as a pdf.  The latter has an Introduction by JBS Haldane, one of the early 20th century's founding evolutionary geneticists, and a political leftist.

Engels. (One of many versions on the web)

Engels discusses the various sciences as they existed at the time (1883).  Haldane points out some errors that were known by his (Haldane's) time, but Engels' book is a surprisingly deep, broad review of the science of his day.  I do not know how Engels knew so much science, but apparently he did.

Although Engels never completed it, the book was written only about 25 years after Darwin's Origin of Species, which to Engels was highly relevant to his views on society.  But he went much further!  He viewed essentially everything, not just human society, as an evolving phenomenon.  Though with various errors based on what was known at the time, he recognized astronomical change, geological evolution, and biological evolution as manifestations of the fundamental idea that things cosmic were not Created and thereafter static, as prevailing biblically derived views generally held, but had beginnings, and then changed.  Engels applied his ideas to inanimate physical phenomena as they were then understood, as well as to life itself.  In essence, his view was that everything is about change, with human society as just another instance.

Engels was looking for what we might call universal 'laws', in this case concerning how systems change.  This would be a major challenge, by science, to the theologically based idea that, once Created, worldly things were mainly constant.  Engels noted that the classical Greeks had had a more 'modern' and correct view of the dynamics of existence than western Europe had developed under the reign of the Church.

Engels' book shows how grand thinking had led to, or could be made consistent with, the social thinking by which Marx and Engels could believe that sociocultural evolution was similarly non-static.  If so, they claimed to see how societal dynamics would lead to future states in which the rather cruel, relatively primitive nature of the nation states of his time would evolve into a fairer, more egalitarian kind of society.  But Dialectics of Nature shows that Engels was thinking very broadly and 'scientifically', in the sense of trying to account for things not just in terms of opinions or wishes, but of natural forces and the resulting dynamics of change.  He wasn't the only one in his time who thought that the idea of an evolutionary process enabled one to predict its outcome--as seemed to be possible in physics and chemistry.

I am no Engels scholar, and I had no idea he was so knowledgeable about science as it stood in his time, nor that the idea of evolutionary change that he and Marx applied to society was, in a sense, based on their finding, in their view, of similar kinds of change in the physical cosmos.  This, in a sense, conveniently made the extension of the theory to society seem quite logical, or perhaps even obvious, and, as noted above, many were speculating in similar ways.  Marx and Engels scholars must be aware of this, but when I was exposed to these theories as an anthropology graduate student decades ago, I did not know of this connection between social and physical dynamics and evolution.

These alleged connections or similarities do not make the Marxist conclusions 'true', in the sense of scientific truth.  The idea that geology and species evolve may seem similar to the idea that societal structures evolve.  But just because two areas have some sort of similarity, or change over time and space, does not mean they have the same causes.  Human culture involves the physical aspects of a society's environment, but culture is largely or mainly about human interactions, beliefs, kinship, and so on.  There is no necessary physically causal or deep connection between that and species evolution or the growth and erosion of mountain ranges.  A planetary orbit, a hula hoop, and an orange are all more or less 'round', but that does not establish connections between them.

At the same time, Engels worked at the height, one might say, of the idea that there were universal 'laws of Nature'.  Darwin informally likened evolution to planetary motion, with law-like properties, and in some of his writing (e.g., about barnacles) he seems to have believed in a kind of selective inevitability--some species being, essentially, on the way to a terminal end found in related species (terminal, at least, as Darwin saw them in his time).  This may not be as benighted as it may seem.  Biologists still debate the question of what would happen if you could 'rewind the tape' of evolution, and start over.  Some have argued that you'd get the same result.  Others vigorously oppose this sort of belief in predictable destiny.

Given the ambience of science in the 19th century, and the legacy of the 'Enlightenment' period in Europe only a century or two before, it is not surprising that Engels, wanting society also to be constrained by 'laws' or forces, and hence to be predictable if not subject to inevitable causal effects, would see parallels in the physical world.  Many others in that general time period in Europe had similar law-like ideas about societies.  It is, at the very least, interesting that Engels tried to make his social ideas as reflective of natural laws as the orbits of planets are.

What about us, today?
It is easy to look back and see what was 'in the air' in some past time, and how it influenced people, even across a spectrum of interest areas.  In this case, evolutionary concepts spanned the physical, biological, and social sciences.  We can see how very clever, insightful people were influenced by the ambient ideas.

So it's easy to look back and discern common themes, about which each person invoking them thought he was having specific, original insights.  But that's us looking back at them.  What about us in our own time?  How much might we, today, be affected by prevailing views--in scientific or societal affairs--that are 'in the air' but may not be as widely applicable as some argue they are?  How many of our prevailing views, which we of course think of as modern and better than the more primitive ones of the past, are similarly just part of the ambience of our times, and will one day be met with patronizing smiles at our naiveté?  Does going with the flow, so to speak, of current tides make us see more deeply than our forebears--and how much is it just that we see things differently?

How can we know?

Saturday, October 6, 2018

And yet it moves....our GWAScopes and Galileo's lesson on reality

In 1633, Galileo Galilei was forced to recant, before the Inquisition, his ideas about the movement of the Earth, or else face the most awful penalty.  As I understand the story, he did recant....but after leaving the proceedings, he stomped his foot on the ground and declared "And yet it moves!"  For various reasons, usually reflecting their own selfish vested interests, the powers that be in human society frequently stifle unwelcome truths, truths that would threaten their privileged well-being.  It was nothing new in Galileo's time--and it's still prevalent today.


Galileo: see Wikipedia "And yet it moves"
All human endeavors are in some ways captives of current modes of thinking--world-views, beliefs, power and economic structures, levels of knowledge, and explanatory frameworks.  Religions and social systems often, or perhaps typically, constrain thinking.  They provide comforting answers and explanations, and people feel threatened by those not adhering, those not like us in their views.  The rejection of heresy applies far beyond formal religion.  Dissenters or non-believers are part of 'them' rather than 'us', a potential threat, and it is thus common if not natural to distrust, exclude, or even persecute them.

At the same time, the world is as the world really is, especially when it comes to physical Nature.  And that is the subject of science and scientific knowledge.  We are always limited by current knowledge, of course, and history has shown how deeply that can depend on technology, as Galileo's experience with the telescope exemplifies.

When you look through a telescope . . . . 
In Galileo's time, it was generally thought--or perhaps 'believed' is a better word--that the cosmos was God's creation as known by biblical authority.  It was created in the proverbial Genesis way, and the earth--with us humans on it--was the special center of that creation.  The crystal spheres bearing the stars and planets circled around us and ennobled us with their divine light.  In the west, at least, this was not just the view; it was what had (with few exceptions) seemed right since the ancients.

But knowledge is often, if not always, limited by our senses, and they in turn are limited by our sensory technology.  Here, the classic example is the invention of the telescope and, eventually, what that cranky thinker Galileo saw through it.  Before his time, we had only our naked eyes to see the sun move, and the stars seemed quite plausibly to be crystal spheres bearing twinkles of light, rotating around us.

If you don't know the story, Wikipedia or many other sources can be consulted. But it was dramatic!  Galileo's experience taught science a revolutionary lesson about reality vs myth and, very directly, about the importance of technology in our understanding of the world we live in.

The lesson from Galileo was that when you look through a telescope you are supposed to change your mind about what is out there in Nature.  The telescope lets you see what's really there--even if it's not what you wanted to see, or thought you'd see, or would be most convenient for you to see.


Galileo's telescope (imagined).  source: news.nationalgeographic.com
From Mendel's eyes to ours
Ever since antiquity, plant and animal breeders have known empirically about inheritance, that is, about the physical similarities between parents and offspring.  Choose parents with the most desirable traits, and their offspring will have those traits, at least, so to speak, on average.  But how does that work?

Mendel heard lectures in Vienna that gave him some notion of the particulate nature of matter.  When, in trying to improve agricultural yields, he noticed discrete differences, he decided to test their nature in pea plants, which he knew well and which were manageable subjects for experiments to understand the Molecular Laws of Life (my phrase, not his).

Analogies are never perfect, but we might say that Mendel's picking discrete, manageable traits was like pre-Newtonians looking at stars but not at what controlled their motion.  Mendel got an idea of how parents and offspring could resemble each other in distinct traits.  Just as the telescope was the instrument that allowed Galileo to see the cosmos better, and to do more observing than guessing, geneticists got their Galilean equivalent in genomewide mapping (GWAS), which allows us to do less guessing about inheritance and to see it better.  We got our GWAScope!

But what have we done with our new toy?   We have been mesmerized by gene-gazing.  Like Galileo's contemporaries who, finally accepting that what he saw really was there and not just an artifact of the new instrument, gazed through their telescopes and listed off this and that finding, we are on a grand scale just enumerating, enumerating, and enumerating.  We even boast about it.  We build our careers on it.

That me-too effort is neither surprising nor unprecedented.  But it has also become what Kuhn called 'normal science'.  It is butting our heads against a wall.  It is doing more and more of the same, without realizing that what we see is what's there, but we're not explaining it.  From early in the 20th century we had quantitative genetics theory--the theory that agricultural breeders have used in formal ways for a century, making traditional breeding, which had been around since the discovery of agriculture, more formalized and empirically rigorous.  But we didn't have the direct genetic 'proof' that the theory was correct.  Now we do, and we have it in spades.

We are spinning our wheels and spending wealth on simple gene-gazing.  It's time, it's high time, for some new insight to take us beyond what our GWAScopes can see, digesting and understanding what our gene-gazing has clearly shown.

Unfortunately, at present we have an 'omics Establishment that is as entrenched, for reasons we've often discussed here on MT, as the Church's hold on explanations of Truth was in Galileo's time.  It is now time for us to go beyond gene-gazing.  GWAScopes have given us the sight--but who will have the insight to lead the way?