Tuesday, October 29, 2013

This just in: genes 'for' disease might be an outdated concept!

By Anne Buchanan and Ken Weiss

Science writer David Dobbs has a nice piece over at Slate this week, wrapping up the American Society of Human Genetics meetings that just took place in Boston.  Yes, he writes, the field of human genetics can report successes finding genes for rare diseases, and sequencing genomes has gotten fast and (relatively) cheap.  But the overall story is one of "struggle and confusion."  Reports of genes for [fill-in-the-blank] can't be replicated, some sequencing methods are missing causal mutations, and indeed, "[t]here's a real possibility that 'the majority of cancer predisposition genes in databases are wrong.'"  This is a time of growing humility, Dobbs writes, as geneticists recognize that the promise of the human genome as portrayed by Francis Collins, Craig Venter, and others in 2000 has not panned out.

At the same time, a new paper in Trends in Genetics ("A century after Fisher: time for a new paradigm in quantitative genetics", Nelson, Pettersson, and Carlborg, 23 Oct 2013), tweeted and retweeted on Oct 28, argues that, well, biology is complex:
Despite easy access to commercial personal genetics services, our knowledge of the genetic architecture of common diseases is still very limited and has not yet fulfilled the promise of accurately predicting most people at risk. This is partly because of the complexity of the mapping relationship between genotype and phenotype that is a consequence of epistasis (gene-gene interaction) and other phenomena such as gene-environment interaction and locus heterogeneity. Unfortunately, these aspects of genetic architecture have not been addressed in most of the genetic association studies that provide the knowledge base for interpreting large-scale genetic association results. 
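
To make the epistasis point concrete, here's a toy simulation of our own (not from the paper, and with made-up penetrances and frequencies): two loci jointly determine disease risk, but because the effect is purely interactive, neither locus shows any marginal effect on its own -- and the marginal effect is exactly what a standard one-variant-at-a-time association scan looks for.

```python
# Toy illustration (ours, with arbitrary numbers): a purely epistatic trait.
# Risk is elevated only when carrier status differs between two loci, so
# each locus considered on its own shows no marginal effect.
import numpy as np

rng = np.random.default_rng(1)
n = 200_000

a = rng.integers(0, 2, n)   # carrier status at locus A (1 = carrier)
b = rng.integers(0, 2, n)   # carrier status at locus B

risk = np.where(a != b, 0.20, 0.05)   # interaction-only penetrance
affected = rng.random(n) < risk

print("risk in A-carriers vs non-carriers: %.3f vs %.3f"
      % (affected[a == 1].mean(), affected[a == 0].mean()))
print("risk in B-carriers vs non-carriers: %.3f vs %.3f"
      % (affected[b == 1].mean(), affected[b == 0].mean()))
print("risk when A and B differ vs match:  %.3f vs %.3f"
      % (affected[a != b].mean(), affected[a == b].mean()))
```

The single-locus comparisons come out flat (about 0.125 vs 0.125) while the joint comparison is 0.20 vs 0.05, so a one-variant-at-a-time scan would walk right past a real, strong genetic effect.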
This is not new news!
But, we ask, why is any of this news??  In response to Dobbs's tweets on Oct 28 about his Slate piece, Jason Moore reminded Dobbs about a paper he and Scott Williams published in 2009 in which they argued, "based on the emerging data and analyses, that elucidating the genetic architecture of breast cancer and comparable diseases must focus on underlying complexity." Another paper about epistasis.

It's perhaps self-serving of us to point this out, but there have been plenty more, and much earlier, warnings about complexity.  And lest this seem like mere vanity, those publications themselves noted that the basic theoretical understanding, and the data to support it, had been known for close on a century.

For example, in the days when sequencing a single gene was still news, we were involved in one of the first projects to document variation, the extent of which was surprising at the time -- but shouldn't be any longer.  That was a cardiovascular disease project and here's just one of the papers that resulted, documenting variation in the human lipoprotein lipase (LPL) gene.  This was one of the first publications to alert the human genetics community that the genetics of disease was going to be more complex than people were expecting.  Published in Nature Genetics, the news was not hidden.  

In 2000, Weiss and Terwilliger argued in a commentary, again in Nature Genetics ("How many diseases does it take to map a gene with SNPs?"), that the common disease/common variant model then current was based on faulty understanding of population genetics and biology. It was, they said,
...fuelled by a faith that the genetic determinants of complex traits are tractable, and that knowledge of genetic variation will materially improve the diagnosis, treatment or prevention of a substantial fraction of cases of the diseases that constitute the major public health burden of industrialized nations. Much of the enthusiasm is based on the hope that the marginal effects of common allelic variants account for a substantial proportion of the population risk for such diseases in a usefully predictive way. A main area of effort has been to develop better molecular and statistical technologies, often evaluated by the question: how many SNPs (or other markers) do we need to map genes for complex diseases? We think the question is inappropriately posed, as the problem may be one primarily of biology rather than technology.
Note that this paper was published 13 years ago.  Terwilliger has said numerous times that it could be published again today, without changing a word.  In fact, we'll post the conclusion from the paper here, as it's still entirely apropos:
Resistance to genetic reductionism is not new, and we know that, by expressing these views (some might describe them as heresies), we risk being seen as stereotypic nay-sayers. However, ours is not an argument against genetics, but for a revised genetics that interfaces more intimately with biology. Biological traits have evolved by noise-tolerant evolutionary mechanisms, and a trait that doesn't manifest until long after the reproductive lifespan of most humans throughout history is unlikely to be genetic in the traditional, deterministic sense of the term. Most genetic studies that focus on humans are designed, in effect, to mimic Mendel's choice of experimental system, with only two or three frequent states with strongly different effects. That certainly enables us to characterize some of the high-penetrance tail of distribution of the allelic effects, but as noted above these may usually be rather rare. But inflated claims based on this approach can divert attention from the critical issue of how to deal with complexity on its own terms, and fuel false hopes for simple answers to complex questions. The problems faced in treating complex diseases as if they were Mendel's peas show, without invoking the term in its faddish sense, that 'complexity' is a subject that needs its own operating framework, a new twenty-first rather than nineteenth or even twentieth century genetics.
Here's a figure from that paper, showing the logic of causation and the gene-mapping strategy.

From Weiss and Terwilliger, Nat Genet, 2000
Schematic model of trait aetiology. The phenotype under study, Ph, is influenced by diverse genetic, environmental and cultural factors (with interactions indicated in simplified form). Genetic factors may include many loci of small or large effect, GPi, and polygenic background. Marker genotypes, Gx, are near to (and hopefully correlated with) genetic factor, Gp, that affects the phenotype. Genetic epidemiology tries to correlate Gx with Ph to localize Gp. Above the diagram, the horizontal lines represent different copies of a chromosome; vertical hash marks show marker loci in and around the gene, Gp, affecting the trait. The red Pi are the chromosomal locations of aetiologically relevant variants, relative to Ph.
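
To spell out the caption's logic in miniature, here's an illustrative sketch of our own (not code from the paper; the allele frequency, effect size, and degree of linkage disequilibrium are arbitrary): the causal variant Gp nudges the phenotype, the marker Gx is only partially correlated with Gp, and so the association signal at the marker is a diluted echo of the causal one.

```python
# Illustrative sketch of the Gx -> Gp -> Ph logic in the figure (our toy
# example). The causal variant Gp affects the phenotype; the marker Gx is
# only partially correlated with Gp through linkage disequilibrium (LD),
# so the association seen at the marker is attenuated.
import numpy as np

rng = np.random.default_rng(7)
n = 50_000
p_causal = 0.3   # causal allele frequency (illustrative)
ld = 0.8         # probability a marker allele "tracks" the causal allele

# allele dosages (0, 1, 2), built haplotype by haplotype
hap_gp = rng.random((n, 2)) < p_causal
tracks = rng.random((n, 2)) < ld
hap_gx = np.where(tracks, hap_gp, rng.random((n, 2)) < p_causal)
gp = hap_gp.sum(axis=1)
gx = hap_gx.sum(axis=1)

# phenotype: small additive effect of the causal locus plus everything else
ph = 0.3 * gp + rng.normal(0, 1, n)

print("corr(Gp, Ph) =", round(np.corrcoef(gp, ph)[0, 1], 3))
print("corr(Gx, Ph) =", round(np.corrcoef(gx, ph)[0, 1], 3))
print("corr(Gx, Gp) =", round(np.corrcoef(gx, gp)[0, 1], 3))
```

Lowering the ld parameter shrinks corr(Gx, Ph) toward zero even though the causal effect is unchanged: marker-based mapping can only ever see causal variation through the filter of linkage disequilibrium.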
That commentary wasn't hidden either, published as it was in Nature Genetics.  And a paper by Weiss and Clark a few years later ("Linkage disequilibrium and the mapping of complex human traits", Trends in Genetics, 1 Jan 2002) argued that the then-current idea of common variants/common disease was not based on "sound population genetic principles".  They showed that there were good, long-known reasons why the idea was simplistic and wouldn't pan out as a way to find genes for many common diseases, as indeed it has not.

Indeed, in Ken's own book Genetic Variation and Human Disease, published by Cambridge University Press in 1993, the situation was also rather clearly laid out, even at the beginning of the mapping era.  Of course, we weren't the only people writing about this, but the important point is that the issue is not new, and the reasons for expecting things to be more or less as we have found them were stated for the right reasons, reasons that go back to the early 1900s and are often attributed to RA Fisher's famous 1918 paper on polygenic inheritance, which in many ways laid the foundation for the modern evolutionary synthesis uniting Mendelian inheritance and Darwinian evolution.  Others, less routinely credited, were just as clearly pointing out the multilocus causation of complex traits.
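
For readers who haven't met the 1918 paper, the gist is simple: many Mendelian loci, each of small additive effect, plus environmental noise, add up to a continuous, roughly normal trait in which no single locus matters much. Here's a minimal sketch of our own (the locus count, allele frequencies, effect sizes, and heritability are all arbitrary):

```python
# Sketch of Fisher's additive polygenic model (our illustration; all
# parameters below are arbitrary, chosen only to make the point).
import numpy as np

rng = np.random.default_rng(42)
n_people, n_loci = 20_000, 200

freqs = rng.uniform(0.05, 0.5, n_loci)      # allele frequencies
effects = rng.normal(0, 0.05, n_loci)       # small per-allele effects
genotypes = rng.binomial(2, freqs, (n_people, n_loci))  # 0/1/2 dosages

genetic_value = genotypes @ effects
# environmental noise matched to the genetic s.d. (~50% "heritability", arbitrary)
environment = rng.normal(0, genetic_value.std(), n_people)
trait = genetic_value + environment

# the trait is continuous and close to Gaussian...
print("trait mean/sd:", round(trait.mean(), 2), round(trait.std(), 2))

# ...while the single locus with the largest effect explains only a sliver
top = np.argmax(np.abs(effects))
r2 = np.corrcoef(genotypes[:, top], trait)[0, 1] ** 2
print("variance explained by the largest-effect locus: %.4f" % r2)
```

The trait comes out smoothly varying and roughly bell-shaped, and even the largest single locus explains only a small fraction of the variance, which is Fisher's point: Mendelian inheritance at many loci is fully compatible with continuous, 'non-Mendelian'-looking traits.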

We've published in print and on this blog a number of commentaries on complexity, the perils of genetic reductionism, and the need to move beyond Mendel.  So why is it taking so long for the profession to learn, or to recognize, the situation?

Why the deaf ear?
The answer is complex, and we've described the influence of Mendelian thinking (single-gene causation, among other things), the hope for simple, easy success with dramatic effects, the way this played into the grant system, the lobbying for funding, and (importantly) the evolving tractability of experimental approaches.  Humans, even purportedly objective scientists, hear what they are attuned to hear, and reject, dismiss, or mention only in passing things that are inconvenient.

If you do sequencing or genotyping, know how to collect piles of genetic data and analyze them, get your grants by doing that (and by directly or subtly promising cures), have a staff whose jobs you want to protect (including your own medical school salary), and the media reward bold announcements, or you're an NIH project officer with a grant portfolio to protect, then you won't hear the music of the spheres.  In many ways, the tail (state-of-the-art technology) is wagging the dog (questions and interpretation).

Also, and to be at least somewhat fair, you will retrospectively cite the successes that clearly do arise from this approach, even if they are often only partial.  You will justify business as usual with the argument that serendipity has led to many major scientific advances, so that if we keep pouring resources into this particular approach, at least until the next generation of fancy equipment is available, some lucky discoveries will be made (you don't point out that if we instead focused resources on what really is 'genetic', we would also make serendipitous discoveries).

So you manufacture exciting! news.  We have this or that mystery!  It's the hidden heritability, for example, and the solution is that it's all copy number variation!, or rare not common variants!, or will be found in whole genome sequencing!, or it's epigenomics!, or proteomics, or the microbiome!, or you-name-it. 

More of the same can be predicted
Science usually works in this way, especially in our industrial-scale, business-model world. The hunger for the secret key is quite understandable. No one will, or perhaps no one can, slow down and think first, and lobby only afterwards.
 
As is usually the case in human society, our profession will, like a school of fish, dart toward some new set of claims when something comes along and that's where the funding will lie.  Meanwhile, we have a long list of known causes that could yield major public health benefits if they were pursued as aggressively as gene mapping is (e.g., the fact that most diseases being mapped are more environmentally than genomically caused, or the fact that we have hundreds of single-locus traits for which real concentrated effort might lead to cures or prevention).

But can there be something better? 

Dobbs writes, "If the field is currently a bit lost in the fog, whoever clears the air could become to Watson and Crick as Watson and Crick were to Darwin."  But to a great extent, the air has been cleared!  While genes 'for' most common complex diseases aren't being found, and we still can't predict who'll get what sometime down the road, we've learned a tremendous amount in the last several decades -- some, mostly rare, mostly pediatric diseases are caused by single genes, but most traits are complex, polygenic, caused by no one pathway, and most likely involve gene by environment causation.

Is rethinking just a word, in an area that really will simply yield to an increase in the scale of operations?  Or is some truly new conceptualization in order, and if it is, what might it be?

We'll comment on this and make some suggestions next time.

4 comments:

  1. It is really unfortunate that you guys have been completely sidelined by all these #ENCODE 'scientists'. Your 2000 paper was full of wisdom and I cannot believe something like that had been hiding in plain sight over 13 years.

    http://www.homolog.us/blogs/blog/2013/07/30/battle-over-gwas-ken-weiss-edition/

  2. We only mentioned a few people whose work has been pointing out the complexity issue, and realize that one person we forgot to mention is Charlie Sing, at Michigan, who has been making points of this sort for many, many years (and was a co-author on the first Nature Genetics paper we mentioned in this post). Unfortunately, we can't remember or list everyone who has felt similarly, but this should not give the impression that it is anything close to the majority or even the (right or wrong) consensus.

  3. I would love to see an MT post on the complexities of tumors, and the usefulness of running every possible -omics assay on them in an effort to predict prognosis and treatment choices. This seems to be the next "big thing" in genetics and -omics, and it also seems to ignore what looks like almost impossible complexity, given the heterogeneity of tumors themselves, the fact that they change constantly, and the fact that they often involve metastatic lesions (which are often the lethal 'agent') that develop their own unique -omics patterns. Add to this the fact that even when patients respond to antineoplastic agents it is usually only temporary, as the tumor(s) and metastatic lesions develop resistance.

    Replies
    1. I would say that amidst the hyperbole, and there's a lot of it, there do seem to be some genes that, when mutated in a tumor, usefully affect the effectiveness of specific therapies. I'm not up to date, but even if it's not perfect it could be a life-saver.

      However, more generally what you say must be true, in principle. The mutability of lineages within a given tumor and its descendants must lead to resistance. I think this is very well established and, as I understand it, is one of the reasons for quick (and perhaps heavy) doses of mixed chemotherapy agents given at the same time, on the grounds that it's unlikely a given cancer cell could evolve double resistance quickly enough (a rough back-of-the-envelope version of that argument is sketched at the end of this reply).

      If mutation-specific targeting becomes possible, perhaps targeting the mutations that generated the primary transformed cell will hit the whole descendant tumor (and its metastases). But even then, one would expect occasional resistant sub-lineages. At least they would have to grow back to harmful size from a small starting set, giving a chance to switch to a different therapy.

      Another advantage, if all sites can be sampled when choosing which agents to use, is that any evolved advantage or resistance dies with the tumor or patient. That's unlike antibiotic or herbicide resistance strategies, because weeds, pests, and infectious organisms get 'out there' to proliferate.
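
      As promised above, here is a back-of-the-envelope sketch of the double-resistance point (the mutation rate and tumor size are illustrative guesses, not data from the discussion):

```python
# Back-of-the-envelope sketch (illustrative numbers only): why simultaneous
# resistance to two drugs is so much rarer than resistance to one.
mu = 1e-7          # assumed resistance-mutation probability per cell, per drug
tumor_cells = 1e9  # rough cell count of a clinically detectable tumor

expected_single = tumor_cells * mu        # cells resistant to one given drug
expected_double = tumor_cells * mu * mu   # cells resistant to both at once

print(f"expected cells resistant to drug A alone:     {expected_single:.0f}")
print(f"expected cells resistant to A and B together: {expected_double:.0e}")
```

      With these toy numbers, cells resistant to any one drug are expected to be present from the start, but cells simultaneously resistant to both are vanishingly rare, which is the usual rationale for hitting the tumor with combinations at once.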
