
Tuesday, April 6, 2010

The efficacy of predicting gene prediction

Francis Collins has a track record of predicting tremendous advances in health from the Human Genome Project -- what he and many others said would be a revolution. Writing in the April 1 issue of Nature, which celebrates the 10th anniversary of the human genome sequence, he asks whether the revolution that he and others foresaw more than a decade ago has now arrived. He writes that he never deletes a PowerPoint file, so he knows exactly what he predicted when he stood on the podium with President Clinton in 2000 and announced the completion of the draft human genome sequence.
Predictive genetic tests will be available for a dozen conditions
Interventions to reduce risk will be available for several of these
Many primary-care providers will begin to practise genetic medicine
Preimplantation genetic diagnosis will be widely available, and its limits will be fiercely debated
A ban on genetic discrimination will be in place in the United States
Access to genetic medicine will remain inequitable, especially in the developing world
He says that it's fair to say that these predictions have come true. Well, yes, one can always declare victory after the fact. He also adds, "The promise of a revolution in human health remains quite real."  


But let's look at his predictions one at a time. There are indeed dozens of predictive genetic tests available -- but how accurate are they? Interventions to reduce the risk of, say, obesity or type 2 diabetes are available, yes, but they've been known for decades, if not centuries -- diet and exercise. What he means by genetic medicine isn't really clear, and preimplantation genetic diagnosis can't be credited to the HGP, but to technological advances, as well as to the discovery of genes for pediatric Mendelian diseases, which was already happening before the human genome was sequenced. A ban on genetic discrimination is in place (GINA), but how successful it will be is still an open question (and, as our friend, geneticist and tuba player Joe Terwilliger, says, the differential rates 18-year-old boys and girls pay for car insurance are genetic discrimination, aren't they?). And, well, predicting differential access to anything medical based on income disparities isn't exactly a challenge.
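To make the accuracy question concrete, it helps to see how little a typical small-effect risk variant moves an individual's absolute risk. Here is a back-of-the-envelope sketch in Python with invented numbers (a 10% baseline risk and an odds ratio of 1.3, in the range of typical GWAS-era findings); it illustrates the arithmetic, and is not a calculation from any actual test:

```python
# Hypothetical illustration: how much does a small-effect risk variant
# change absolute disease risk? All numbers are invented for this sketch.

baseline_risk = 0.10   # assumed risk for non-carriers
odds_ratio = 1.3       # assumed per-variant effect size

baseline_odds = baseline_risk / (1 - baseline_risk)   # ~0.111
carrier_odds = baseline_odds * odds_ratio              # ~0.144
carrier_risk = carrier_odds / (1 + carrier_odds)       # ~0.126

print(f"non-carrier risk: {baseline_risk:.1%}")   # 10.0%
print(f"carrier risk:     {carrier_risk:.1%}")    # 12.6%
```

On these assumptions, the 'predictive' test moves a person's risk from 10% to about 12.6% -- hardly the basis for a teachable moment, let alone a precisely prescribed drug regimen.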


Collins in fact made many other predictions (or promises) for the HGP, particularly in the late 1990s as the human genome was nearing completion. For example, in a lecture he delivered in 1999, he presented a scenario for the year 2010 in which a hypothetical young man, a smoker, is found to have high cholesterol. Here's a bit of that imagined scenario:
Aided by an interactive computer program that takes John's family history, his physician notes that there is a strong paternal history of myocardial infarction and that John's father died at the age of 48 years.
To obtain more precise information about his risks of contracting coronary artery disease and other illnesses in the future, John agrees to consider a battery of genetic tests that are available in 2010.  After working through an interactive computer program that explains the benefits and risks of such tests, John agrees (and signs informed consent) to undergo 15 genetic tests that provide risk information for illnesses for which preventive strategies are available.
John is pleased to learn that genetic testing does not always give bad news -- his risks of contracting prostate cancer and Alzheimer's disease are reduced, because he carries low-risk variants of the several genes known in 2010 to contribute to these illnesses. But John is sobered by the evidence of his increased risks of contracting coronary artery disease, colon cancer, and lung cancer. Confronted with the reality of his own genetic data, he arrives at that crucial "teachable moment" when a lifelong change in health-related behavior, focused on reducing specific risks, is possible. And there is much to offer. By 2010, the field of pharmacogenomics has blossomed, and a prophylactic drug regimen based on the knowledge of John's personal genetic data can be precisely prescribed to reduce his cholesterol level and the risk of coronary artery disease to normal levels...
Understandably, Collins was a great cheerleader for the genome project. But that is exactly the problem, and we regularly write about it: the life-as-lobbying worldview. Collins did say that he believed diseases would turn out to be caused by numerous genes with small effects, interacting with environmental factors, but he clearly believed those genes would be common and identifiable (hence the HapMap project, whose scenarios and promises did not materialize except by a lot of post hoc wriggling and redefinition, such as of 'common'). And he clearly believed that understanding genetic effects would be straightforward -- and that they would be counteractable, with some help from the pharmaceutical industry.


In fact, the 'several genes' he predicted would be found by 2010 to be responsible for Alzheimer's disease turn out to be hundreds of genes, each with many alleles, most with tiny effects. And although he did write that lifestyle changes were a component of prevention, and that John should quit smoking, he also clearly believed that designing prophylactic drugs based on what was learned from the human genome would be so easy that by 2010 we'd be able to prevent many genetic diseases pharmaceutically -- after having easily predicted them.


But in fairness, no one should be held strictly to their predictions (well, except for seers and grant seekers who promise too much), and mostly this hypothetical scenario is interesting for the insight it gives into Collins' beliefs about the importance of genes and the power of technology to counteract them -- a set of beliefs that still drives him as director of the NIH, which we have blogged about before. But at least as interesting to us is something we wrote about last week: the use of modern technology to tell us something we already know. In Collins' hypothetical scenario -- which was completely made up; remember, he could have imagined anything -- John learns from his genetic testing that he is at risk of heart disease and colon cancer. But his family history already told him that! This isn't exactly something that justifies the billions of dollars spent on the HGP.


As director of the National Institutes of Health and past director of the National Human Genome Research Institute, Collins has had tremendous influence over the direction of medical research funding for the last decade, and will continue to have it for the foreseeable future. Indeed, as we've said many times before, his faith-based commitment to improving our health through genetics and technology takes real money away from real problems that could have real solutions, given equal commitment to solving them.

Friday, March 19, 2010

Genetic engineering

We often criticize the excessive geneticization of diseases whose main cause is not genetic variation but lifestyle factors of various kinds. But some diseases do seem clearly to be genetic, with little environmental input and only one or a few known causal genes. In such cases, genetics is the right approach, and genetics of two kinds: to detect risk in genetic counseling for prospective parents who know a serious allele runs in their family, and to treat the disease once it has arisen.

A good example, described in last Sunday's New York Times, is epidermolysis bullosa. The disease appears to be due to a defect in a collagen gene whose product gives the skin its structural strength and integrity. The victims have skin described as being as delicate as butterfly wings, and as a result their lives are severely compromised.

EB seems to be a perfect target for gene therapy -- to replace the gene in the germline of the parents, or of fertilized eggs in vitro before implantation in the mother, or, therapeutically, to replace the deficient skin cells with cells competent to produce the right type of collagen. Such efforts are described in the article.

Human beings are very good at engineering, and if science is good at anything, it's technology. EB is one of many problems that seem to pose engineering rather than conceptual challenges. Science ought to work in such cases -- even if that doesn't mean it can happen overnight. This kind of challenge is where genetic investment should go, in our view.

For complex diseases that are mainly due to environmental or lifestyle factors, if those factors were ameliorated by social behavior (like better diet) and other measures (like the removal of toxins), then what remained of most diseases would be the truly genetic instances -- fortunately rare, and fortunately engineering challenges.

That doesn't mean it'll be easy. We may be good at engineering, with thousands of years of practice, but if there's one thing organisms have evolved over countless more millennia, it's the ability to detect and prevent the outside world from attacking their cells. So these will be battles waged mano a mano at the molecular scale.

Many techniques already exist to replace genes, to engineer vectors that carry genes into cells, or to make microorganisms (or culturable cells) produce a gene product. The best approach, perhaps, is to derive stem cells from the affected person, differentiate them into the needed tissue type, and then somehow introduce them into the affected tissue. There has been a lot of progress along these lines, but only time will tell whether this is the best approach. Whatever turns out to be the case, at least these are clear-cut problems for which technological solutions seem at least possible.

Tuesday, January 12, 2010

Knowledge is Power

At the dawn of the modern scientific era, in 1597, Francis Bacon, a founding empiricist, used the phrase 'knowledge is power'. To Bacon, "knowledge itself is power"; that is, knowledge of how the world works would enable whoever held it to extract resources and wield power over the world -- science would enable Empire.

This view of science has persisted. It was important in the early founding of the Royal Society and other prominent British scientific societies in the 17th and 18th centuries and beyond. The technology and even basic knowledge that was fostered did, indeed, help Britannia to rule the waves.

Basic science was the playground of the classic idle wealthy of the 1700s and surrounding years, while applied technology was developed by people who were not formal beneficiaries of 'education' as it was then practiced. In the US, major scientific investment, such as in large telescopes, was funded by private philanthropy -- by wealthy industrialists who could see the value of applied science.

We tend, perhaps, to romanticize the 17th, 18th and 19th centuries, the era of Newton, Darwin, and many others who advanced science in all areas -- geological, physical, chemical, and biological -- without doing so for personal or financial gain. But at the same time there was much activity in applied science and technology, and even in 1660, when the Royal Society was founded with government support, gain was one of the objectives.

An informative series about the history of the Royal Society and of other scientific activity in Britain aired the week of Jan 4 on BBC Radio 4, on the program In Our Time -- the four parts are now available online. Much of the discussion shows that the interleaving of government funding, geopolitics, and avarice was as important in driving science when the Royal Society was founded as it is now.

There can be no doubt about the importance of the systematic investigation of the ways of Nature in transforming society during the industrial revolution. The result was due to a mix of basic and applied science. The good was accompanied by the bad: daily life was made easier and healthier, but episodes of industrialized warfare made it more horrible. On the whole, science has allowed vastly more people to live, and to live longer, than ever before. But it has also allowed vastly more people to struggle in poverty. (The discovery of novocaine for use by dentists may alone justify the whole enterprise!)

The post-WWII era seemed to foster lots of basic science. But in the US, the National Science Foundation and other institutions poured money into science largely in response to fears that the Soviet Union, whose space program was far ahead of ours, might gain on us in world prominence. So there was a recurring pragmatic drive behind the support of science.

University dependence on research grants grew out of this drive. We think this has been awful for science, since careers come to depend on faculty generating money, and that leads to safe, short-term thinking, even if more funds mean more opportunity. The intellectually thin Reagan administration's demand that research translate into corporate opportunity was just a continuation of this materialistic element in the support of science.

In a way, we're lucky that basic, disinterested science actually got done, and lots of it at that! Human society probably can't be expected to put resources into something as abstract as basic science, with no promise or obvious path to better pencils, medicines, or bombs. So it's no wonder that universities, bureaucracies, and scientists alike hype their personal interests in terms of the marvels to be returned to the funders.

Such a system understandably leads to entrenched vested interests who ensure their own cut of the pie. We routinely write about these vested interests and the effect we believe they have on the progress of knowledge. But, as anthropologists, we have to acknowledge that the self-interest that is part of the package is not a surprise. After all, why should we be able to feed off the taxpaying public without at least promising Nirvana in exchange? Human culture is largely about systematized resource access and distribution, and this is how we happen to do that these days.

Under these conditions science may not be as efficient or effective as it might otherwise be. A few MegaResearchers will, naturally, acquire an inordinately large share of the pie. Much waste and trivia will result. The best possible science may not be done.

Nonetheless, it's clear that knowledge does progress. A century hence, our descendants will judge what our system produced that was of real value. The chaff in science, as in the arts, sports, or any other area of life, will be identifiable, and it will be the majority. But the kernels of grain will be recognized for their lasting value and impact.

BUT that doesn't mean we should resign ourselves to the way the system works -- to its greed, waste, hierarchies, and its numerous drones who use up resources generating, at best, incremental advances. That is part of life, but only through the pressure of criticism of its venality and foibles can the System be nudged toward a higher likelihood of real innovation and creativity in knowledge.

It's remarkable that blobs of protoplasm, evolved via molecules of DNA and the like from some primordial molecular soup, understand the universe that produced them as well as we actually do. And we will continue to build on what we know; empirical, method-driven activity is a powerful route to material gain. Embedded in inequity, vanity, venality, and other human foibles, we nonetheless manage to manipulate our world in remarkable ways.

The history of the Royal Society and of the other scientific societies that mirror the growth of society generally, as told in these BBC programs, is a fascinating one. But it doesn't change our belief that, in principle at least, we could make better use of our knowledge and abilities to steer our world toward less inequity, vanity, venality, and so on.

Tuesday, August 11, 2009

Will you buy your genome on a disk?

A story in the New York Times today describes a new DNA sequencing technology that will sequence a whole genome for under $50,000. This is a lot closer to the $1000 genome that researchers have long been waiting for (and promising) -- as a research tool, but more importantly as an invaluable tool for diagnosing and predicting disease risk. The grand 1-grand-for-1-genome idea is sure to become a reality sometime soon.

But even science writers who are outliers on the genetics hyperbole scale are now routinely aware of, and questioning, what we can gain from this information. Essentially, whole-genome sequences extend association studies: the search is for sequence variants that correspond statistically to variation in phenotypes like disease. Scaled-up GWAS and big biobanks will supply the samples on which such work is done.
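For readers who haven't seen the machinery, the statistical core of such a study is simple: for each variant, compare allele counts between cases and controls and ask whether the difference exceeds chance. Here is a minimal sketch in Python, with entirely simulated data (the sample sizes, allele frequencies, and threshold are all invented for illustration; real studies add quality control, covariates, and corrections for population structure):

```python
# Toy illustration of the per-variant test at the core of a GWAS.
# All data here are simulated, with NO true genetic signal.
import numpy as np
from scipy.stats import chi2_contingency

rng = np.random.default_rng(42)
n_variants = 1000
n_cases, n_controls = 500, 500

p_values = []
for _ in range(n_variants):
    freq = rng.uniform(0.1, 0.5)  # minor-allele frequency for this variant
    # Allele counts: each person carries 2 alleles
    case_alt = rng.binomial(2 * n_cases, freq)
    ctrl_alt = rng.binomial(2 * n_controls, freq)
    table = np.array([
        [case_alt, 2 * n_cases - case_alt],      # cases: alt vs ref alleles
        [ctrl_alt, 2 * n_controls - ctrl_alt],   # controls: alt vs ref alleles
    ])
    chi2, p, dof, expected = chi2_contingency(table)
    p_values.append(p)

print(sum(p < 0.05 for p in p_values), "of", n_variants, "pass p < 0.05")
```

Run on data with no genetic signal at all, roughly 5% of variants will still 'pass' at p < 0.05, which is why genome-wide studies must demand far stricter thresholds -- and one reason the hits that do survive so often explain so little.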

As regular readers of this blog know, we (and we're not alone) have been questioning the meaning of 'genes for' thinking for a long time--is this now percolating into the general consciousness even in the media?

Well, there are huge vested interests hiding under the bed. In spite of what looks to be an increasing acceptance of genetic complexity, adherents of 'genes for' thinking are still spending increasing time and money on genome-wide association studies (GWAS), looking for genes for their trait, and still claiming great success. DNA testing companies like deCODEme and 23andMe are still in business, claiming to be able to tell you your risk of disease, and people are still buying these services.

Those who are not so savvy but need to keep their careers on track, and who can do these kinds of studies (because they are largely canned and off-the-shelf nowadays), are sometimes perforce committed to this status quo. Others, for reasons that range from true belief in the prospects to fully aware budget protection, are pressing ahead. They need the funding to continue to flow, and they hope, or believe, that whole-genome sequences will save the day. Somehow. They don't know how. Pray for serendipity!

It is easy to criticize and harder to change course, especially with so much invested in equipment, equipment manufacture, bureaucratic portfolios, lab personnel, publications, reputations, and tenure. In this sense, we think science is forced to stay the course since we're only human. But that doesn't make it the best science, even if it's technologically leading edge and extremely sophisticated, which it is.

To be a bit more sympathetic: most people are rather conventional and not conceptually very innovative. In science, as in other areas of human endeavor, we want our ideas and even our dogmas; they give continuity to our lives and a sense that we understand things. Change comes hard, and new ideas even harder. Though we're all taught, and many teach, that the objective of a scientist is to try to prove his or her ideas wrong, that's near-total baloney! What is actually done is almost always contortion to prove that our ideas are right. That's how careers are built. Many journals won't even publish negative results. That's why, even in the face of negative results, as in this case, we persist. But that doesn't make it good science.

As for the promise that genomes will predict your life experience... we're not buying it.