Friday, December 7, 2012

Pas ce soir, chérie! A deposit crisis that doesn't involve the banks!

"Not tonight, chérie" may be heard ever more often in the bedrooms of France.  But there's a switch.  If a new hard-hitting BBC story is right, it's French men, not women, who are the half of the couple with the sexual issue.  Well, maybe it's not about desire, but it certainly affects the end result.  It seems that the sperm count of our Gallic colleagues is considerably lower than it was in the recent past. 
The sperm count of French men fell by a third between 1989 and 2005, a study suggests.
The semen of more than 26,600 French men was tested in the study, reported in the journal Human Reproduction.
The number of millions of spermatozoa per millilitre fell by 32.3%, a rate of about 1.9% a year. And the percentage of normally shaped sperm fell by 33.4%.
The average sperm count remained within the fertile range, but experts want to see more research into possible causes.
Why this is so is a mystery, assuming the study isn't flawed in some as yet unknown way.  Which it may well be -- apparently the method for counting sperm has changed over the years, yielding lower totals, so the decline could simply be a reflection of that fact.  Or not. 
Prof Richard Sharpe, from the University of Edinburgh, said: "Something in our modern lifestyle, diet or environment like chemical exposure, is causing this.
"We still do not know which are the most important factors, but perhaps the most likely is a combination, a double whammy of changes, such as a high-fat diet combined with increased environmental chemical exposures."
This is a deposit crisis that doesn't involve banks (except, perhaps, sperm banks).  But does it matter?  Probably not.  At least, we know of no evidence that fertility is lower.  Indeed,
Dr Allan Pacey, senior lecturer in andrology at the University of Sheffield, said: "The change in sperm concentration described, 73.6 to 49.9 million per millilitre [on average for a 35-year-old], is still well within the normal range and above the lower threshold of concern used by doctors which is suggestive of male infertility, 15 million per millilitre."
It's just that each sperm cell has a better chance of hitting the jackpot.
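For readers who like to check the arithmetic, here's a quick back-of-the-envelope sketch. The figures are the ones quoted in the BBC piece; the per-year rate assumes a simple linear decline over the 1989-2005 study period, so the study's own modelling (which reported 1.9% a year) may differ slightly.

```python
# Quick sanity check of the reported figures (BBC / Human Reproduction story).
# Assumes a simple linear decline; the study's own modelling may differ.
start, end = 73.6, 49.9          # million sperm per ml, average 35-year-old
years = 2005 - 1989              # span of the study

total_decline = (start - end) / start
print(f"total decline: {total_decline:.1%}")               # ~32%, matching the report
print(f"per year (linear): {total_decline / years:.1%}")   # roughly 2% a year
print(f"still above the 15 million/ml threshold: {end > 15}")
```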

Or, perhaps, this is some sort of strong natural selection based on some lifestyle factor that makes its way to netherland.  That could be natural selection without the agony of its normal cruelty to organisms.

Or, perhaps the  missing sperm are miners' canaries:  maybe they tell us that there's something in the water--or wine--that is responsible.  If so, there might some day soon be the ejaculation "Eureka!" as the risk factor is found.

Anyway, this is something hard to get one's hands around, but it could be more than a downer for French amour.

Quel dommage!

Thursday, December 6, 2012

Fat chance science can tell you whether lard is good for you

Saturated fat is deadly!
Epidemiologists, and nutritionists informed by epidemiologists, have been telling us for decades that animal fat causes heart attacks, and that therefore we should cut down on meat consumption. Fat -- saturated fat -- raises our cholesterol, which hardens our arteries, where blood clots then get stuck, block the artery and cause an attack. Everyone knows this, right?

Lard, solidified pig fat, has been considered among the worst offenders. (Check out this link for great pictures of lard, lard sculptures, old lard advertisements and more, to really get you in the mood.) But, if you are a carnivore, and you try to eat 'right,' you might be interested to know some are now saying that lard is good for you. This according to chefs, researchers and science reporters on the BBC Radio 4 Food Programme Nov 5 episode.

As the program reports, lard is thought of as "...a great big solid block of white fat" with an "appalling public image," it's a "decadent guilty treat" that we've "been taught to fear." The demonizing of lard has worked, at least in Britain where people ate on average 55 grams of the stuff per week in the 1970's, but only 5 grams now. But the program does its best to strip lard of its bad image and get you eating it again.

But oh so good
Borlengo lardo (flatbread with lardo); Wikipedia
Chefs rave about lard. Jeremy Lee, chef at a prime London restaurant called Quo Vadis and interviewed for the program, loves it both for its cooking qualities and its flavor. His Scottish grandmother used to fry pancakes in lard. He remembers them as beautifully crisp on the outside and fluffy in the middle, and he ate them loaded with butter. And, he loves lardo, lard cured with rosemary and other herbs. A "grilled piece of sourdough bread, hot, with very thin pieces of lardo on it," he says, "is one of the most delicious things ever."  It does sound great, doesn't it?

Lard has a much higher smoke point than other oils, so it can get much hotter before it starts to break down, which makes it useful for cooking. Asked whether lard tastes piggy at all, Lee says no, it's "clean and pure." He says it's good for browning meat, that potatoes fried in lard are great, as are pastries made with lard, and that because of its higher melting point it spreads flavor nicely around the mouth.

Here in State College there is a wonderful restaurant, Herwig's, named after its very skilled and entertaining owner.  Herwig's is an Austrian restaurant that can match anything found in the old country itself.  In the spirit of today's post we thought we'd point out that every table at Herwig's has a little menu card with one of the restaurant's mottos:  "Where bacon is an herb."

Here's Calvin Trillin waxing poetic about lard in a story about food in Oaxaca, "Land of the Seven Moles," in the Dec 3 New Yorker. His daughter's family is spending a semester there and he describes paying them a visit. He appreciates food. 
She had even figured out what would likely be my favorite restaurant in Oaxaca--a simple place called La Teca, which specializes in the food of the Isthmus of Tehuantepec, the narrowest part of southern Mexico.  The meal at La Teca began, once a shot glass full of mescal was downed, with spectacular garnachas--masa cups topped with a meat-and-onion mixture.  (A couple of days later, while watching a woman prepare garnachas at a market in Llano Park, I discovered one of the secrets of why they taste so good: after assembling each one, she launched it, like a little boat, on a couple of inches of hot lard, occasionally splashing a bit of lard over the gunwales.)
Garnachas del Istmo de Tehuantepec; Wikipedia
 If you happen to want to try rendering your own lard at home, the Food Programme has graciously posted instructions on their website. My sister Jennifer, of Polymeadows Farms, just made her own, in fact, from one of the pigs they raised on the farm this summer and fall. Her pie crusts made with lard are fantastic.

Wet home-rendered lard; Wikimedia

Jen's mincemeat bars, made from home-rendered lard from her pastured pigs and beef from her pastured cattle.  She says that surely makes it health food.

Ready to eat.

Guilty pleasures?

Bah humbug, saturated fat is good for you!
There are people, like Stephanie Seneff, a researcher at MIT interviewed for the program, who now believe that saturated fat is the healthiest fat, and that cholesterol should be a big part of our diet. Indeed, she advises us to eat green vegetables piled with fat because they are much healthier that way.

Seneff's website says she began her research career in electrical engineering, particularly computer conversational systems, but her current interest is in the role of diet in diseases like Alzheimer's and autism. One of her recent papers, for example, is called "Nutrition and Alzheimer's disease: The detrimental role of a high carbohydrate diet." She suggests that excess dietary carbohydrates are responsible for dementia, and, in this paper, offers a possible biochemical explanation for how that might be.

And, Gary Taubes, science journalist and one of the wisest critics of epidemiological methods out there, agrees that fats have been given an undeserved bad name and that we've been getting the wrong nutrition advice for decades, as proven, he says, by the obesity and type 2 diabetes epidemics. Lard and other animal fats, he says, are half mono-unsaturated fats, which lower LDL cholesterol ('bad' cholesterol) and raise HDL cholesterol ('good' cholesterol). And 90% of the mono-unsaturated fat in lard is oleic acid, which is the same fat in olive oil that nutritionists tell us to eat more of.

Only 40% of the fat in lard is saturated, he says, and saturated fat raises both LDL and HDL cholesterol, so "it balances out"; further, a third of that saturated fat is stearic acid, the same fat found in chocolate, which raises our good cholesterol and does nothing to our bad. So, he concludes, if you look at the bulk of the fat in lard, it's, in theory, good for us.
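To make those percentages concrete, here is a rough tally that simply takes Taubes' round numbers from the program at face value (these are his figures, not a nutritional database, and real lard varies by pig and diet):

```python
# Rough composition of lard implied by Taubes' figures on the Food Programme.
# Taking his round numbers at face value; actual lard varies by pig and diet.
mono = 0.50                      # "half mono-unsaturated fats"
saturated = 0.40                 # "only 40% ... is saturated"
poly = 1 - mono - saturated      # remainder, by subtraction

oleic = 0.90 * mono              # 90% of the mono-unsaturated fat is oleic acid
stearic = saturated / 3          # a third of the saturated fat is stearic acid

print(f"polyunsaturated (by subtraction): {poly:.0%}")
print(f"oleic acid (the olive-oil fat):   {oleic:.0%} of lard's fat")
print(f"stearic acid (the chocolate fat): {stearic:.0%} of lard's fat")
```

On that accounting, more than half of lard's fat is oleic or stearic acid, the components Taubes counts as neutral or beneficial, which is the nub of his argument.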

Further, "If you take the lard you're eating and replace it with carbohydrates, like pasta or bread, or something that's supposedly good for us, you'll do your heart disease risk factors more harm than good." That's a strange argument. Who would replace fat with pasta?  In general, wouldn't someone who includes these foods in their diet be eating both?  Indeed, diets are complex, and implicating a single dietary component's role in disease is notoriously difficult to do.

Who's right?
As the program points out, the UK Department of Health website still says, "You should avoid foods containing saturated fats because these will increase your cholesterol levels." So, should we or shouldn't we? Gary Taubes is a very smart journalist, and very convincing when he talks about why epidemiology gets things so wrong (e.g., his 1995 Science article, "Epidemiology Faces Its Limits"). But we've been disappointed recently to see that he's got a new hobbyhorse -- sugar kills -- and he seems now to pick and choose among results, being a lot more sanguine and far less critical about the very same epidemiological methods when the results go his way.  The list of "heart disease risk factors" is extensive by now, all determined by basically the same methods, and including many contradictions (some studies find that butter is good for you, some that it's bad, some that eggs are good, some that they are bad, and so forth), for the epistemological reasons that Taubes has so well described. So, dare we say, it's easy to pick and choose among them.

And the Seneff et al. paper on Alzheimer's opens with this sentence: "It has been well established that the brain of patients with Alzheimer's disease (AD) is characterized by the build-up of a signature plaque containing an abundance of the protein amyloid-β (Aβ)," and the argument builds from there. The single citation for this is a 2002 paper; in fact, in the past decade this has been shown not to be well-established at all. Brains of people with no symptoms of dementia when they die have been found to have the same plaque build-up, and brains of people with dementia have been found to have none. So, again, not so simple, and an argument built on this premise does not inspire confidence.

Indeed, one begins to suspect that ideology is driving the results these guys have chosen to believe about fats. They may well be correct that fats are good for us!  Or at least for some of us.  But it's hard to impossible to know by weighing the available evidence.  And, as we point out here all the time, complex diseases are complex, and there are often multiple pathways to disease.  People are different, and the same basic diet will make some fat and some sick, while being benign for others.

Lardo; Wikipedia
So, should we eat lard or shouldn't we? Epidemiological methods are particularly weak when it comes to nutrition. Diets are complex, and people eat, and thrive on, a wide range of diets. So, rather than rely on current science to tell you whether or not to enjoy that lardo sandwich, we suggest you rely on this 2,400-year-old piece of advice instead:

All good things in moderation.

Wednesday, December 5, 2012

Genes for response to head injury?

A story in Monday's New York Times tells of the kinds of long-term trauma that can result from the head injuries football or hockey players so often sustain throughout their careers.  This is a report of the posthumous study of 85 brains, all from males, most of whom had been athletes, and most of whom had suffered from some form of brain disease.
Of the group of 85 people, 80 percent (68 men) — nearly all of whom played sports — showed evidence of chronic traumatic encephalopathy, or C.T.E., a degenerative and incurable disease whose symptoms can include memory loss, depression and dementia.
The age range of the study group stretched from 17 to 98, which, the study authors report, meant that they could follow the course of the disease as it developed.  Symptoms ranged from mild headache and difficulty concentrating to severe dementia, aggression and difficulty finding words.

While this is a serious topic, and this study a good early effort at determining the effects of head injury, it can't be concluded from this study alone that head injuries cause C.T.E., nor can it be estimated what fraction of athletes or others susceptible to head injury are likely to develop brain disease, because the samples were all from people who had some obvious form of brain disease.  Some people apparently do suffer head trauma with no ill effects.

A prospective study of healthy athletes would be required to estimate the former, and the study would have to include a random sample of individuals who had sustained head injuries regardless of the health of their brain, to get even a ballpark estimate of the latter. That said, this and other studies of the connection between brain trauma and disease do lead to concern about the effects of fairly routine trauma. 

Let's GWAS it!!
However, because response to head injury does seem to vary, we can envision that some geneticist somewhere will decide there must be genes that protect against unwanted consequences. The idea, one might fancy seeing, would be to use 'personalized genomic medicine' to identify who should perhaps play patty-cake instead of football, because s/he is vulnerable to C.T.E.  Or, the mapping result will send hundreds of otherwise-idea-free human geneticists off to find what gene in the identified chromosome regions makes the brain vulnerable to being shaken around, and to make hundreds of poor transgenic mice with mutations engineered in each of the genes to study the neurobiology.

Gene ontology and other omics communities will seize these seizure results as well.  The identified genes will be added to the repertoire of neural system genes--and their now-assumed neural function will be forced into this category if need be.  They'll be added to what's taught to med students, studies of dissected brains will be done on those who have the culprit variants, etc.

Of course, could it just be that the genetic basis of response to head trauma isn't related to neural variation at all?  It could be related to muscle strength or firing rate, or to eye-hand coordination, or to respiratory factors yielding greater endurance, and so on.  That is, the correlation may have nothing to do with neural causation; it may just be that variants that get you into high levels of sports will also get you into high levels of having your bell rung.

Or perhaps it's the result of dysfunctional parents, or parents of particular ethnic groups who don't have access to wealth and privilege, and who encourage their kids into sports as a way out of poverty?  Fish around in that community hard enough and you're sure to find some genes.  Maybe even genes for skin color, since African Americans tend to live in more poverty than European Americans, and tend therefore to be more into football.

Then, of course, will be the other set of genes, associated with being white and middle class, and having a soccer mom.

You can see where the assumption of genetic determinism can take one.....

Tuesday, December 4, 2012

The secret of immortality unlocked? No, anyone can do it

The cover story in the Sunday New York Times Magazine by Nathaniel Rich, "Can a Jellyfish Unlock the Secret of Immortality?," tells a fascinating tale.  But an incomplete one, and, as usual, an over-stated one, because interesting as the story genuinely is, the 'secret' has been known for a long time.

German marine-biology student Christian Sommer was studying hydrozoans -- hydra, coral, jellyfish, and so on, most of which live in saltwater -- in the 1980's, when, among the hundreds of organisms he collected off the coast of Portofino, Italy, he found a very small jellyfish called Turritopsis dohrnii, or what has since become known as the immortal jellyfish.  It is tiny, about the size, Rich writes, of a trimmed pinkie fingernail.  Sommer was interested in the reproductive behavior of the animals he collected, and this one interested him particularly because he found it very hard to explain. 
Plainly speaking, it refused to die. It appeared to age in reverse, growing younger and younger until it reached its earliest stage of development, at which point it began its life cycle anew.
Further work on this organism by other biologists resulted in a paper in which:
The scientists described how the species — at any stage of its development — could transform itself back to a polyp, the organism’s earliest stage of life, “thus escaping death and achieving potential immortality.” This finding appeared to debunk the most fundamental law of the natural world — you are born, and then you die. 
Source: A silent invasion, Miglietta and Lessios, 2008
It is now known that during rejuvenation, some cell types transform into others, so that, say, a skin cell can become a nerve cell, though the process by which this happens is not understood. As Rich says, it's still unclear how the jellyfish grows young again.  And, of course, the idea is that if we could understand that, if we could unlock that secret, we could harness it for medicine and change the world.  Indeed, Rich quotes the only scientist currently able to culture Turritopsis in the lab, a Japanese marine biologist named Shin Kubota whom Rich visited for the story:
“Turritopsis application for human beings is the most wonderful dream of mankind,” he told me the first time I called him. “Once we determine how the jellyfish rejuvenates itself, we should achieve very great things. My opinion is that we will evolve and become immortal ourselves.”
Some marine biologists believe that this jellyfish isn't the only immortal organism the ocean hosts, but that some sea urchins, sponges and hydra might also be in on the secret.  And, since humans share genetic ancestry with all these forms of life, the excitement about using them to learn more about cancer and aging grows. Some people are less romantic about this than Kubota, though, saying that the idea that we'll learn to be immortal is nonsense, but that the more we understand about the processes, the more this might help improve the quality of end-of-life.

Rich watched Kubota mutilate one of the jellyfish specimens he maintains in Petri dishes in his lab, stabbing it numerous times and admonishing it to rejuvenate.  When he's not punching them into rejuvenating, he treats these organisms almost like his children, laboriously feeding the hundreds of specimens he keeps every day, even taking them with him in a cooler whenever he travels because they are so fragile. 
We checked on the stab victim every day that week to watch its transformation. On the second day, the depleted, gelatinous mess had attached itself to the floor of the petri dish; its tentacles were bent in on themselves. “It’s transdifferentiating,” Kubota said. “Dynamic changes are occurring.” By the fourth day the tentacles were gone, and the organism ceased to resemble a medusa entirely; it looked instead like an amoeba. Kubota called this a “meatball.” By the end of the week, stolons had begun to shoot out of the meatball.
But, they are fragile, fragile enough to be mortal, so their ability to rejuvenate isn't mythic.  What happens is that most of the cells die, but some are able to de-differentiate and begin the formation of a new organism, that is, form a polyp, which is the juvenile stage of the jellyfish's life cycle.

This does suggest a sort of 'immortality', though at some point enough mutations will have accumulated that this may not be possible and that individual will die out.  Why the property is there in the first place is something to think about.

However, the excess claims have to do with the fact that essentially all organisms can do the same thing in their own way.  Trees and humans shed single gamete (sperm or egg) cells, dedifferentiated so they can form a whole new individual.  But during your (and a tree's) lifetime, the lineage of cells that will form gametes is differentiated from the original fertilized egg, so that it can form the reproductive organs.  Then, specific cells are prepared and enabled to undergo meiosis, to produce sperm or egg.  That is, they dedifferentiate.

In that sense, of course, all of us here today are proof that 'immortality' is nothing unusual at all.  Each of us, each bug, bird, tree, etc. is the descendant of 3.5 billion years of successful reproduction.  We don't dedifferentiate the same way a jellyfish does, but the idea seems to be essentially the same.  Unless we misunderstand that story (always possible, of course!), there is nothing new at all about this--just a different particular mechanism.

Nonetheless, the story is interesting.  It could stand on its own legs without the excess claims.

Monday, December 3, 2012

The empty organism: If not reductionism, what then?

There are some interesting recent parallels in what one may call the philosophy of science that have to do with issues related to determinism, reductionism, and our struggle to understand complex trait causation in terms resembling 'laws' of Nature.  We discussed a lead-in to this the other day.

If life is just a kind of fancy molecular biochemistry, and molecules obey fundamental, universal physical laws, then mustn't life also follow the same laws?  If not, what does that mean and what could be the evidence?  How could the laws be suspended?  And at what level would purely material, molecular/energy explanations stop applying?

Answering these kinds of questions is problematic (because we have no actual answers), but nonetheless shows an important way in which even asking the questions is not entirely about science but is also profoundly affected by sociocultural and historical circumstances.  These circumstances are complex, but have to do not just with the technology and methods that are available at any given time, but also with what is acceptable to think in the first place.

Gotcha! moments
Darwin's attempt at a universal, physics-like theory of life (evolution by natural selection, implying rather strong genetic determinism) was an exciting event in science.  It threatened established scriptural religion, and its proponents felt highly empowered to rip into religion based on faith in one kind of scripture, trading it for what in many ways amounted to faith in another accepted word--that of Darwin.

Since adaptive evolution of a natural kind (not that done in labs, or in agricultural breeding, or via pesticides and antibiotics, etc.) took place imperceptibly slowly and in the past, we must rely on indirect explanations and interpretations of the evidence.  Ever since Darwin, there have been widespread and rather hubristic declarations of selective stories about this trait or that--or, by some, the belief (and that's the right word for it) that basically everything in life is the result of specific adaptive selection.

More than that, along with striking research success in identifying genes, the historical belief developed that reductionism--ultimately, molecular explanations of everything--was the way forward.  Molecules are the sexy total truth of the world, according to that view.  But such a view is not just objective science; instead, it also reflects society at large, as is very clear from the history of the life sciences since Darwin's time.

Gotcha! regains acceptability
History affects what is accepted or followed or believed.  Darwin stole explanations of life away from religion.  Among other things came the idea that what we are is inherited and is here only because it was adaptively successful in the past. This was a view of biological inherency.  As is well known, it immediately spawned the eugenics era.  The idea was that now that we (that is, elite scientists, mainly males) know the real truth of the nature of organisms, we can control rather than be controlled by Nature.  We can guide our evolution with this knowledge.  In a kind of extension or rebirth of the Utopianism of the 18th century, we could purge society of its ills and replace them with only that which is good.  Of course, now we were talking about people, not just social and governmental structures.  This means determining who reproduces and who does not, the ultimate value judgement that needed to be made if we were to imitate and speed up the beneficent goals of Nature.  It might be harsh, if only some reproduce and others don't (or are prevented from it), but Nature itself is harsh, and so on.

This led beyond the rather piously benign idea that someone in authority would decide who could mate, to the less benign idea that someone in authority would decide who could survive.  Over several decades, this idea terminated (so to speak) in the Nazi death camps.  By the end of WWII, the eugenic view that who and what you were was dictated by your genes had become so discredited and distasteful that scientists developed a very different view.

This was the behaviorist or environmentalist view.  In psychology it was led by BF Skinner in that period, especially in the US.  The idea was that what you are is based on your experience, not your inborn tools.  This was not a new idea, but genetic inherency had been the prevailing view for nearly a century, and the view that displaced it, whether its rejection of eugenics was informal or formal, was the environmentalist one.

In this view, reduction to genetics was not really thought to be of any use.  Whatever the mechanism, or however it was brought about by genes, that internal stuff (the brain, neurons, etc.) was just not relevant to understanding the traits--behavior, mainly--of the person.  Reductionism was not going to yield any insights, even if the mechanisms certainly must involve genes and nerves and so on.  It was even said that you could (or should) just assume that an organism was entirely empty inside!  We just need to look at the outside, not the inside, of our subjects.  We did not need to know anything about the insides to understand behavior, and trying to work out the way the complex wiring worked was a waste of time.  This was a recognition of the complexity of traits like behavior.  Indeed, for his time, Darwin had little alternative, and this--considering the trait, not the internal generative mechanism--was essentially what he studied in so much detail.

Cachet and cash, eh?
The evidence didn't change, but behavioral approaches and environmental determinism took over.  Indeed, the evidence isn't changing very much even now.  We know a bit about neuroscience that is relevant to behavior, but we're still not really explaining behavior in any serious sense by invoking this gene or that one.  But what is pursued, what people 'believe in' and get dogmatically excited about, and what is allowed or considered acceptable (whether for explicitly understood reasons or not) is changing.  As memory fades, and new practitioners replace the WWII generation, genetic determinism and inborn inherency are rapidly regaining respectability. There is simply too much cachet--and too much cash!--in 'modern' technical science, and too little revulsion at what we know has been done in the name of imposing value judgements by one group against another (using religion, science, or whatever else) as expedient justifications, for this reversal of what is acceptable to swing back the other way.

What we accept is not just based on hard-core science decisions, but to a huge extent depends on historical context as well.  Will it turn sour again?  The probability may be low but is certainly not zero, because elitist expert-based decisions on how society should be run (by them) for its betterment (as they see it) are just hard to keep down.

Of course, we are no more, or less, genetically determined now than we were in Hitler's time or the prior eugenics era.  So there is no serious scientific reason for this swing back to earlier, once-discredited ideas.  But it is now savory to believe in them: as higher-level analysis fails to answer questions (as it did before eugenics), we return to reductionistic inherency.  It just seems technological, real science, and it has lost its odor of abuse.  A new generation reinvents its beneficence for society.  The sexy tools (genome sequencers, fMRI scanners, and much more) are available, and so are the grant funds and the journals hungering for The New Discovery (after all, should journalists remember the past any better than scientists do?).

If behavioral and evolutionary psychology can't keep their hands off this potential societal dynamite, they're not alone by any means.  Genomics and other omics are beating the same drum, assuming that traits like obesity must be understood not on their own terms but by looking 'inside' the organism.  Of course, there is and always has been reason for trying to understand how things work.  But there isn't enough understanding, not yet at least, for this to come anywhere near what is being promised.

Still, at present, we haven't got good law-like alternatives.  Are there laws of how underlying mechanisms must work and how they determine complex traits?  Or might inherently probabilistic processes mean that reductionism simply cannot work very well for the kinds of explanations being sought?

At the very least, more circumspection is what is in order.

Friday, November 30, 2012

Can we or can't we explain common disease?

Rare variants don't explain disease risk
We're still catching up on readings after a long Thanksgiving weekend, so are just getting to last week's Science.  Here's a piece that's of interest -- 'Genetic Influences on Disease Remain Hidden,' Jocelyn Kaiser -- in part because it touches on a subject we often write about here, and in part because it seems to contradict a story getting big press this week, published in this week's Nature.

Kaiser reports from the Human Genetics meetings in San Francisco that finding genes for common disease is proving to be difficult.  GWAS, it turns out, are finding lots of genes with little effect on disease.  This is of course not news, though the Common Variant/Common Disease hypothesis -- the idea that there would be many common alleles that explain common diseases like heart disease and type 2 diabetes -- died far too slowly, given what was obvious from the beginning (it never had any serious rationale, as some of us had said clearly at the time, we may not-so-humbly add), and the rare variants hypothesis that replaced it is rather inexplicably still gasping.  Or, as Kaiser writes, "...a popular hypothesis in the field—that the general population carries somewhat rare variants that greatly increase or decrease a person's disease risk—is not yet panning out."

Apparently the idea, then, is that there's still hope. Indeed, many geneticists believe that larger samples are the answer -- studies that include tens or hundreds of thousands of individuals -- because these will be powerful enough to detect any strong effect rare variants may have on disease, in theory explaining the risk in the center of the graph from the piece, which we reproduce here.  Kaiser cites geneticist Mark McCarthy of the University of Oxford in the United Kingdom: "We're still in the foothills, really. We need larger sample sizes."  Further, he says, "The view that there would be lots of low frequency variants with really big effects does not look to be well supported at the moment." 

Fig from Kaiser. New studies failing to explain the genetics of common disease.  


Even with larger sample sizes, it turns out that some variants are so rare that they're only seen once.  And they probably explain only a small proportion of risk anyway, even in that single individual.  And they certainly can't be used to predict disease.  But this doesn't stop geneticists from wanting to increase sample sizes, at this point usually by doing exome sequencing (sequencing all the exons, or protein-coding regions) of tens of thousands of people and looking for rare variants with large effects.  Ever hopeful.  McCarthy, hardly a disinterested party to any such discussion, is not likely to give up on ever-larger scale operations; that would be research-budget suicide, regardless of the plausibility of the rationales.
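To get a feel for why ever-larger samples keep being proposed, here's a rough power calculation for detecting a rare risk allele in a case-control GWAS. It is a minimal sketch under textbook assumptions (a one-degree-of-freedom allele-count chi-square test at genome-wide significance), not any particular consortium's design, and the 0.5% allele frequency and 1.5 odds ratio are invented for illustration:

```python
# Rough power calculation for detecting a rare risk allele in a case-control
# GWAS.  A minimal sketch under textbook assumptions (1-df allele-count
# chi-square test, Hardy-Weinberg, genome-wide alpha); the 0.5% frequency
# and 1.5 odds ratio are invented for illustration.
from scipy.stats import chi2, ncx2

def power(freq_controls, odds_ratio, n_cases, n_controls, alpha=5e-8):
    """Approximate power of a 1-df allele-count test at significance alpha."""
    # risk-allele frequency in cases implied by the odds ratio
    odds_cases = (freq_controls / (1 - freq_controls)) * odds_ratio
    freq_cases = odds_cases / (1 + odds_cases)
    # pooled frequency and noncentrality parameter (2N chromosomes per group)
    p_bar = (freq_cases * n_cases + freq_controls * n_controls) / (n_cases + n_controls)
    ncp = (freq_cases - freq_controls) ** 2 / (
        p_bar * (1 - p_bar) * (1 / (2 * n_cases) + 1 / (2 * n_controls)))
    crit = chi2.ppf(1 - alpha, df=1)
    return ncx2.sf(crit, df=1, nc=ncp)

for n in (5_000, 50_000, 500_000):
    print(f"{n:>7} cases and {n} controls: power ~ {power(0.005, 1.5, n, n):.2f}")
```

With an allele carried by roughly one person in a hundred and a respectable odds ratio of 1.5, power at genome-wide significance is essentially nil at 5,000 cases and only becomes comfortable somewhere in the tens of thousands; shrink the frequency or the effect size and the required samples balloon, which is the treadmill McCarthy is describing.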

Rare variants do explain disease risk
Which brings us to the big news story of the week, a paper in Nature by geneticist Josh Akey et al., described in a News piece by Nidhi Subbaraman in the same journal, 'Past 5,000 years prolific for changes to human genome.'  The idea is that the rapid population growth of the last 5,000 years has resulted in many rare genetic variants, because every generation brings new mutations, and that these are the variants that are most likely to be responsible for disease because they haven't yet been weeded out of the population for being deleterious.

The research group sequenced 15,336 genes from 6,515 European Americans and African Americans and determined the age of the 1,146,401 variants they found.  "The average age across all SNVs was 34,200 ± 900 years (± s.d.) in European Americans and 47,600 ± 1,500 years in African Americans..."  They estimated that the large majority of the protein-coding, or exonic, single nucleotide variants (SNVs) "predicted to be deleterious arose in the past 5,000-10,000 years."  Genes known to be associated with disease had more recent variants than did non-disease genes, and European Americans "had an excess of deleterious variants in essential and Mendelian disease genes compared to African Americans..."

They conclude that their "results better delimit the historical details of human protein-coding variation, show the profound effect of recent human history on the burden of deleterious SNVs segregating in contemporary populations, and provide important practical information that can be used to prioritize variants in disease-gene discovery."  Indeed, the proportion of SNVs in genes associated with Mendelian disorders, complex diseases and "essential genes" (those for which mouse knockouts are associated with sterility or death) that were 50,000 to 100,000 years old was higher in European Americans than in African Americans.  The authors propose that this is because these variants are associated with the Out-of-Africa bottleneck as humans migrated into the Middle East and Europe, which "led to less efficient purging of weakly deleterious alleles."

The researchers conclude:
In summary, the spectrum of protein-coding variation is considerably different today compared to what existed as recently as 200 to 400 generations ago. Of the putatively deleterious protein-coding SNVs, 86.4% arose in the last 5,000 to 10,000 years, and they are enriched for mutations of large effect as selection has not had sufficient time to purge them from the population. Thus, it seems likely that rare variants have an important role in heritable phenotypic variation, disease susceptibility and adverse drug responses. In principle, our results provide a framework for developing new methods to prioritize potential disease-causing variants in gene-mapping studies.  More generally, the recent dramatic increase in human population size, resulting in a deluge of rare functionally important variation, has important implications for understanding and predicting current and future patterns of human disease and evolution. For example, the increased mutational capacity of recent human populations has led to a larger burden of Mendelian disorders, increased the allelic and genetic heterogeneity of traits, and may have created a new repository of recently arisen advantageous alleles that adaptive evolution will act upon in subsequent generations.
This does seem to contradict the Kaiser piece we mention above, which concludes that rare variants with large effect will not turn out to explain much common disease.  This paper suggests they will -- which we don't think is right, for reasons we write about all the time.  But it does lend support to the idea that the Common Variant/Common Disease hypothesis is dead and buried. 

Serious questions
It is curious, and serious if true, that Africans harbor fewer rare variants than Eurasians.  African populations have expanded rapidly since agriculture, just as Eurasian populations did.  It could be, though it seems like rather post-hoc rationalizing, that Africa is more dangerous to live in, even for carriers of only mildly harmful variants.  Rapid expansion (human gene lineages have expanded a million-fold in the last 10,000 years) will lead to many slightly harmful variants being around at low frequency, because slight effects aren't purged by selection as fast as they are generated in an expanding population.
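A crude way to see why an expanding population should be full of recent, mildly deleterious variants is the following toy calculation. It ignores genetic drift, treats selection as a simple per-generation survival factor, and uses assumed expansion figures rather than anyone's estimates, so the exact number it prints means little, but the qualitative point comes through:

```python
# Toy model: why an expanding population is awash in recent, mildly
# deleterious variants.  Purely illustrative: ignores drift, treats selection
# as a simple per-generation survival factor, and the expansion figures are
# assumptions, not the paper's estimates.
generations = 400                 # roughly 10,000 years at ~25 years per generation
n0, n_now = 1e4, 1e7              # assumed thousand-fold expansion (illustrative)
growth = (n_now / n0) ** (1 / generations)
s = 0.001                         # weak selection against each variant

# A variant that arose g generations ago appeared in a population of size
# n0 * growth**(generations - g) (new mutations scale with population size)
# and has been eroded by selection by roughly (1 - s) per generation since.
weight = [n0 * growth ** (generations - g) * (1 - s) ** g
          for g in range(generations)]

recent = sum(weight[:200]) / sum(weight)   # arose within the last ~5,000 years
print(f"share of surviving weakly deleterious variants younger than ~5,000 years: {recent:.0%}")
```

The exact figure means nothing (a larger expansion factor only pushes it higher), but the qualitative point, that most weakly deleterious variation is recent and rare, is the same one Akey and colleagues make; it says nothing by itself about how much disease such variants explain.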

In a sense, the deluge has been not of functionally important variants but of functionally minimal ones.  Maybe there is something to the raised probability that a person will carry a combination of such variants, and such combinations could be found with massive samples.  But then, as a rule, their individual effects probably aren't worth the cost of finding them.

But where's the nod to complexity?
But, environments change, and genes now considered to be deleterious may not have been so in previous environments, or may even have been beneficial.  And African Americans don't represent a random sample from the entire African continent, as their ancestry is predominantly West African, and SNV patterns are likely to be different in different parts of Africa.  And, numerous studies have found that healthy people carry multiple 'deleterious' alleles, so the idea that the 86.4% of SNVs deemed deleterious will actually lead to disease is probably greatly exaggerated. Geneticists just can't bring themselves to acknowledge that complexity trivializes most individual genetic effects.

The more likely explanation for complex disease continues to be, "It's complex."