Wednesday, January 8, 2014

The "New World Syndrome" genetic basis: has it been found?

In several papers in the early 1980s, I worked with colleagues Ranajit Chakraborty, Bob Ferrell, and others on disease problems that we had noticed in Mexican Americans.  Among these were a particular distribution of body fat (a characteristic obesity pattern), gallstone disease, and type 2 (at the time, 'adult onset') diabetes.

In that work we also noticed a correlation between the risk of these problems in Mexican Americans and a pandemic of the same conditions among Amerindians in North America (data from Central and South America were sparser and less clearly interpretable).  We developed a hypothesis that these various conditions, and perhaps other consequences of them, were not only due to vulnerable genotypes but that those genotypes may have arisen in the ancestors of Amerindians.  Because the traits were variously associated with one another in the available data, our idea was that this was a syndrome, a collection of physiologically associated traits, that had a New World origin.  Each person might have a different subset of the associated conditions.

Because the problem seemed continental in scope, we thought it might be due not just to social conditions or poverty, but to specific susceptible genotypes.  In a 1984 paper, we called this a 'New World syndrome' (NWS) of conditions.

Few actual genes could be identified at the time of this work, and we relied mainly on epidemiological data on prevalence, age of onset, and family correlations.  But a few genetic markers (variable genes identified by blood-typing and protein analysis) were available, and from them we could estimate the European and Amerindian fractions of ancestry in the Texas Mexican Americans who were the main subjects of our studies.

Even without detailed genotypes, and certainly without any specifically related to the biology of these traits, we could demonstrate associations between Native American admixture and prevalence in a way that was consistent with a single-gene effect.  Even if traits like diabetes are usually genomically complex, the severity, early onset, high prevalence, and syndromic nature of the NWS, and the continental dispersion of a trait in such a recently settled part of the world, made it seem plausible that the inherited effect was in some key enzyme or pathway--that is, that it might be genetically simple.
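To make the logic concrete, here is a minimal sketch (in Python, with entirely made-up numbers--this is not our original analysis) of the kind of admixture-prevalence relationship a single-gene hypothesis predicts: if a susceptibility allele is common in the Amerindian ancestral gene pool and rare in the European one, prevalence should rise roughly linearly with a group's estimated Native American ancestry fraction.

```python
# Toy illustration (hypothetical allele frequencies and penetrances, not the
# 1980s data): under a single-gene model, disease prevalence should track
# the Amerindian ancestry fraction of an admixed group roughly linearly.
import numpy as np

p_amerindian = 0.30   # assumed risk-allele frequency in Amerindian ancestors
p_european = 0.02     # assumed risk-allele frequency in European ancestors
penetrance = {0: 0.05, 1: 0.15, 2: 0.35}   # P(disease | copies of allele), invented

def expected_prevalence(m):
    """Expected prevalence in a group with Amerindian ancestry fraction m."""
    p = m * p_amerindian + (1 - m) * p_european           # admixed allele frequency
    genotype_probs = [(1 - p)**2, 2 * p * (1 - p), p**2]  # Hardy-Weinberg proportions
    return sum(gp * penetrance[g] for g, gp in enumerate(genotype_probs))

admixture = np.linspace(0.0, 1.0, 11)   # groups from 0% to 100% Amerindian ancestry
prevalence = np.array([expected_prevalence(m) for m in admixture])

slope, intercept = np.polyfit(admixture, prevalence, 1)
print(f"prevalence is roughly {intercept:.3f} + {slope:.3f} * (ancestry fraction)")
```

Real data are of course noisier, and confounders like diet and poverty can track ancestry as well, which is why this kind of association is suggestive rather than conclusive.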

The restriction of this particular set of traits to Native Americans and those admixed with them, their apparent absence elsewhere (that is, as a syndrome or at the same level of risk), and the admixture association suggested to us that the genetic variant(s) responsible had come into the New World by the initial Bering land bridge route, at substantial frequency.  That, in our hypothesis, is why the NWS is specific to the New World.

Bering Land Bridge, 21,000 years BP to present; Wikimedia Commons

The historical evidence from 19th-century and earlier anthropological and traveler reports showed that the pandemic of diabetes and associated obesity was not the aboriginal state.  The modern disease traits had arisen in the previous 50 years, so they clearly seemed a response to modern living conditions, basically of post-WWII origin, and they applied more to those living a poorer existence than to those of us in suburbia.  Since Mexican Americans often worked as farm laborers, certainly not deprived of exercise, and because traits like gallstones appeared frequently in pubescent girls (almost unheard of in 'Anglo' or African American populations), the syndrome seemed to be a response to something dietary that had changed--we couldn't identify it, despite some guesses.

In any case, it seemed genetic in the response-to-environment sense.  And if there were an evolutionary antecedent it might have been for some genotype that was good at storing food energy as body fat, during the settlement of the Americas in the cold and relatively barren Siberia-to-Alaska passage.

We published our hypothesis in various places at the time.  The papers were seen by a lot of people, but we then moved from our positions in Texas, up north and far enough away from the people doing the work that it was hard to keep on.  So we stopped working seriously on the problem.  There was interest from colleagues in various places in Texas and the Southwest, but for various reasons we never could get a proper, large study going--one in which the syndrome itself was studied rather than the traits independently of each other, and with a clear admixture-based approach.

As a result, nothing has been found to date.  But perhaps a new report will provide that lead, even if its authors are unaware of the previous work and our long-standing hypothesis.  The paper, published online in Nature by a large consortium of investigators on December 25, claims that variants in the SLC16A11 gene may be responsible for adult diabetes in Mexico.  Ironically, if not perhaps even suspiciously, part of the case rests on DNA from fossils!  Could such a stretch be the answer to the NWS?

The new paper and SLC16A11
The huge SIGMA consortium responsible for this study examined various genotypes that might account for the high prevalence of diabetes in Mexico.  After a lot of data description, the bottom line is that known genes contribute to diabetes in the Mexican data set, but they don't account for all of it.  Searching various databases led to a consideration of the available archaic-hominin sequences.  One east-Eurasian archaic specimen (from Denisova cave) is homozygous for the candidate variant in SLC16A11, and the variant is at high frequency in the Mexican study population and in some east Asians.

The current study tested this variant only for diabetes risk, which was statistically substantial; it did not test other aspects of the hypothesis--not just that the susceptibility is of Native American origin, but that it involves traits other than diabetes.  Of course some diabetes in Mexico will be more or less strictly environmental, and some will be due to European-derived alleles, because a large segment of the Mexican population is of European-Amerindian admixture.

Right now there are too many unanswered questions to tell from this study whether the NWS idea is correct, or whether this newly discovered risk allele is responsible.  But the necessary data are available if investigators care to look at them in this way.  (We have contacted one of them to see if there would be interest.)

If the story turns out to be true, it will not only have potential major health importance for a huge multi-continental population (because, if the trait is genetically rather simple, perhaps targeted therapy could be developed), but will also tie evolutionary and biomedical genomics together in a nice package.  We are eager to see.

Tuesday, January 7, 2014

The Dirt on Ancient Civilization: Lost Soils and Lessons Lost

     Recently, the Mermaid's Tale hosted a lively discussion about the impact of agriculture on society, from its Neolithic beginnings until the present day. I personally see the move from hunting and gathering to farming as not unlike a visit to the dentist: painful but necessary. Painful because, too often, it has come at environmental cost, but necessary because the hallmarks of society (art, modern medicine, the Yankees) could never have been realized without the social organization that farming demands. Still, there was consensus that not all farming is the same, that certain agricultural practices are more sustainable than others, and that the repeat offender, human shortsightedness, is inherent not to agriculture but to people (and here we can agree with the prophet of Ecclesiastes, who said there is nothing new under the sun). But that doesn't discount agriculture's unique expression of this all too human flaw. Perhaps today's greatest threat in this regard, provoked under current agricultural regimes, is topsoil degradation and loss. The problem, exacerbated by global markets and soaring demographics, has blown past any measure of sustainability, with worldwide topsoil loss running at some 24 billion tons a year. If business continues as usual, the planet will be stripped of its topsoil within a century's time. If you value humanity, you should be outraged.
      As a student of archaeology, I thought a deep-time perspective on soil loss might help. After all, both farmers and archaeologists value dirt, the one for what lies above it, the other for what lies below; and besides, soil and civilization have long shared an intimate relationship, each dependent on the other for health and survival. Healthy soils are home to billions upon billions of microbes; stripped of plant cover and sapped of that life, the remaining dirt can no longer support crops. For meaningful soil conservation to take effect, an individual's sense of responsibility has to extend beyond physical space (be it local or global, though that's an important part, too) into historical space as well. Only then can we regard agriculture for what it was, is, and must remain: a social institution to be valued beyond dollars and cents.
     Yet a recent article in Science, "Dust Unto Dust," repeated the oft-heard distortion that ancient farmers had little regard for environmental stewardship. This simply isn't true. My six mentally stimulating but fiscally foolish years spent stooped over Greek and Latin texts in fusty libraries, among fools in old-style hats and coats, will prove it. The Greeks and Romans, just like the early settlers of America, saw the harm in their practices, but for various and sundry reasons failed to change course, at times with catastrophic social results (for the U.S., epitomized by the "dust bowl" of the 1930s; for the Greeks and Romans, by the abandonment of numerous cities and towns, such as Timgad in present-day Algeria). The age-old struggle with soil preservation does not represent nature's inherent inability to accommodate civilization, but civilization's inability to accommodate nature.


http://www.travels.tl/travel-in-time-among-the-roman-ruins-of-timgad-algeria/
The Roman colonial town of Timgad, once famous for its fertile hinterland, had even in antiquity been reduced to desert.

    Topsoil is quite literally the foundation of every society - footings and stomachs depend on it. Perhaps it's no coincidence that the name Adam is related to the Hebrew noun adamah, meaning "soil" or "earth" (from the Semitic root "ADM"), while the name Eve, Adam's female counterpart, is related to the noun havvah (the Semitic root "HYW"), meaning "life-source," so that the biblical creation narrative links soil to life (and vice versa) from the beginning of time. In the Babylonian creation myth of Atrahasis, man is fashioned from the alluvial clays of southern Iraq. In fact, the Latin word for man, homo, shares a root with humus, meaning "soil" and "earth." Fun etymologies aside, when topsoil is exhausted for short-term gain, decay and loss unleash a cascade of direct and indirect consequences for farmlands, forests, wetlands, and watersheds, many of them long-lasting.
   
http://upload.wikimedia.org/wikipedia/commons/b/ba/Henry_David_Thoreau.jpg

     Henry David Thoreau once said that

The civilized nations — Greece, Rome, England — have been sustained by the primitive forests which anciently rotted where they stand. They survive as long as the soil is not exhausted. Alas for human culture! Little is to be expected of a nation, when the vegetable mould is exhausted, and it is compelled to make manure of the bones of its fathers.

Unfortunately, the automaticity of our everyday experience lacks the immediacy of Thoreau's words. Even at today's rate, soil loss occurs slowly enough to go unnoticed by the average observer. It doesn't help that less than 1% of the U.S. population, for instance, feeds the other 99%, meaning that our eyes are not only metaphorically but also physically averted. In his excellent book Dirt: The Erosion of Civilizations, David R. Montgomery points out that soil is an undervalued resource in the modern world. In fact, he goes so far as to say that within dirt (or, more specifically, its geological narrative of misuse) can be found the graveyard of successive civilizations.

Photo taken by the author in Oxford, UK

     Agricultural harm to the environment has a long and complicated history. It wasn't until the 7th and 8th centuries A.D. in northern Europe that plows began to employ a vertical knife, horizontal share, and moldboard to cut furrows, directly scarring the earth's surface. Before that, enough indirect damage was done--through neglect of fallowing, unchecked overgrazing, and lack of crop diversity--that, divested of nutrients and biota, soil eventually was no longer fertile soil but simply sterile dirt. In the Mediterranean environment especially, healthy topsoil required forests to preserve soil structure, preventing erosion and flooding. Unfortunately for the Greeks and Romans, deforestation became widespread, the result of resource extraction and clearance for agriculture.
     Forests supplied the very fabric of life in the Classical world: materials for construction and fuel, as well as medicinals and dyes. Wood and charcoal fired the ceramics found in every house, drew pitch from pinewood, smelted metals in foundries, and burned limestone into lime for fertilizer. For most private houses and public buildings, timber was the material of choice. Navies depended on lumber for the construction of their vessels, while armies used wood to build siege engines, weapons, and armor. It's no wonder that the Greek word for wood, "hyle," became synonymous with "matter" and "substance"; the Latin word "materia" covered all three.
     Even while populations were still relatively small, daily existence depended so heavily on wood and its by-products that whole landscapes were laid bare. Presumably, many could sympathize with Vergil (70 - 19 B.C.) when he said,

My hearth is piled with branches of pitch-pine;
Free burns my faithful fire, and every hour
My walls are black with smoke (Eclogues 7.49-50).

Moreover, cleared forests weren't given time to regenerate, but were replaced with agricultural land. Ancient writers were aware of the environmental consequences. Vitruvius (first century B.C.), Roman author and engineer, knew well the role of forests in supporting the flow of water for natural springs:

Water ... is to be most sought in mountains and northern regions, because in these parts it is found of sweeter quality, more wholesome and abundant. For such places are turned away from the sun's course, and in these especially are many forest trees; ... nor do the sun's rays reach the earth directly and cause the moisture to evaporate. Valleys between mountains are subject to much rain, and because of the dense forests, snow stands there much longer under the shadow of the trees and the hills. Then it melts and percolates through the interstices of the earth and so reaches the lowest spurs of the mountains, from which the product of the spring flows and bursts forth (De Architectura 8.1.6-7).

And Plato (427-347 B.C.) lamented the state of the denuded landscape following torrential rains:

What now remains compared with what then existed is like the skeleton of a sick man, all the fat and soft earth having wasted away, and only the bare framework of the land being left (Critias 111B).

Over time, intensive farming and deforestation led to soil loss in upland valleys, with resulting sedimentation and siltation in lowlands and coastal plains. The consequences were often complex: reduced water retention for underground aquifers, new habitats for disease vectors (such as malarial swamps and marshes), increased aridity and wind damage, and the marginalization of agriculturally (farm) and socially (urban) productive land, creating dependencies of distance that only heightened ecological burdens.
     In Greece and Rome, geo-archaeology has shown that significant erosion followed only after settlement and farming. Once certain environmental thresholds were crossed, long periods of apparent stability gave way to outright destabilization. Archaeological investigations near Rome have shown that erosion rates spiked during the second century B.C., the era of the Gracchan reforms that extended land clearance. Before the reforms, erosion rates averaged 2-3 centimeters per thousand years; afterward, the rate climbed to 20-40 centimeters per thousand years. Similar phenomena appear in the archaeological record of Greece. Thermopylae (the "hot gates" of Frank Miller's 300), famous for the battle that took place there in 480 B.C., saw a vastly outnumbered Greek force hold off an advancing Persian army for days, precisely because the narrow coastal strip, hemmed in by cliffs and sea, provided a tactical advantage. Today, centuries of accumulating river silt have left the shoreline some 5 miles from the site of the ancient confrontation. Unfortunately for Xerxes, he was two millennia too early.

http://a5.mzstatic.com/us/r30/Video/cd/78/71/mzl.ltdunkrj.jpg

Again, the textual sources show a keen awareness of topsoil degradation, even though that awareness often failed to bring about systematic change. Columella (4 - 70 A.D.), for instance, noted in his De Re Rustica that, following the clearance of forest for agricultural land, crop production waned in successive years, not because the cleared lands were "young," but because the soil, now deprived of the roots and foliage of woodland plants, was malnourished (2.1.5-6).
   The expansion of agriculture was a prime contributor to erosion. Lucretius said that entrepreneurial farmers "made the woods climb higher up the mountains, yielding the lowlands to be tilled and tended (De Rerum Natura 5.1370-71)." Over time, this forced both farmers and loggers into hillier land that, when exploited, posed an ever greater risk of topsoil runoff and flooding. In the Mediterranean basin, the timing and intensity of rainfall mattered far more than annual totals.  Following deforestation, the mountainous landscape was especially vulnerable to sudden, violent rains. No longer protected by forest cover, and destabilized without the natural root system, the soil didn't have a fighting chance. Pliny wrote, "often indeed devastating torrents unite when from hills has been cut away the wood that used to hold the rains and absorb them (HN 31.30 (53))."
     In addition to land clearance for farming, animals that wandered the land posed a danger of their own. Varro (116-27 B.C.) wrote, "Grazing cattle do not produce what grows on the land, but tear it off with their teeth (De Re Rustica 2.2.8, 11-12)." Foraging animals (particularly goats on hill slopes) functioned as a sort of secondary threat, making whatever degradation came before permanent and ensuring that ecological niches were never given adequate time to regenerate. Ultimately, soil loss required the complicity of entire ancient societies: merchants, farmers, and pastoralists.
    For some, it's interesting in its own right to know that the Greeks and Romans knew how to use natural fertilizer, to practice crop rotation, and to construct terraces--that is, knew how to treat the environment responsibly. But it's perhaps universally important to know that, despite that knowledge, they often failed to do so, with topsoil loss leading to famine, disease, and fragile social structures. Today, we as a global community are facing a similar crisis, together with a similar knowledge base--but we have the added advantage of historical perspective. My current area of study, Mesopotamia, experienced social collapse because of the over-salinization brought on by the maximizing strategies of short-sighted agriculture, millennia before men wore togas. Mesopotamian tyrants, like modern politicians, were consciously aware of their short shelf-life, with the result that policies affecting a future beyond their own terms in office were little more than afterthoughts. Today, if we fail to adapt, to ask ourselves and our politicians for long-term solutions, we can expect the same outcome. To think otherwise is fatal hubris, the prime mover of historical change in the works of men like Herodotus, Thucydides, and Polybius.
     Topsoil loss is inevitable in both the agricultural and the natural world, but that doesn't mean we can't manage soil sustainably. In past and present alike, local production for global consumption has had devastating effects. Whether it was Rome's monopolization of North Africa for cash-crop industries or America's impetuous push west to sustain tobacco exports, formerly fertile lands were impoverished in the name of capital. When this happened, a type of economic colonialism took hold, dividing formerly sustainable lands into decadent cores and imbalanced peripheries. The difference now, however, is that societies in the plural have become society in the singular, and the loss of mankind's ability to produce food will be of global consequence.
    In the modern regime of soaring populations and unpredictable climate, crop rotation and diversification are critical, as is the move to lower-input, no-till farming. In addition, governments should stop subsidizing harmful agricultural practices, and consumers have to adapt to appropriate-scale markets. Only then will soil increase in biodiversity and organic content, bringing with it the much-needed benefit of carbon sequestration. Most importantly, individuals need a lived sense (past, present, and future) of social responsibility. And we would be wise to keep in mind Faulkner's famous line: "The past is never dead. It's not even past." Indeed, I would say it's right under our feet.

Monday, January 6, 2014

The problems with genomic prediction. Need it be stated again?

We often write about the challenges of predicting phenotypes, especially disease, from DNA data.  If the trait is clearly due to variation at a single gene, then prediction can be useful.  If most instances are due to variation at a single gene, but not always the same gene, DNA data can be useful as well.  Of course, if the variation is in regulatory or other DNA rather than in protein-coding parts (genes proper), then unless the whole genome is sequenced (not just the 'exome'), things may be much less clear.

Most common disease or behavioral traits aren't like that, however.  As piles of data show, traits like cancer, stroke, dementia, and heart disease are typically due to very many, individually minor, contributing parts of the genome--plus, oh yes!, maybe some Environmental factors (which, if we're geneticists, we don't really want to have to think much about).  But the big E is the key to trouble, in ways reflected by two stories in yesterday's NY Times Sunday Review section.

The cancer example
One story reports, as if it were a surprise, that cancer (despite the 40-year 'war' against it by NIH) is rising in lifetime risk, not falling--or at least is going to affect more rather than fewer people in the future.  But even though the 'war' on cancer was largely a wasteful way to spend a lot of money (good proposals would have been funded without the attraction of a poorly disciplined pot of gold), and even if cancer treatments are better (sometimes true, at least), more victims will fall to cancer.  This seems counter-intuitive, so why?

The answer is something well known to those in the field, but poorly explained to the public.  We've written about it before here on MT.  It's called 'competing causes'.  We all have to die of something, and the older we are, the more damage our various cells sustain; if you remove one cause, those who are spared it stay alive long enough to become vulnerable to the remaining causes.  Heart disease can strike relatively early, but cancer, which in many if not most instances is due to mutational and other cellular damage, is a risk that keeps on growing as cells continue to be exposed to mutagens and the like.
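A back-of-the-envelope simulation makes the point (the hazards below are invented, purely for illustration): take away one earlier-acting cause of death and the lifetime toll from a later-acting cause like cancer goes up, simply because more people survive into the high-risk ages.

```python
# Competing causes, purely illustrative: remove heart disease and lifetime
# cancer deaths rise, because more people live into the ages of highest risk.
# All hazards are invented for the sake of the example.
import random

random.seed(0)

def lifetime_cancer_risk(n=100_000, include_heart=True):
    cancer_deaths = 0
    for _ in range(n):
        for age in range(40, 110):
            heart_risk = (0.002 + 0.0004 * (age - 40)) if include_heart else 0.0
            cancer_risk = 0.001 + 0.00002 * (age - 40) ** 2   # rises fastest late in life
            other_risk = 0.001 + 0.0002 * (age - 40)
            r = random.random()
            if r < heart_risk:
                break                                  # died of heart disease
            elif r < heart_risk + cancer_risk:
                cancer_deaths += 1
                break                                  # died of cancer
            elif r < heart_risk + cancer_risk + other_risk:
                break                                  # died of another cause
    return cancer_deaths / n

print("lifetime cancer risk with heart disease present:", lifetime_cancer_risk(include_heart=True))
print("lifetime cancer risk with heart disease removed:", lifetime_cancer_risk(include_heart=False))
```

The second number comes out higher than the first, not because anything about cancer changed, but because the competing cause was taken off the table.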

This is no new discovery or surprise, and the public and all geneticists should by now be well aware of it.  Likewise, treatments developed for other, earlier-striking diseases will have similar consequences (which, of course, fall on those who were saved from an earlier death).  The growing overall health-care cost burden is likely in large part the result of research that is successful against earlier-age killers.

Other diseases are, in effect, part of the Environment of the cells they don't affect.  Spared those diseases, you remain exposed to mutagens throughout your life.  Yet these changes cannot be predicted, and as a result genotype-based risk estimates (that aspect of 'personalized genomic medicine') are a fantasy: they may or may not be right, and they are of unknown--and unknowable--accuracy.
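A toy calculation (all effect sizes and the environmental term are invented) shows why such estimates are so shaky for complex traits: when hundreds of tiny additive genetic effects are swamped by unmeasured environmental exposure, even a complete genotype score explains only a small slice of the variation.

```python
# Toy polygenic prediction: many tiny additive genetic effects plus a large,
# unmeasured environmental term. Even the full genotype-based score explains
# only a small fraction of trait variance. All numbers are invented.
import numpy as np

rng = np.random.default_rng(42)
n_people, n_variants = 5_000, 500

freqs = rng.uniform(0.05, 0.5, n_variants)                    # allele frequencies
genotypes = rng.binomial(2, freqs, size=(n_people, n_variants))
effects = rng.normal(0, 0.02, n_variants)                     # many tiny effect sizes

genetic_score = genotypes @ effects
environment = rng.normal(0, 3.0 * genetic_score.std(), n_people)  # E dominates
trait = genetic_score + environment

r = np.corrcoef(genetic_score, trait)[0, 1]
print(f"trait variance explained by the full genotype score: {r**2:.2f}")
```

And that is the optimistic case, in which every contributing variant is known and the environmental distribution stays the same in the future--neither of which holds in practice.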

Socioeconomic advances
The other Times article yesterday that caught our attention was about the increased risk of various diseases among those who work their way out of the lower socioeconomic strata in our society.  As they rise in education, wealth, and so on, and perhaps because of the stress required to dig oneself out of poverty, such people suffer disease risks that are higher than they would otherwise have experienced, and higher than those of their peers in their newly achieved SES group.

Again this is the mixed blessing of changing lifestyles.  It is the again-unpredictable effect of Environmental change.  It's unpredictable even if we know the specifics of a given sample of cases, because such samples are always analyzed retrospectively, after things have already happened to them; and we can't know how SES-specific risks or SES-mobility patterns will change in the future--the future of those who want to know their risks.  You cannot tell these things from a person's genotype, of course, and social science is even farther behind genomics when it comes to having a crystal ball for the future.  Indeed, all the Big Data resources in the world will have this problem, and to that extent will be a huge waste of public health funding.

These are not major new discoveries of new principles.  They just happened to be in the paper and to reflect the issues.  They sound the warning bells that should be heard, and heeded, in the labs and offices of NIH and medical schools, and in the companies that have been giving what is essentially misleading (and inconsistent) DNA-based health advice--a whole community making rather careless promises of genome- or other 'omic'-based miracles.  These are facts of life, so to speak, that in themselves should be the subject of investigation.  But that is a real, legitimate challenge, unlike the weak one of proposing to collect more and more data without being obligated to deliver on the promises made to justify it, or to address the real problems in an effective way.

Saturday, January 4, 2014

I'm with the GOP. I don't believe in evolution if that's what it is.

Yesterday I riffed a little on stories we tell in the evolutionary sciences. I'm compelled to share a footnote to those thoughts with a rare weekend post.

You don't even have to know a story very well to claim it as your own or to prefer it over others. This is usually where people like to point out that so many creationists know little about the Bible. Stuff like that.

But, actually, a great example of a story that people prefer and even defend yet don't always understand is evolution.

How many people do you think are reading this NPR piece (and many others) about the recent Pew survey and are feeling pretty superior to the evolution deniers?

I'd say a good many.

Yet, how many of those enlightened evolution fans do you think noticed the ridiculous figure and caption "showing the evolution of humans" that went with the piece?

I'd say fewer than a good many.

"Culture Wars"...they're just the worst. For more on this, see my Evolution P.S.A. today...
source

Friday, January 3, 2014

Story required.

It might seem like it this week, but sex isn't the only recurring theme here on the MT. From various angles, we hit on the culture of science quite regularly, quite relentlessly, and quite hard. We can't help it; we think anthropologically. We want to know the truth of things as much as anyone else, and we think that discussing and revealing what we don't know, and what obstacles we face, are important steps in the process toward knowing.

One of the ways that science is so clearly cultural is its love of stories. 

I even did the story thing just then. I made science into an actor. 

Try again:  Science is done by humans and humans love stories. 


source

Hardly anybody I know'd deny that.

But what we scientists (particularly evolutionary scientists) seem to resist like the dickens is that we require stories and we are required to fit our work into others' or to write new ones. 

These superstitions can't be the straightest path toward the Truth, can they?

Back up a sec. First, I'm not trying to dump on the power of analogy. Without analogy we couldn't have gotten this far, scientifically. A lack of analogical thinking is a big reason why chimps don't reason the way we do.

And, second, I'm not trying to dump on anthropomorphism or personification because I've done that already recently (and I'd love to talk about something else today).

I'm talking about making our research, our methods, our findings, our results fit our desired narratives. Or any narrative for that matter.

One of these habits we often discuss here at the MT is selectionism, which is closely related to adaptationism.

Ken's recent post on this is brilliant: 'Every trait is due to natural selection!'... often said but is it true?

And since reading his piece (and since before) I've been stewing about some related matters, like, why don't negative results get published? 

I'm of the frame of mind to spin the following: Because the story arc where there is no change (no arc?) isn't usually the one that sells the screenplay to Paramount, and likewise, isn't usually the one selling Nature ads and subscriptions.

There's a pretty big recent exception to the negative attitude toward negative results. When they didn't find dark matter, we heard all about it! 

But that's because it was the first exploration of its kind and the spin was that some big time physics equations were wrong and needed to be scratched and reconceptualized. How exciting and productive! And wtf is dark matter?! 

But when people find no significant p-values for any effect of a food or drug on some aspect of health, who cares, right? No change. No cause and effect to bring change about! No results! (which is not true, but still...) Boring!

In fact, "boring" is what a reviewer called the last paper I tried to publish in a relatively high impact anthropology journal. It was because I didn't push one hypothesis over the others and instead discussed how unfalsifiable and untestable some present anthropological explanations are. The hypothesis I was expected to push--and it was punishably confusing why I didn't--was the story I'd written not too long ago.  Well, because I had already rewritten this entire manuscript since initially submitting it and because I don't think I could have gotten it through a second round of revisions without insincerely and unscientifically pushing one idea over others (which aren't even falsifiable to my mind), I withdrew the paper and will try somewhere else. All I had to do was convince the reader to join me in favoring at least one clever story and I dropped the ball entirely. I flopped because I didn't even try. The story, I thought, was that there might not be a story! If Charlie Kaufman had co-signed my paper, maybe it'd have had a chance? It might have no chance in anthropology journals, but I haven't given up yet. 

Try to find, let alone publish, an evolutionary paper without a story, without circumscribed causes and real or apparent effects. I haven't tried that hard, but I haven't succeeded yet either. It's probably much more common for people to attempt to publish stories but to have those rejected as the wrong stories, the ones not preferred by reviewers, or pushed by reviewers with vested interest in the 'correct' stories. 

Even my little flipbook classroom exercise, which simulates genetic drift, got rejected for publication partly because they feared it would tell the wrong story: intelligent design.

What's hard to swallow is that we can't really know the real natural history in all its glory, and so there's really nothing preventing us from writing natural history the way we want to (within bounds, whatever those may be). So why is that enough for so many scientists? Why does that suffice? Maybe doing natural history is more like doing history than I ever thought. Get the details correct and you can write the story--the agents, the causes and effects--based on your own interpretation and arrangement of those details. And as long as people like your story, you're good; you might even be golden.

O! What if I'm just an anal retentive weirdo taking it all too literally? 

I didn't bait you here to read me whine and gasp existentially. I actually had more interesting thoughts about the bigger picture to share today. 

For instance, when you see so many potentially real but unfalsifiable evolutionary hypotheses as the stories that they are, it makes it so awkward to watch when science-minded folks spew venom at the "ignorant fantasy stories" of creationists.  

For more in this vein, or related to it, even remotely, I leave you this afternoon with some recent stories about stories, some favorites, others just plain interesting or relevant:

There's no Santa Claus, There's no Easter Bunny and there's no Queen of England! by Joel Adamson
If you want to know why I love this piece, see the comments.

Are hobbits human? Textual and genetic analysis of our closest real and magical relatives by Matthew Yglesias
~This is great, but took flak from both sides: scientists for the paleo and nerds for the Tolkien.

...and timely follow-up to that...
Myths matter from Maria Popova 

...and a response to Slate's story about a story...
Slate's embarrassing Middle Earth error by Max Read

Standing up for sex by Henry Gee
From the piece: "Now, I advance the above more than half in jest. It’s possibly no better or worse than any other idea, but I’m not going to pin anyone against a wall and shout about it. "
I have a hunch I'll like his book but I'm head-cocking over the attitude given what Nature publishes.

Public's views on human evolution by Pew Research Center

Surprising number of Americans don't believe in evolution by Jaweed Kaleem 
I don't believe in evolution either if it always ends in a white dude like the crappy figure they used here.

I had my DNA picture taken, with varying results by Kira Peikoff

Does reading actually change the brain? by Carol Clark-Emory

Claims of 'virgin births' in U.S. highlight pitfalls of self-reported data by Sharon Begley

In saving a species you might accidentally doom it by Ed Yong
Knowing the story of natural selection saved these birds from the humans who started out by ignoring it.

We need to talk about TED by Benjamin Bratton
From the piece, "If we really want transformation, we have to slog through the hard stuff (history, economics, philosophy, art, ambiguities, contradictions). Bracketing it off to the side to focus just on technology, or just on innovation, actually prevents transformation." 
So many parallels with what admins and students expect of profs.

And finally...

Editing your life's stories can create happier endings by Lulu Miller
~ THIS IS WONDERFUL.

Thursday, January 2, 2014

Walk this way, talk this way, roll in the hay

Teaching anthropology and human evolution involves tearing down stubborn misconceptions and stimulating students to discover and to behold their culturally-limited assumptions objectively.

That's if you're skilled and if you're lucky. OK, let's be honest: that's if you're supernatural.

The job sometimes feels like digging a hole, going deeper and deeper, never having the chance to mold something out of all that dirt, to build upon existing knowledge and insights. To move upward and onward.

Enough preamble though. There's a point today and it's got to do with:

The ever-annoying, but ever-so educationally priceless enigma that is... The Neanderthals [appropriate sound effect... and more].

I just finished a semester of Paleoanthropology where my students were asked to answer, "What happened to the Neanderthals?" for their course-long and final projects.

Even a familiarity with Neanderthals from their Intro to Bioanth or their Intro to Archaeology course does not fully prepare all upper-level Paleoanthropology students to consider them in a more advanced scientific framework. In fact, I think that familiarity combined with the claws of pop culture can inhibit them.

Despite hosting an expert to present to my students the many obstacles and issues with identifying extinction and its causes in the fossil record, and despite an admittedly brief but explicit exposure to the cutting edge genetic evidence, some of them still assumed they were charged to find the cause of Neanderthal extinction.

Two of them went so far as to rewrite my question at the top of their final paper as, "Why did the Neanderthals go extinct?" They had no idea that their version of my question contained assumptions. It's got to have a lot to do with the fact that we kicked off the first week of class and their assignment by binge-reading "The Humans Who Went Extinct" by Finlayson. The book gives a wonderfully broad treatment of the issues, but I was completely blind to the title's potential to inhibit nuance. If I'd anticipated this, I would have discussed extinction much more during the course. [Consider this post, as so many are, an elaborate note to self.]

The trouble, as I see it, is it's unclear whether the Neanderthals went extinct the same way that we consider the Dodo to have gone extinct or the same way that dinosaurs (except birds) did at the end of the Cretaceous, etc.

Of course there aren't any Neanderthals alive now. But there aren't any australopiths alive now either, and nobody's talking about australopith extinction.  Australopiths begat or, if you'd rather, evolved into Homo.  Ardipithecus didn't go extinct either: as of now we think and say that it evolved into (or, e.g., is in an ancestor-descendant relationship with) Australopithecus.

Aside from Neanderthals, Paranthropus is probably the only other hominin taxon that we discuss in terms of extinction. If its phylogenetic position is correct (and there is no dispute that I know of beyond the debate over one or two genera), then it left no living descendants and faded from the fossil record about a million years ago during a time when many other sub-Saharan fauna disappeared too. But these creatures were weird little bipedal apes, not stocky and muscular, big-eyed, big-nosed, ginger-haired, complexly cultured Europeans as the Neanderthals seem to have been. It's obvious to me why we obsess over the demise of the latter and not the former.

Anyway. Point is. I think we're being a bit intellectually reckless assuming Neanderthal "extinction." To me the question of their fate is more fairly posed "What happened to them?" with a strong answer being extinction but with a kind of extinction that needs to be carefully defined.

In order to hold the Neanderthal demise apart as special, as an exclusive story of "extinction," it needs to be shown that other long-dead LSA/UP hominins that we don't call "Neanderthals," but that we might claim as our more direct ancestors, didn't go "extinct" or have no story of extinction to tell. Don't you think?

[Aside: Here's where questions of cultural demise vs. continuity that are being addressed by archaeologists really might help. But again, we face problems because we know from modern examples that culture change does not equal genetic change and culture stasis does not equal genetic stasis.]

Further, and probably more significant here:  If Neanderthal "extinction" is the answer, then it needs to account for the factoid that 23andMe says I have 2.9% Neanderthal DNA (77th percentile for site users) in my genome--a genome which, being that of a Homo sapiens, is already more than 99% the same as a Neanderthal's.

There are at least 12 people who are more Neanderthal than I am.

I know that's confusing. I read the 23andMe methods paper, which is supposed to be simpler than the published one, but I still don't understand much about how they make the estimate.

Basically, it's about the percentage of SNPs (single nucleotide polymorphisms, a.k.a. mutations) that I share with dead Neanderthals but that so many living Africans (to whom I'm more closely related!) do not. Therefore, if the methods are generally good, my genome contains evidence that people in my ancestry mated with Neanderthals. People who do not have these mutations either (a) never had Neanderthal mutations flow into their ancestors' families, or, if they did, (b) those Neanderthal mutations drifted away before science could capture them in descendants today.
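A crude counting sketch (this is not 23andMe's actual algorithm; the variant sets and the number of tested sites below are hypothetical) conveys the basic logic: of the informative sites examined, how many of my alleles match the Neanderthal sequence but are missing from an African reference panel that carries little or no Neanderthal ancestry?

```python
# Crude illustration of the counting logic (not the real 23andMe method):
# count my variants that match the Neanderthal genome but are absent from an
# African reference panel, as a fraction of all informative sites tested.
# The variant IDs and the site count are hypothetical.
my_variants = {"rs1", "rs2", "rs5", "rs7", "rs9"}
neanderthal_variants = {"rs2", "rs5", "rs6", "rs9"}
african_panel = {"rs1", "rs2", "rs7", "rs8"}

tested_sites = 100   # pretend 100 informative sites were examined in total

shared_not_african = (my_variants & neanderthal_variants) - african_panel
estimate = len(shared_not_african) / tested_sites
print(shared_not_african, f"-> naive 'Neanderthal-derived' fraction of {estimate:.1%}")
```

The real method is statistical rather than a bare count like this, but the intuition--shared with Neanderthals, not shared with Africans--is the same.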

[At each variable site you got only one copy from mom and one from dad; the other copy of the pair in each parent, the one you didn't get, is a dead end (extinct!) unless your siblings or cousins got it. So a lot of SNPs and other variants disappear regularly and, on the other hand, everybody has new variants compared to their parents, thanks to constant mutation.]

Tendrils of my ancestry must have been much more Neanderthal than 2.9%, but those SNPs drifted away over time. In other words, far enough back there had to be at least one hominin with a 100% Neanderthal genome in my ancestry (whatever that means), because that's the only way the genes got to me in the first place...but now they're diluted down, drifted away, and maybe even selected against, to end up at 2.9% in me. I think that's about right.

These findings, that many people like me with ancestry from the northern hemisphere share small percentages of their DNA with Neanderthals, are not at all surprising to me. And that's for a couple reasons having to do with what we know about Neanderthals at this moment in scientific history, which in turn has a lot to do with why I titled the post the way that I did, which in turn has to do with my love of Young Frankenstein, and I think it's fairly common to think of Frankensteins and golems in the same imagination space as Neanderthals...

Walk this way

People are still studying Neanderthal feet and limb proportions to try to estimate energy expenditure during locomotion. But since we stopped basing all our reconstructions on an old man with arthritis and a bunch of badass healed bone breaks, we've accepted that Neanderthals were not clumsy knuckle-draggers. They were good bipeds like we are--just coming into dangerously close contact with dinner and surviving well enough to string out their suffering before death.

Talk this way

Whether they had language is more of a lingering question, but still one that's lop-sided towards yes, with the caveat that their language probably wasn't as diverse and therefore wasn't as complex as ours. New research on a Neanderthal hyoid (the small, horseshoe-shaped bone in our throats that moves when we swallow and speak) claims that its structure reflects speech mechanics. But I really like reading about the work by Lieberman and McCarthy (written about broadly here), which explains how Neanderthal throat and mouth dimensions probably did not allow the tongue to move as much as ours does to manipulate expired air--which is how we make different vowels. Lieberman and McCarthy suggest Neanderthals couldn't have made as many distinct vowels as we can and were probably as limited as human children in that regard. (Immature throat and mouth dimensions contribute to why kids sound like accented foreigners while they're developing.) Without as many vowel options their vocabulary would have been limited, but not non-existent! Surely they could produce something approximating this, no? Which brings us to...

Roll in the hay

So if it walks like a human, and sort of talks like a human, it probably bleeps like a human too. And our imaginations needn't feel naughty for going there since I already told you, if the methods are good, I carry evidence in every cell of my body that at least one of my ancestors waited until marriage to lose her virginity to a Neanderthal. (If you do want to feel naughty, read Ken's two recent posts here and especially here.)

***
They're so much like us or we're so much like them that we can't always tell their bones from ours! For a fascinating story on this, see Stephanie Pappas's piece "'Neanderthal' Remains Actually Medieval Human."

And yet you might see the latest news of "Neanderthal fossil indicates incest was common" (which is about this article) and say, Hey! We're not like those incestuous savages! But... well... yes we are. We so are. And remember, we don't exactly have this kind of information from fossils that we welcome under the Homo sapiens umbrella, and if we did (or when we do) I can all but guarantee we'll be finding some skeletons in those skeletons' closets too.

Neanderthals even took time away from incest to behave in some other pretty amazingly human ways. Scroll down to the bottom and check out what scientists have discovered about Neanderthal behavior just over the last year in this 2013 roundup by Kate Wong.

So despite the shrinking barrier between us and them, the fact that we continue to call them "Neanderthals" sets them apart from us. It sets them apart from the real, or at least more human-y, Late Stone Age and Upper Paleolithic human hominins who begat us, whoever they are. And there's such a long tradition of differentiating them from us that it's hard to break free of the mold and present their story any other way than cloistered off as just that: "their" story, one that ended before any of them could write it down. So they must have gone extinct, yo. Poof.

No seriously, which is it? Are they like chimps or dogs to us now, or were they like The French or The Red Sox of their day?

Maybe they're something else that we can't fully understand unless we actually encounter one another. So the best anybody can do is bring them to life from the inanimate material they left behind.

And because this is the best we can do, and because the fascination will always fuel it, the Neanderthal enigma can only intensify with more discoveries. It's so satisfying to say something conclusive at the end.

Wednesday, January 1, 2014

The non-biodegradability of plastic and the evolution of the human hand

Yesterday, on New Year's eve, we reprised a post we had done a couple of years ago.  That was itself a follow-up to our previous day's post on the nature of evolutionary speculation and reconstruction.  Today, on the actual holiday, we follow up on yesterday's repeated installment.  Yes, you might conclude that we're getting desperate to fill our pages over the holidays!

Well, that's not really so, because our two previous posts were intended to be humorous but also to make a point that is completely serious: our biological nature is often attributed to forces of natural selection in arguments that, to put it bluntly, do not always reflect the tightest of scientific reasoning--indeed, 'explanations' that are sometimes just fitted to a strongly held prior belief, an assumption that is accepted but not tested.  We think that is often not good science, or even not science at all.

Because the points are so highly technical and relate to the key subject of understanding human evolution and our behavior in particular, we'll take them in order:

Plastic is not biodegradable
Plastics are polymers made from natural carbon compounds, which we (today) take mostly from fossil fuels.  They are resilient molecules with strength and durability, which is why we use so much of them.  They cause a particular environmental problem, however: they are not biodegradable.  Discarded plastic litters our garbage dumps, but also our beaches, roadsides, picnic sites, etc.  Perhaps only chewing gum lasts longer (at least under desks and on movie theater floors).

Everlasting evidence of almost anything we use! http://www.debgoesgreen.com/?p=1349
Unfortunately, because plastics hardly degrade naturally, they last for thousands of years, and with current rates of use that means a huge litter problem, danger to wildlife, and unsightly vistas.  But there can be a beneficial side-effect of the non-biodegradability of plastics:  as has widely been pointed out, our plastic remains will tell future archeologists much about how we lived today!  By the same token, we can turn that to our advantage today, by seeing what plastics tell us about our own past--in particular, our evolutionary past, and in ways you probably hardly realized!

The human hand evolved 'for' something
A characteristic of human beings that has attracted a lot of attention from paleoanthropologists is our hand.  Our dextrous hand enables us to do many things, such as type, pick our noses, and use tools of various sorts.  It is so very handy, so to speak, that it simply must have had some origin due to natural selection.  It couldn't have evolved just by chance.

Yesterday, in a reprise of a post from 2011, we examined the clearly dextrous nature of the fossil hand-bones of the species named Australopithecus sediba.  We argued that the species was wrongly named, in articles that hyped its importance in some ways on the thinnest kind of evidence.  In particular, the long-fingered hand was assumed to be related to early evidence of tool use, despite the minor irritating detail that no tools were found at the site.

The ritual explanation offered by anthropologists for the human hand's evolution--as an adaptation for tool use--reveals more about our culture than about our understanding of evolution.  Natural selection works only if a trait's presence leads to greater reproductive success than its absence.  That means that, whatever the reason, the hand must have evolved (if selection was involved) for some use that increased reproductive success.  Aesthetics alone would not be enough!

The usual handy explanation. Source  

Instead of tool use, we suggested it was equally if not more plausible that the hand evolved for its usefulness in masturbation.  Self-pleasuring is a way of generating sexual excitement and readiness, we noted, and would thus be very closely connected to reproductive fitness.  It's a much more direct, and hence plausible, explanation for the origin of the dextrous hand than is the ability to hurl rocks at fleeing wildebeests or to bring down a bunch of berries otherwise too high to reach by mouth.  So we suggested that the species would be better named Australopithecus erotimanis, to recognize the fundamental role that self-attention played in the evolution of our hands (which, we quickly acknowledge, most clearly were used for tool-making at some later time in our evolutionary history).

Now, one might argue that to suggest that our hand evolved for self-enjoyment was demeaning or silly, trivializing the gravity of the need for true evolutionary explanations.  But this is not because the argument was weak--after all, sexual arousal is even more closely tied to reproductive success than hunting and gathering by stoned ancestors, which might just have evoked laughter in their intended prey.  But topics, or even words, like 'masturbation' are awkward in our society and not likely to be taken seriously in real science....or are they?

To make our point, back to plastics!  We wanted to stiffen our discussion of the use of the hand, based on the actual evidence of the A. sediba finds.  So what could plastics possibly have to do with it?

The plastic tool--that wasn't!
We launch our argument by noting that the original sediba investigators essentially inferred tool use among the creatures they found, despite the absence of tools at their fossil site.  But even the tool-use assumption has been made impotent, one may say, by the restricted definitions applied, in which 'tool' refers to implements of hunting, gathering, or warfare.  That may seem natural, but, as default arguments so often do, it has become such a reflex explanation that it led investigators to fail to recognize the vital importance of what else wasn't found at the site!

We ourselves fell into the trap.  We failed to mention, in our original post on these finds, that not a single dildo or vibrator was reported among the artifacts at the erotimani site.  Yet, as we noted above, plastics are not biodegradable.  If our ancestors had been using their hands to wield those tools for self-stimulation, surely we would have found them!  But neither their casing nor any of their metal electronic parts (the ones that vibrate) were found.  Not a trace!  Not even a travel case!

Now this is as hard a piece of evidence as one could want for our hypothesis.  Unlike the absence of stone tools and tool-use, the absence of vibrators shows that they must have been using their hands--not tools.  Supporting this is the fact that not only were there no stone axes, but there were no stone dildos either, which of course would have survived to be found today.  You might say that, as practiced tool-users, the erotimani would perhaps have fashioned wooden ones instead of stone, and wood would have decayed and not be found today.  But that argument doesn't hold (so to speak), because wooden ones would have led to very painful splinters, which certainly would not have been good for reproductive behavior.  Ouch!

The best explanation must be true....mustn't it?
There is such a tendency to offer, and uncritically to accept, just-so adaptation stories that surely we must agree that one of them must be the true one.  But how do you decide which is best?  And why would our effort at explanation be dismissed out of hand, so to speak?  How can one assert that we have not fingered the truth?  Perhaps the reason is not scientific at all, but cultural--reflecting both our need for simple explanations and our particular sensitivities.  This possibility is easy to see.

If someone were to find a structure in beetles, or even oysters, that the organisms routinely used to stimulate their genital organs, and this were related to reproductive behavior, the evolutionary argument would be totally compelling and would be front-page material in the Times and Nature or Science.  Do you doubt that?  So why not in humans--unless this is about our cultural squeamishness rather than science!  Hunting tools are respectable in mixed company in our society, but humping tools aren't.

Upon close inspection, and seriously, our explanation is in every way as good as the usual ones.  We didn't write it just to wet--or rather, whet--your appetite with our suggestive, er, post (sorry! Many words on this topic have potential double entendres, and I'm finding it very hard to work on my post, which drains the pleasure out of it).  Think carefully about how a trait's present-day use is typically assumed to be the reason selection favored it in the past, and about the often near-total lack of actual evidence for invoking a specific adaptive explanation--especially in regard to vague things such as behavior, but often in regard to structure as well.

So, we challenge anyone to seriously say that our explanation is not at least as good, and at least as closely tied to evolutionary fitness--that is, reproductive success--as other explanations.  On what grounds?

Life is complicated by the fact that most structures have more than one function today, and not all of those functions need have evolved at the same time; yet the tendency is to pick one of them and insist that it was, in the long-distant past, the reason for the structure's existence today.  Fragmentary or statistically weak evidence, from the present or the past, is too often accepted if a nice story can be told with it.  We think this isn't the best that science can offer, and that, except perhaps in sexual affairs, restraint would be a better policy.

Even indirect evidence, such as the non-biodegradability of plastic, may tell the tale.  You never know.