Monday, October 15, 2012

How old is 'old'?

A story in last Friday's Science by Ann Gibbons concerns various recent, direct estimates of the DNA mutation rate in humans, and their import for the way we reconstruct our geographic and demographic origins.  The point is that the new estimates suggest mutations, nucleotide by nucleotide each generation, occur only about half as frequently as earlier estimates assumed: roughly one per 100 million nucleotides, rather than about twice that rate.  How could numbers so small have any import at all?

The timing of various aspects--here we'll loosely call them 'events' even if they occurred very gradually in human lifetime terms--of human prehistory is at issue.  When we look at human variation today, and compare it to that in other primates, we try to account for how rapidly new variation arises.  Taking estimates of population size into account, the rate of new mutation is counterbalanced by loss due to chance (genetic drift).  The amount of standing variation within our population, or of difference from our nearest relatives, can then be used to estimate the timing of expansions, migrations, and geographic variation within our own species; when we diverged from those earlier relatives as a new species; and when, where, and whether we split into sub-groups (like Neandertals) that then interbred or became extinct.
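To make that balance concrete, here is a back-of-the-envelope sketch in Python of the standard neutral relationship between mutation, drift, and standing variation.  The numbers are illustrative assumptions chosen for this sketch, not figures from the Science story:

    # Sketch of mutation-drift balance: new variants enter at rate mu per site per
    # generation, drift removes them at a rate set by population size, and the two
    # balance out at an expected per-site diversity of roughly 4 * N_e * mu.
    # All values below are illustrative assumptions.

    mu  = 1.2e-8   # assumed per-site mutation rate per generation (direct-sequencing scale)
    N_e = 10_000   # assumed long-term human effective population size

    theta = 4 * N_e * mu   # expected per-site diversity (heterozygosity) at equilibrium
    print(f"expected per-site diversity: {theta:.5f}")   # ~0.00048, about 1 difference per 2 kb

    # Comparing observed diversity or divergence against expectations like this one is
    # how population sizes and event times get estimated from sequence data.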

The standard story had been that we diverged from chimps about 6-7 million years ago (Mya), and that the Neandertal lineage had split off 400 thousand years ago (Kya) or so.  New direct estimates of mutation, on the order of 1 per 10^8 nucleotides per generation, are slower and hence imply longer time periods for these inferred events in our history.
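To see why a slower rate pushes dates back, here is a minimal illustration: for a fixed observed divergence d between two lineages, the inferred split time scales as roughly d / (2 * mu).  The divergence figure and generation time below are assumptions chosen for round numbers, not values from the article:

    # Why halving the mutation rate roughly doubles inferred split times.
    # The divergence and generation time are illustrative assumptions.

    d_human_chimp = 0.012   # assumed ~1.2% per-site human-chimp divergence
    gen_time      = 25      # assumed years per generation

    for label, mu in [("older, faster estimate", 2.5e-8),
                      ("newer, slower estimate", 1.2e-8)]:
        T_years = d_human_chimp / (2 * mu) * gen_time
        print(f"{label}: split ~{T_years / 1e6:.0f} million years ago")

    # The post's 7 vs 9 My figures come from analyses with different assumptions
    # (generation length, corrections for ancestral diversity, and so on), but the
    # inverse scaling between rate and time is the same.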

That would make no important difference, unless thinking that humans diverged from chimps 9 rather than 7 million years ago matters to you.  That's rather abstract, at best.  But there is some import in these estimates.

They are not set in stone, either in terms of their accuracy as averages from relatively sparse data (for example, mutation rates vary along the genome and between individuals), or over historical time and environments.  The per-year rates involve body size, and hence generation length, and may have differed in the past.  The mating pattern may matter, too: older males have undergone more sperm-line cell divisions in which mutations can occur, so populations in which, say, older dominant males do more of the mating will have higher mutation rates per generation.  And so on.
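As a rough illustration of that paternal-age point, here is a toy linear model; the mutation counts, slope, and reference age are assumed, order-of-magnitude values for the sketch, not measured ones:

    # Toy model of the paternal-age effect: per-generation mutation counts rise with
    # the father's age because sperm-line cell divisions keep accumulating.
    # The intercept and slope are illustrative assumptions, not measured values.

    GENOME_SIZE = 3.0e9        # approximate haploid genome size in base pairs
    MATERNAL_MUTATIONS = 15    # assumed roughly age-independent maternal contribution
    PATERNAL_BASE = 25         # assumed paternal mutations for a 20-year-old father
    PATERNAL_PER_YEAR = 2      # assumed extra paternal mutations per additional year

    def per_generation_rate(father_age: float) -> float:
        """Per-site mutation rate per generation for a given paternal age (toy model)."""
        paternal = PATERNAL_BASE + PATERNAL_PER_YEAR * (father_age - 20)
        total = paternal + MATERNAL_MUTATIONS
        return total / (2 * GENOME_SIZE)   # divide by diploid genome size for a per-site rate

    for age in (20, 30, 40):
        print(f"father aged {age}: ~{per_generation_rate(age):.2e} per site per generation")

    # A population in which older males do most of the mating would therefore have a
    # higher per-generation mutation rate than one with young fathers.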

But these estimates matter most when it comes to consistency with the fossil record.  Fossils have tended to suggest more recent event times, and mutation rates that have often used such times as calibration points have been fitted accordingly, at least in part.  Of course, by the time morphology shows recognizable species differences, the species have already been clearly split for an unknown length of time.  Likewise, gene-based separation times usually underestimate actual isolation events, for well-known reasons.

If, for example, we now don't know when human ancestors and/or Neandertals left Africa, our estimates of their history of admixture (or not) will be affected.  They could not have admixed if they hadn't yet evolved or emerged from Africa!  Right now, paleontologists are not readily accepting the revised mutation rate estimates, but clearly the two kinds of data must be brought into agreement.

Probably, there will be little difference in the overall picture we get from fossil and genetic data on past and present variation; the estimated times themselves, though, will change.  If that leads to other issues, they will have to be faced.

The issues arise because we're getting better data from genomic sequences, so they are legitimate--they are not just stubborn food fights among different research groups.  Whether the new rates will make any substantial difference in our understanding of our evolution as a species seems less likely, but the question remains open.
