Wednesday, November 30, 2011

What we REALLY evolved 'for'!

We have posted a few times about Just-So stories: classical, perhaps obvious-seeming explanations for how this or that trait evolved, or what it was driven by natural selection 'for'.  Of course, these are often, perhaps usually, made up out of nearly whole cloth.  They usually rest on the assumption that today's function was yesterday's adaptive reason: that what is good today was good for sexual success yesterday (because that is the only thing that matters in regard to adaptation by natural selection).

We have speculated that the gracile hominid hand found in some of the earliest fossils of our ancestors was not necessarily adapted 'for' tool use, or cooking, or hunting.  We gave it a different explanation that was even more plausible than these staid classical explanations (refresh your memory here).  

That was about adaptive evolution of human ancestors.  But what about our closest relatives, with whom we share a reasonably recent common ancestor (about 8 million years ago): chimpanzees?  What did they evolve 'for'?  Let's concentrate where most anthropologists do, on locomotion.  Chimps can walk upright after a fashion, and although that's not what they do all the time, there have been various highly confidently asserted explanations about what chimp locomotion evolved 'for'.

Well, MT readers, we have another Bulletin of the Truth for you.  Only this time we are not going to rest our case on our own type of speculation.  We have direct, right before your eyes, evidence:

Finally, the truth will out!  Evolutionary explanations clear, simple, and genuine.  No more guesswork.

Tuesday, November 29, 2011

Understanding pathways leads to drug discovery. Really?

One of the most often heard rationales for studying the genetics of disease is that increased understanding of gene pathways, or the products of gene pathways, will lead to new drug discoveries.  The idea is that even if mapping studies, like GWAS (see many of our earlier posts), don't discover 'the' gene or 'blockbuster' genes 'for' a trait, they reveal genetic pathways in the biology of the trait that drugs can target.

However, while there have been some successes and hopeful trials, overall there is as yet only limited evidence that this is happening -- in fact, after a burst of initial enthusiasm for the new doors opened by the sequencing of the human genome, pharmaceutical companies have been cutting back on research and development, and few new drugs are currently in the pipeline -- drug discovery is risky, and payoffs are few.  This constriction of the new drug pipeline is why Francis Collins, director of the National Institutes of Health, has pushed "translational medicine" and NIH's involvement in drug development.

Well, as the special report on allergies in the Nov 24 issue of Nature notes, immunologists and allergists have long thought they understood what causes asthma, but the disease has increased dramatically in the last 3 decades or so, and new treatments are few and far between.  So much for the benefits of understanding pathway components. 
Since the discovery of immunoglobulin E (IgE) almost half a century ago, there has been a massive expansion in knowledge about how IgE antibodies work. Research has unravelled IgE's role in a myriad of cellular and molecular targets driving inflammatory responses and underlying complex allergic disorders. This knowledge might have been expected to lead to novel preventative and therapeutic pathways — unfortunately, this has not been the case.
The dramatic rise in allergy and asthma worldwide has increased the clinical need for treatment, but research focusing heavily on IgE as the main malefactor in allergies has not been translated into widespread patient benefit.
One problem, according to a piece by Stephen Holgate on why this is so, has been the pharmaceutical industry's reliance on animal models, both to better understand the disease and to test new treatments.  But humans are not mice.  In addition, allergic disease is complex, and involves not only biological pathways, but their interactions with environmental triggers as well as, presumably, an underlying genetic susceptibility.
Traditional therapy of allergic disease has in large part relied on the abatement of symptoms with H1-antihistamines (rhinoconjunctivitis, food allergy, urticaria), adrenaline (anaphylaxis) or β2–adrenoceptor agonists (asthma), and the suppression of inflammation with corticosteroids. Besides improving the pharmacology of known drugs, the only novel asthma therapies to emerge are leukotriene inhibitors (for example, montelukast) and the non-anaphylactogenic anti-IgE, omalizumab, both of which are directed at targets identified well over 40 years ago.
There have been disappointments with a wide range of biologics targeting activating receptors on T cells, cytokines, chemokines, adhesion molecules and inflammatory mediators. Having shown convincing efficacy in in-vitro cell systems and animal models, and possibly some level of efficacy in acute allergen challenge in mild asthma, all of these have fallen short of expectations when trialled in human asthma. In moderate–severe asthma, where the unmet therapeutic need is greatest, trials of novel biologics have revealed only small subgroups in which efficacy has been shown or is suggestive.
And, it has long been assumed that asthma is primarily an allergic response, but this is no longer thought to be so.  The idea now is that perhaps impaired innate immunity comes first, and leads to allergy.  So, much is known about allergies and asthma, but nowhere near enough.

The asthma question is one that highlights many of the current problems in epidemiology, genetics, and the understanding of causation.  Asthma prevalence has been climbing in the US and other industrialized countries since the 1980s.  Given its precipitous rise, it would seem that the problem is quintessentially one for environmental epidemiology, but even when looked at from that perspective, no convincing environmental cause has yet been identified, and in fact studies have produced a frustrating litany of contradictions -- it's cleanliness or dirt, breast feeding or bottle, this gene or that.  Yes, epidemiologists have turned to genetics to try to understand the disease, but clearly there's no genetic explanation for such a recent epidemic. 

Like most other common chronic conditions, asthma is a complex disease, with multiple causes and multiple pathways.  As Holgate concludes:
In the future, it is essential that asthma is not treated as a single disorder, but rather defined by causative pathways. We need new diagnostic biomarkers to identify patients most likely to respond to highly selective biologics, such as anti-IL-5 biologic (mepolizumab) and anti-IL-13 (lebrikizumab). These therapies are only active in particular subtypes of asthma, when the molecules they target lie on a causative disease pathway. 
Studies of large numbers of people with a common chronic disease like asthma, or heart disease or hypertension, are necessarily pooling cases with different causes, pathways, genetic backgrounds, and outcomes.  This limits the potential for successful findings.
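The dilution described above can be put in toy numbers.  The sketch below (all quantities hypothetical, chosen only for illustration) computes the relative risk a pooled study would observe when a risk allele doubles risk in one etiological subgroup of a population but does nothing in the other:

```python
# Toy illustration (hypothetical numbers): pooling etiologically
# distinct subgroups dilutes a real genetic effect.

def pooled_relative_risk(frac_a, rr_a, rr_b, baseline=0.01):
    """Relative risk of the risk allele as measured in a pooled sample:
    in subgroup A (fraction frac_a of the population) the allele
    multiplies risk by rr_a; in subgroup B, by rr_b."""
    risk_carriers = frac_a * baseline * rr_a + (1 - frac_a) * baseline * rr_b
    risk_noncarriers = baseline  # same baseline risk in both subgroups
    return risk_carriers / risk_noncarriers

# The allele doubles risk, but only in 30% of the population:
rr_pooled = pooled_relative_risk(frac_a=0.3, rr_a=2.0, rr_b=1.0)
print(f"true RR in subgroup A: 2.0, pooled RR: {rr_pooled:.2f}")  # pooled RR: 1.30
```

A true doubling of risk shows up as a modest 1.3-fold effect in the pooled sample, the kind of weak signal that large studies then struggle to detect or replicate.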

Biological traits are the result of interactions among many different factors, genetic and otherwise.  Such interactions, and the way that evolution works, lead to redundancy, alternative pathways, overlapping pathways, and the like.  This was a major theme of our book The Mermaid's Tale.  Complexity is an easy word to say, and perhaps it's easy to use it to excuse failure to find blockbuster findings.  But the last couple of decades have systematically shown causal complexity to be real.

Besides complexity itself, a major problem is not simply that humans aren't mice.  It's also that we are all unique in our environmental exposures, genomics, and how our bodies respond.  Identifying single genes that may be involved in complex diseases, or even biophysiological pathways, may be a rationale for sticking with the genetic approach to understanding disease, but it's a far cry from prevention, treatment or cure.

Rather than promising simplistic causation and consequent intervention miracles, we feel that students and young investigators, and the funding mechanisms, should be geared towards coming to grips with complexity, rather than just spinning out ever more details.  Major practical, and we believe also conceptual, challenges lie ahead.  Asthma is a good case in point.

Monday, November 28, 2011

The Penn State Sandusky scandal: What is 'normal'?

The Penn State scandal involves alleged pedophilia by famous defensive football coach Jerry Sandusky, and has led to indictments against two PSU administrators and the dismissal of legendary coach Joe Paterno and Penn State's money-raising, high-visibility president, Graham Spanier.  The dismissals were for failing to take the pedophilia (that is, rape of young boy(s)) that happened on campus seriously enough and, instead, acting to cover it up.

This scandal has disgusted essentially the whole country; people have recoiled at the horribleness of the multiple accusations of pedophilia associated with a charity foundation, The Second Mile, that, as the whole country knows by now, Sandusky set up to assist troubled kids but that he is accused of using to find his prey.  There are also mysteriously disappearing police records, disappeared financial records from The Second Mile, and even a disappeared District Attorney involved in earlier cases.

Pedophilia is, to most of us, an awful act, and, among other things, a betrayal of the trust children have in adults.  But it has become something that can be discussed more openly than in the past, largely because of the Catholic Church and its covering-up of a pedophilia scandal that is much larger than the Penn State scandal, but reminiscent of it in many ways.

Stories are now appearing that report that up to a sixth, or even a quarter, of children have been sexually abused at some time.  Even if the urge to report, or to inflate the scale of, the problem is fed by media attention, the reports of those who treat people who were sexually abused as children seem consistently to suggest that these experiences leave lifelong personal scars.

As awful as these stories are, they raise the question of how something so awful could be so common.  After all, this is at least physically a form of sexual behavior that strongly Darwinian worldviews would suggest could hardly be favored.  In this view of the world, sexual behavior should be driven by reproductive success, and any genes that predisposed to nonreproductive sexual behavior should have been removed by natural selection.  Similar issues have been raised about homosexuality: sometimes-contorted arguments are made to show how homosexuality can be favored, or at least tolerated, by natural selection.  Yet, in our culture at least, these behaviors are common.

The Darwinian argument derives from a rather simplistic idea about sex and sexuality.  There are two physical sexes, specified by specific chromosomes (XX makes female, XY makes male).  These two types have to mate to reproduce, so one should expect there to be two corresponding, genetically programmed types of sexual behavior to bring that about.  Obviously, this is not what we are seeing.  Indeed, the facts suggest at least three major things.

First, one can argue that our culture somehow over-rides our evolved biological mandates to be 'real' males or females, and to mate only with the opposite sex.  Behavior, often discussed in terms of 'gender' rather than the more anatomy-connoting term 'sex', should accord with that.  This is a desperate fallback argument if you are a strong Darwinian-genetic determinist, because then anything can be explained by saying that the evolutionary determinism is real but culture is somehow artificial.

Second, it is clear that there are not just two sexes or two genders.  Sexual anatomy does generally fall into two categories, with some variation in each, but there are also rare mixed instances of dual or dysgenetic sexual anatomy.  Clearly, variation in gender behavior is more common.  Personalities of males and of females both vary considerably in ways related to stereotypical ideas of how a 'woman' or a 'man' behaves.  Gender behaviors have a distribution, that is, a fraction of individuals manifesting each behavior across the range of observed behaviors.  As well as a range of behaviors found specifically among males and among females, the male and female gender-behavior distributions overlap.
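The idea of overlapping behavioral distributions can be made concrete with a toy calculation.  Assuming, purely for illustration, that a behavioral trait in each group follows an equal-variance normal distribution, the shared area under the two curves has a simple closed form, 2Φ(−d/2), where d is the standardized difference in means:

```python
import math

def normal_cdf(x, mu, sigma):
    """CDF of a normal distribution, via the error function."""
    return 0.5 * (1 + math.erf((x - mu) / (sigma * math.sqrt(2))))

def overlap_coefficient(mu1, mu2, sigma):
    """Shared area under two equal-variance normal curves:
    2 * Phi(-|mu1 - mu2| / (2 * sigma))."""
    d = abs(mu1 - mu2) / sigma          # standardized mean difference
    return 2 * normal_cdf(-d / 2, 0, 1)

# Illustrative only: two trait distributions a full standard deviation
# apart still share about 62% of their area.
print(f"overlap: {overlap_coefficient(0.0, 1.0, 1.0):.2f}")
```

The point of the arithmetic is just that even clearly different group averages are compatible with most individuals falling in the region the two distributions share.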

Third, this raises the question "What is normal behavior?"  If 10% of the population are homosexual, and up to 25% experience pedophilic abuse (suggesting also that a substantial percentage of adults are sexual abusers), then does this not become part of the normal--that is, standing--distribution of gender or even sexual behavior?  But, again referring to the evolutionary question, how can this be so?

One recoils from referring to this as 'normal', and we put people in jail for child abuse, and say that they are 'diseased'--or at least criminal.  But why?  If there is some cutoff beyond which the behavior is a crime, who decides what that is, if not the arbitrariness of culture?  For example, sex with boys and consummated marriage with pre-teen or barely teen girls are reportedly common in various societies around the world, where they are in that sense completely 'normal'.  Are they associated with the same genotypes that we define as diseased or criminal?

One can argue that diseases like, say, cancer are abnormal even if a high fraction of people experience them, because they kill.  That seems at least a reasonable definition.  But is pedophilia abnormal in that kind of sense, or is it more like being very short or very tall in society (both of which can in various ways be harmful, physically or psychologically)?  Is there some actual neuropathology involved?

Are these sexual or gender-related behaviors 'genetic' in nature?  If not, and if they are part of the normal range of behaviors, then our reactions are in many ways themselves cultural ones, and in that sense arbitrary.  But how are the lines drawn--based on what criteria?

We like to fancy that behavior that is so immediately disgusting is wrong in some absolute sense, perhaps because it causes psychological trauma, but that kind of conditional specificity seems unlikely. It is difficult in our context to have, much less to express, sympathy for the perpetrators of pedophilia  (or those who cover it up).   That would entail a coldness to its victims, who have suffered what our society, and indeed most of us, tries to avoid or prevent.  But if one tries to be objective about it, even pedophilia raises questions about causation, determinism, what is 'normal', and evolutionary ideas about what organisms evolve 'for'.

Wednesday, November 23, 2011

Age-old questions that keep getting older: are they being solved?

Once infectious diseases were conquered (or so we thought), roughly just after WWII, the epidemiological community turned its attention from point-causes like infection (where the cause can amplify within the body), to the gradual causes of disease due to lifestyle and exposure to chronic agents like chemicals, pollution, radiation, dietary excess and the like.

The idea was that such agents, if identified and removed from the environment, were the causes of disease and death that prevented us from reaching our 'lifespan potential' as it was called.  Then, in the last 50 years of luxury, when famine and plague were not big problems in the industrialized world, aging itself became the target of substantial research.  Why not?  We can manipulate everything else about the world as we choose, so why not create the ultimate manipulation: immortality!

Since we in science need not just to identify causes and study their nature, but to proclaim ourselves geniuses with fundamental new theories (and the research funding we thus deserve), there have been many grand one-cause 'theories' of aging: somatic mutation, apoptosis (programmed cell death), the loss of telomeres (chromosome ends), the idea that cells can only divide so many times before becoming permanently senescent (the 'Hayflick limit'), the idea that antioxidants prevent cancer and other diseases, and so on.

There are many reasons why no one cause can be responsible for aging, reasons both evolutionary and due to how genomes construct and maintain organisms.

Now, a new study published in Nature, and described here, reports the latest miracle claim.  It relates to the Hayflick limit in a sense: the claim is that senescent cells, past their dividing prime, send out signals that drive inflammation, which in turn produces symptoms of aging.  From the abstract to the paper:
Cellular senescence, which halts the proliferation of damaged or dysfunctional cells, is an important mechanism to constrain the malignant progression of tumour cells. Senescent cells accumulate in various tissues and organs with ageing and have been hypothesized to disrupt tissue structure and function because of the components they secrete. However, whether senescent cells are causally implicated in age-related dysfunction and whether their removal is beneficial has remained unknown. To address these fundamental questions, we made use of a biomarker for senescence, p16Ink4a, to design a novel transgene, INK-ATTAC, for inducible elimination of p16Ink4a-positive senescent cells upon administration of a drug. Here we show that in the BubR1 progeroid mouse background, INK-ATTAC removes p16Ink4a-positive senescent cells upon drug treatment. In tissues—such as adipose tissue, skeletal muscle and eye—in which p16Ink4a contributes to the acquisition of age-related pathologies, life-long removal of p16Ink4a-expressing cells delayed onset of these phenotypes. Furthermore, late-life clearance attenuated progression of already established age-related disorders. These data indicate that cellular senescence is causally implicated in generating age-related phenotypes and that removal of senescent cells can prevent or delay tissue dysfunction and extend health span.
It is interesting that many genomewide disease mapping studies (GWAS) of diverse chronic and late-onset diseases have found genes related to inflammatory responses to be involved.  Yet the diseases ranged from cancers, to diabetes, to digestive disorders, to loss of mental function.  So the current study may help to explain that.

However, several caveats must be added.  First, these are manipulated laboratory mice, not 'real' animals, and not humans.  Secondly, as the authors themselves note, the aspects of aging that they studied are only one set of the things that go sour over time, so this is not 'the' theory of aging, even if it is supported by further work.  Thirdly, if we avoid the diseases that are currently the targets of so much research, we will be left with a drawn-out process of physical and mental degradation that may well be worse than what we experience already.

The Be Careful What You Wish for Department:
In the past, pneumonia was often called the old person's friend: when you were only a shell of your former self, dependent on others for day-to-day survival, pneumonia killed you quickly and, in that sense, with minimal suffering.  Anyone visiting a nursing home might feel nostalgic about those good old days.  And fourthly, if people lived much longer, the problems of world overpopulation, and of too many old people depending on too few younger people to care for them, would be much exacerbated -- and that would lead to other kinds of negative payoff: deeper poverty, wars for resources that involve massive suffering, and so on.  So even a successful single-cause theory of aging would not be an unmixed blessing; it could simply mean more person-years of misery per person, or worse.

There are many other interesting facts that need accounting for as well.  Mice are orders of magnitude smaller than we are, with fewer cells to become senescent or acquire mutations.  They acquire similar diseases to those we suffer, on a similarly accelerating pattern with age.  But this occurs over only two years, while for us it takes 60 or more.  This relates to the general relationship between body size and longevity among mammals. So other more basic issues of mammalian biology must be involved--an issue we've been aware of and that one of us (Ken) even did research on decades ago, in regard to cancer.
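The size-and-lifespan puzzle above can be put in rough numbers.  The figures below are order-of-magnitude assumptions (illustrative cell counts and lifespans, not measurements): if disease risk simply scaled with the number of cells times years of exposure, humans should face vastly more risk than mice, which is not what we observe:

```python
# Back-of-envelope arithmetic for the puzzle in the text: if risk
# scaled naively with cells-at-risk times years of exposure, humans
# should dwarf mice.  All numbers are rough order-of-magnitude guesses.

mouse = {"cells": 3e9,  "lifespan_years": 2}
human = {"cells": 3e13, "lifespan_years": 70}

def divisions_at_risk(animal):
    # crude proxy: total cells times years of exposure
    return animal["cells"] * animal["lifespan_years"]

ratio = divisions_at_risk(human) / divisions_at_risk(mouse)
print(f"naive human/mouse risk ratio: {ratio:.1e}")  # 3.5e+05
```

That the real ratio of, say, lifetime cancer risk is nowhere near five orders of magnitude is precisely why more basic features of mammalian biology must be involved.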

Aging has been 'programmed' by evolution only to the extent that, for each evolving species, longevity was long enough for reproduction in its local circumstances.  Anything genetic that trimmed individuals away 'too soon' for those circumstances was trimmed away as well.  The aggregate of what worked well enough, in combination, is what is here today.  It need not be simple, nor is there any reason to think it would be -- and there is a wealth of evidence to suggest that these things really are complex (GWAS results, for example).

As we went to great lengths to try to explain in our book MT, this almost has to be the case (rather than single-causation) because of the complex nature of gene-gene interactions and so on.

So there are interesting biological and evolutionary problems to address, as well as the personal ones related to how long each of us is going to last.

Tuesday, November 22, 2011

Who needs DNA when we've got fMRI?

Neuroscience is the new genetics, the explanation for everything that ails you, and for why you are who you are.  Where your brain 'lights up', as caught on fMRI scans, when you're doing some task -- lying, serial killing -- apparently tells the researcher a lot about you. "It wasn't my fault, my brain made me do it," may be a defense soon coming to a criminal court near you.

fMRI, amygdala in red
The story of the perfectly normal man who slowly and stealthily became a pedophile, finally caught by his wife when he began to molest his 11-year-old stepdaughter, is only one of many that are causing people to rethink the idea of responsibility for our behavior and, indeed, free will.  This man came to the attention of a sympathetic physician, who decided he needed an MRI, which showed that he had a large tumor on his frontal lobes.  When the tumor was removed, the man's pedophilia disappeared entirely.  When he began to be interested in children again, it was found that the tumor had recurred.  The tumor was again removed; the legal charges against this man, deemed not to be responsible for his behavior, were ultimately dropped.

This story was repeated on a BBC Radio 4 program about understanding the criminal brain, and it gave us occasion to muse once again about causation.  Neuroscientists are doing a lot of work on what is malfunctioning in the brains of sociopaths.  And how can you doubt their word?  You can see it on scans.

Serial killers have less activity in their amygdala, which leads to less empathy, which leads to antisocial behavior.  And, according to the show, this aberrant brain activity can be identified in severely misbehaving children -- kids, they claim, who are likely to grow up to be criminals.  (Will this lead to preventive measures taken by the state to lock up or dope up those whose 'brain will make you do it'?)

A school in England is now addressing this problem head-on (so to speak).  Armed with fMRI scans of kids with serious behavior problems, the malfunctioning part of the brain identified, the school has developed a method to work with these kids to activate the caring parts of their brains.  So, for example, they show them pictures of fearful kids and teach them how to tell how the kid is feeling, both from the face and from body language, clues that the students aren't good at picking up.  The hope is that these brains are rewireable, so that the kids will grow up to be contributing members of society.

But there's a kink in this analysis.  It is at once reductive -- and of course reminiscent of the claims of behavioral geneticists: my brain structure, or my genes, made me do it -- and expansive.  If the criminal (or future criminal) brain is plastic enough to respond to environmental pressures in the form of teaching, and can in fact remold itself, then that suggests that something malformed it in the first place.  And here we're back to the age-old questions of social causation, and then of course of ultimate responsibility -- does poor parenting make criminals?  Poverty?  Bad luck?  So the reductive explanation is not a complete one after all.

The problem is simple, even if the solution, if there is one, isn't.  After the 'bad' behavior has occurred, one seeks whatever techie gear will show why, in a reductionistic way (a bad gene, a tumor -- some built-in molecular 'cause').  And even if it's clear that non-technical, experiential factors can fix the problem, it is not so easily recognized that such factors may have caused the problem.  The general population is not screened to see how many non-sociopaths have the same brain-activity patterns, for example.

More importantly, if it is assumed that the technical finding (genotype, fMRI signature) is causal, a likely consequence, if history is any guide, will be to use the same findings 'preventively'.  That was the basis of the eugenics movement, and anybody who thinks something of its ilk can't happen again hasn't read any history.   Eugenics -- the very term itself! -- was the purportedly benign attempt to improve society through science.

Even if that risk is not taken seriously, we still have serious problems in dealing with this kind of causation and association data, or of putting non-technical facts in the same perspective as we put technical ones: we're too enamored of the latter, in this Age of Science.

Monday, November 21, 2011

Light still cannot be shed on a speeding neutrino?

A while ago there was the report that neutrinos may be able to travel faster than the speed of light, an ultimate no-no in science.  That's worse than DUI or speeding on a highway. As we posted a while back, it violates not just some local ordinance, but a century-old prohibition that should hold true anywhere in the universe!

Now, the same lab has reported that they've replicated the experiment, again finding the neutrino to be committing its crime, flying faster than a pepperoni pizza disappears in a frat house.  Some say that a different lab will have to do this, because maybe the perps are not the neutrinos but the lab set-up.  This is serious business!
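For a sense of scale, here is the arithmetic on the figures that were widely reported for this CERN-to-Gran Sasso measurement; treat the ~730 km baseline and ~60 ns early arrival as assumptions drawn from news coverage of the result, not values from this post:

```python
# Rough arithmetic on the widely reported figures (assumptions:
# ~730 km CERN-to-Gran Sasso baseline, ~60 ns early arrival).
C = 299_792_458.0        # speed of light in vacuum, m/s
baseline_m = 730e3       # ~730 km
early_s = 60e-9          # ~60 ns early

light_time = baseline_m / C                 # time light needs for the trip
fractional_excess = early_s / light_time    # (v - c)/c, to first order
print(f"light travel time: {light_time * 1e3:.2f} ms")
print(f"claimed fractional excess over c: {fractional_excess:.1e}")
```

The claimed violation amounts to a few parts in a hundred thousand over a ~2.4 ms flight, which is why suspicion fell on timing and distance calibration in the lab set-up rather than on the neutrinos themselves.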

The Laws of Nature simply do not allow anything to travel faster than the speed of light (even a cosmic cop car).  Indeed, going faster would be hard to document ordinarily because light must be used to track the damn thing flying by, and light can't catch something that goes even faster.  We don't know if neutrinos produce the equivalent of a sonic boom from an object exceeding the speed of sound.  Nor do we know what makes light (much less neutrinos) actually go anywhere. But apparently they (light photons, anyway) are so ever dissatisfied with where they are that they cannot stop zipping away, and for some strange reason, they only go in high gear.

There can be no violation of a Law of Nature
Of course, there can be no violation of a Law of Nature.  That's because a Law of Nature, by definition, is something that cannot be violated -- anywhere, any time.  So if the neutrino is doing what is reported, it is not violating a Law of Nature.  Instead, we had simply been misreading the 'Law', and we'll have to revise our theories of physics.  Of course, such laws are our own notion, but it would be strange if there were no universal principles in the universe!

If we've misunderstood things, there will be some other factor, force, or basic principle that we will have to identify to explain what we now know.  Since so much has seemed consistent with the speed of light being constant and maximal, in experiments around the world, the new factor may apply only under very unusual circumstances -- or else why didn't we sense its existence?  It could also be that, now that we know to look carefully, we'll find that the implications are much broader than just such rarities, and will fill in or modify other things that were difficult to explain in the past, Einstein notwithstanding.

This experience can apply to any field.  Biology is the same. We may think we have principles analogous to 'laws', or at least that are fundamental to all of life.  But at present we find few principles that don't have exceptions.  Those who argue explicitly or in practice that, in essence, evolutionary biology and genetics have now revealed the basic principles of life (even if we have yet to learn scads of details), are perhaps making rather bold claims.

Beyond the current misbehavior of neutrinos, if the history of science teaches us anything it is that there will always be (or, at least, has always been) surprises around the corner: fundamental new discoveries that force major changes in what we thought we knew.

As we have said before, this reality makes it hard to interpret problematic results, such as the incompleteness of GWAS studies, or the difficulty in testing evolutionary adaptation stories.  The arguments about these things, including statements by us and those dropping comments here on MT, are all based on the value of some generally agreed-on parameter such as the strength of genetic causation, the amount of natural selection, or the probability of a disease for people with a given genotype.  

Even in our most vehement disagreements about such things, we don't argue that the explanatory elusiveness is due to some unknown, as yet undiscovered factor or principle.  This is very strange, given that all the evidence shows the near inevitability of some transformative discovery at some point in the future.  Why don't we invoke Factor X (biological 'dark matter') to make our point?

The obvious answer is that we would not be constrained in what kind of Factor X we might invent, and we could always be able to invent one that would perfectly satisfy our point of view in a dispute about the facts.  Science is supposed to be the quest to understand the true causal nature of Nature, staying within things that are material and testable.  So, while we know that Factor X's are likely to be discovered, we're generally not allowed to invoke them, even in principle, although some examples that come close to that do exist, such as fudge factors like Planck's constant in physics (or string theory in general?).

In biology, genes were invoked before anybody had the foggiest idea what they were, and some viewed them as made-up metaphysical notions.  But that notion was specific and led to experiments that, in fact, showed that genes exist.  We're allowed to include factors such as statistical 'noise', which quantify deviations between our theory and the actual data, but these assume something like random irrelevancies, such as measurement errors.  Statistical noise thus isn't a fudge factor invoked as anything real, though in a way it is an open-ended excuse for clinging to an explanation that doesn't quite fit the data.

The likelihood that fundamental factors or principles do exist but that, like some items of clothing, are unmentionable, puts us in a bind--like trying to track a speeding neutrino that is traveling faster than the light we have to use to track it.

This prohibition is at the core of science, and is necessary, but it's very weird!

Friday, November 18, 2011

Triumph of the Darwinian Method, further thoughts

We recently posted on Michael Ghiselin's book The Triumph of the Darwinian Method because, although it is a decades-old book, its ideas are cogent, still timely, thought-provoking, and well expressed.  This is a follow-up.  We don't want to do a full review of a book now more than four decades old, one we stumbled upon when one of us (Ken) was clearing out a storage space so it could be demolished as part of a renovation of our building to accommodate a new faculty member.  But browsing the book led us to reflect on a few issues, because while there has been a ton of evolutionary history and philosophy written since then (and before), this one's still worth reading.

The title is about method, and much of his discussion is of the way that so much in biology can be analyzed by considering the accumulation of traits over evolutionary time, and the divergence of organisms (and the patterns that make them) from common ancestry. This is certainly correct.  But Ghiselin seems to go further, to suggest that Darwin's greatest contribution was natural selection, and he discusses in what seems to be a rather inconsistent, though interesting, way what we mean (or Darwin meant) by 'laws of nature' and whether or how such things actually apply.  Inconsistent, in that there are different ways to define what a 'law' is, and whether we in science are actually seeking to identify such things.  Ghiselin takes what seems to be a fully pragmatic 'engineering' approach to science.  He didn't seem to aspire to understanding the real laws of nature, but to be satisfied that any time in history can only have its best-current models of things, which frame research designs and interpretation.  Efforts at deeper exploration, he seemed to suggest, are rather metaphysical.

One can differ about whether this modest and pragmatic goal is the important or indeed perhaps the only possible goal, as opposed to the conceit of hoping to understand the true nature of Nature.  But his book provides a very sophisticated view of the nature of the method of thought that Darwin introduced.  Published in 1969, more than 40 years ago, this was before we knew nearly as much as we do now about genes and genetics and the evolution of genomes.  But the fact that the arguments are still cogent shows that despite a lot of new knowledge, the conceptual framework has not required basic change.  That in itself is a profound thing to understand, regardless of how much one might quibble with--or complain that most people are not quibbling with--in genetics and evolutionary biology today.

Darwin is credited with using the hypothetico-deductive (HD) method, which is correct as far as it goes, since much of biology had been rooted in other kinds of metaphysical worldviews.  Creationism, in that context, has no 'method' because it claims perfect truth (so it also isn't 'science').  But Darwin didn't invent the HD method.  It was being developed in physics, chemistry, geology, and even in some areas of the biomedical sciences, and others in geology and even in biology were using similar methods at the time, even if the term itself wasn't invoked.

So Darwin was a product of his time in that sense.  What made him so influential was that life had been considered, in a way, as being somewhat outside of that  kind of analysis, and he showed how to bring it under the same tent.  In particular, of course, he made it clear to most people in science that humans were part of Nature: our exceptionalism as claimed by religious doctrine was simply incorrect.

The 'Darwinian' method is powerful, but the history of its use also has a more questionable side.  And in that, Mendel and the founders of the 'modern synthesis' are inadvertent co-conspirators.  We have been given such strong conceptual guidelines by the triumphant legacy of evolutionary thinking, and are so rooted in viewing life in highly deterministic, selectionist, and mendelian terms, that we have ignored one of the most important aspects of really innovative science.  We intensely study what we think we can see, implicitly and often even explicitly, through these one-explanation lenses.  But that means we aren't paying nearly enough attention to what's 180 degrees different from that: what we don't know, or perhaps cannot explain, through these particular lenses.

Nothing we know of is inconsistent with evolution as the historical process by which life has coursed through its history.  But clearly not everything is due to natural selection, and genes are clearly not as deterministic as is usually assumed without question.  Instead of continuing to do more of the same, and declaring victory in the battle when there have really been little more than some small skirmish victories, we should somehow be encouraging bright younger scientists to think about other aspects of biological causation.  It's been a common theme here at MT.

What kinds of 'laws' apply to biology is a profound question, in particular given the finite nature of populations and the role of probability in the action of genes, and the survival and reproduction of individuals, in a probabilistically changing environment.  Most theoretical treatments sweep much of this under the rug (if not entirely away) in order to get a feeling of mathematical rigor in evolutionary explanations.  But we know that this is imprecise at best.  It seems likely that the 'Darwinian method' of studying biological function as a product of history (including the organs and cells in a body as a product of its individual life-history) is correct.  But that's not enough.

However, don't look here for answers.  Look to students--and do your best to confuse them with enough uncertainty and nervousness about accepted answers, that they aren't afraid to ask:  "What if it isn't true?"**

**But here we are not suggesting that there is even a scrap of actual evidence that the basic idea of evolution and life as a historical, material process isn't generally correct.

Thursday, November 17, 2011

Old News is Interesting News

We continue to comment on things related to paleontology.  This is not our field, and we've commented mainly on what we believe are premature, if not half-baked, speculations about what evolved 'for' what in the early history of our species, around 2 million years ago.  But there is a more (paleontologically) recent story that is of interest and is closer, at least, to something we know something about.

For decades there have been debates about whether or in what sense Neandertals, a kind of fossil relative of ours that existed from about 400,000 to about 40,000 years ago in Europe, and similar creatures elsewhere, co-existed with our immediate ancestors.  Neandertal-looking fossils disappeared 40,000 years ago, but more modern looking fossils seemed to have become common about that time.  The question has been whether these were different species, where and for how long they might have been cohabiting contemporaries in Europe, and whether there was any mating between them.

Not long ago a relatively complete set of Neandertal DNA sequence data was reported, which confirmed that overall Neandertals seem to have a deeper but somewhat separate common ancestry relative to us modern humans.  But bits of the Neandertal sequence seemed too similar to ours to be part of that picture.  That suggested that there was, in fact, some admixture.  In turn, that means that Neandertals and our 'modern'-looking ancestors weren't separate species in the clear-cut classical sense.  But the evidence is not overwhelmingly strong, and it's statistical in the sense that common ancestry times and degrees of relationship among DNA sequences have to be judged in relation to probabilistic models of deep population history.

How that apparent admixture happened is a matter of demography--population size, location, contact and mating patterns.  If there were doubts about the fact of contemporaneity of the two groups, then clearly they could not have mated, and the genetic evidence would need some re-thinking.

Now there are reports of a human-ancestral fossil from England that has been recently dated to about 40,000 years, the first or oldest known modern-looking fossil in that part of Europe.  This is touted in the media, as usual, because all such stories are touted as greatly as the investigators and their co-conspiring journals can manage.  Still, if it is a 'modern' ancestor--the fragmentary nature of the bits of tooth and bone leaves that still open to discussion--it is evidence of the potentially long (thousands of years) co-habitation of modern ancestors and doomed Neandertals in Europe.

If the evidence holds up, the evolutionary dynamic question is what extent and kinds of contact and interaction would be expected to have taken place, and whether this is consistent with the sequence data as we described above.

The story, like all such stories, gets a lot of press ink, but one thing is important to think about before we get too excited.  There are already substantial differences among modern humans from South America to Africa, peoples who have been separated for around 100,000 years.  But they are clearly not separate species.  When people from one location travel to another, as Europeans and Africans and Asians have come to North America for example, they naturally mate.  There is admixture--but not species admixture.  So, first of all, Neandertal-premodern mating would not be a shocking surprise.

Secondly, the definition of 'species' is well-known to be problematic.  We know rather clearly from the Neandertal sequence data already available that the Neandertals were about as close to us, relative to our mutual common ancestry with our closest relatives the chimps, as we would expect from the fossil dates.  So, if Neandertal sequences were from 40,000 years ago, and Neandertals split from our own lineage 400,000 years ago, but we had about 8 million years of overall common ancestry, then 400,000 years ago, when we and the Neandertals were one species, we were already 19/20ths identical to each other compared to chimps (because 400,000 is 1/20 of the way from 8,000,000 years ago to today).  We were all essentially 'human' by then.
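A quick sanity check of that fraction is easy to do.  This is just a back-of-the-envelope sketch, assuming the round figures used above (a 400,000-year Neandertal split and roughly 8 million years of common ancestry with chimps); both numbers are approximations, not settled values:

```python
# Fraction of our chimp-relative shared history that Neandertals share with us.
total_ancestry = 8_000_000   # years of overall common ancestry with chimpanzees (assumed round figure)
split_time = 400_000         # years since the Neandertal/modern-human split (assumed round figure)

separate_fraction = split_time / total_ancestry   # time spent on separate lineages
shared_fraction = 1 - separate_fraction           # time spent as one lineage

print(separate_fraction)  # 0.05, i.e. 1/20
print(shared_fraction)    # 0.95, i.e. 19/20
```

In other words, on these assumptions the two lineages were separate for only about 5% of the relevant history, which is the basis for saying we were already 19/20ths of the way to being identical, relative to chimps, by the time of the split.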

Thus even by the standards of the most species-separating, melodramatizing advocate of 'admixture', the Neandertals and humans were essentially already the same species in terms of their overall DNA, whether or not we 'could', or just didn't, inter-mix.

Of course, if Neandertals and our various ancestors never lived near each other, then no sort of direct 'admixture' between the two could have taken place.  But the DNA suggests something like that did occur, and the existing fossil dates, reinforced by the dating of the new British find (no tea was found at the site, by the way), show that at least the opportunity for mating was there.  Opportunity may be nine points in the law, and the DNA suggests at least some admixture.

Still, thinking about it, while it's very newsy, there is nothing particularly scientifically important about the question from a genetic point of view.  Yes, there could be traits found in Neandertals but not in our lineage.  Bigger brow-ridges may be an example.  But we vary among ourselves in important traits (like disease susceptibility), and clearly in many physical traits as our 'racial' variation shows.  This may have a lot to do with genetics, case by case, but it has very little to do with species or admixture.

For these reasons, despite the genocentric focus of all the news, the really interesting questions are not  genetic ones.  They are cultural ones.  What were the Neandertal and pre-modern populations like in size, behavior, environmental preferences, language, technologies, and so on?  What were their mating patterns?  When they saw each other, what kinds of relationships did the two groups have?  If there was intermating, were there recognized kinship affiliations among the adjacent villages of the two groups?  Did they share religious, body-decorating, food preparation and preference, and so on patterns?

It is these things, not genes, that could have been far more important in determining what happened.  It is true that genetic differences, even in something like body smell, could have inhibited mating.  But why do so many go so far out of their way to suggest such things, rather than trying to find cultural explanations?

In part this is because genes mean grants, the news media eat it up, and they're the causes-du-jour of our time.  It is possible, certainly, that there were genetic barriers (even our favorite self-promoting criterion of intelligence). But then to suggest that and yet see some intermating evidence, leads to forced explanations (such as, essentially, rape of Neandertals by aggressive, superior, violent 'moderns').  That, too, is catchy and may attract media as well as professional attention.  But so far, it's not supported by any actual evidence beyond the imagination.

Again, right under our noses is the most fascinating question of all: how one human-like species completely displaced or replaced another.  The Neandertals disappeared, if they were indeed more than a 'racial' group at the time.  But before that, even more surprising if current views are close to correct, about 100,000 or fewer years ago, premoderns spread out of a spot somewhere near east Africa, and the long-established Homo erectus creatures disappeared.

It's difficult to imagine how these things happened, other than by accepting superficial, nearly evidence-free speculations.  But in any case, those questions seem more interesting, and are certainly more challenging, than always focusing on genes.

Wednesday, November 16, 2011

Who is this magical "third person" doing the science?

Writing in the third person and passive voice is traditionally the preferred style for publishing in many scientific journals.

But you’ll often come across a research article that’s written in the active, first person. I hadn’t seen one in a long time until yesterday and at the drop of the first “I,” alarms went off. After I calmed down I wondered, What’s so terrible about writing I and my?

I’ve decided that the answer is nothing.  It’s fine.

In fact, once you become accustomed to reading studies like this one, written in a much more modern, active tone, articles written in the conventional way sound borderline ridiculous. That is, strict adherence to the passive voice, and the strange use of "we" by a sole author...these practices can read as pretty ridiculous.   

People complain about how awkward scientific writing is, so that's one count against it. But beyond inconvenience there's another problem with the conventions of scientific writing: their dehumanization of science. 

This third person passive "rule" was supposedly sparked by Francis Bacon to inject objectivity into scientific writing. But does it really do that? I’m inclined to think it's also likely to obscure objectivity by hiding weak or shoddy research in something larger, in something less accountable.

By sticking to the first person active perspective, you’re reminding yourself along with everyone who reads your study that a human with limits and biases performed it. This is fair, open, forthcoming, and can be very honest and humbling (depending on the author)… all these are beloved virtues of science and scientists. 

But transform an active, first person article into one with only a passive voice and the science reads like it's above and beyond mortal human business... as if an external force guided the author to do the research or that she's merely the recorder of a supernatural science project that was magically conjured in her laboratory.  
"The experiment performed itself!"
By dropping the third person passive voice, scientists can avoid giving the impression that their work transcends earthly constraints and that it is greater than what a human (or a mere first person) is capable of.

Scientists are science-doers, not science-whisperers and they should be able to report their work as objectively as possible, as close to reality as possible, not according to this or that grammatical preference.

I'm not sure very many MT readers (including myself, tomorrow) are going to agree with this post, but this is something I'm thinking about today.   

Tuesday, November 15, 2011

More unfilled promise from the sequencers

We are finally turning our attention to a piece that appeared in the Oct 19 issue of Nature, in which Stephen Baker describes the public health consequences of the sequencing, 10 years ago, of Salmonella enterica Typhi, the micro-organism that causes typhoid fever. Unfortunately, those consequences can be summed up in a single word: none. 

Baker is head of enteric infections at the Wellcome Trust Major Overseas Programme, Oxford University Clinical Research Unit, and coordinator of field and laboratory research in Ho Chi Minh City, Vietnam, and Kathmandu, Nepal.  He writes of the excitement and hope people felt when the sequence was complete -- and the subsequent disillusionment.  
Ten years ago, I was an author on the paper that announced the genome sequence of Salmonella enterica Typhi, the microorganism that causes typhoid fever (J. Parkhill et al. Nature 413, 848-852; 2001). The research was promoted with great fanfare, which declared that scientists were at a turning point in the fight against the disease. A decade on, we are no closer to a global solution.
Yes, more is understood about how the organism does its work, and why it only infects humans and so on, but as Baker says, "the promised concrete benefits — bespoke treatments, next-generation vaccines and low-cost diagnostics — have failed to materialize."

Why?  Because typhoid fever is as much a disease of poverty as it is of a particular organism.  And as such it is one of the neglected diseases that afflicts poor nations that don't have the will or the resources to commit to fighting disease.  Nor has it gotten the attention of foundations or corporations in the rich world that do have the resources to attack such problems, but that are focusing on diseases like malaria or tuberculosis or HIV, diseases that affect more people, or people in rich countries (though typhoid is estimated to affect 21.5 million people worldwide every year, so it is not a trivial problem).

Typhoid has been pretty much eliminated from the industrialized world -- it is doable, with attention to public health measures such as clean water and food and so on.  But that requires money and the will to address the problem.  Diseases of poverty are too often less scientific challenges than they are economic or political ones. 

And, as Baker says,
There are no advocacy groups for typhoid and other diseases of poverty similar to those that exist for HIV. Affected people and communities are not powerful constituencies. Decision-makers in endemic countries are typically drawn from the wealthier classes, and few have had typhoid fever, or have known someone who has died from the disease.
The hype and promise over the sequencing of Salmonella enterica Typhi mirrored the hype and promise over sequencing of anything a decade ago -- indeed, at its worst, the idea of immortality was seriously bandied about.  This story is a sober reminder that genetics has yielded results, but that those results are too often irrelevant to preventing disease.  This is as true for diseases of industrialized nations, heart disease, diabetes, cancers and so forth, which sequencing was supposed to eliminate, as it is for diseases of poverty.

Part of our belief system is that the world is so law-like that science and technology can understand everything, and fix everything.  This is still the legacy of the centuries-old Enlightenment, and its exciting idea that we control nature through 'knowledge'.  But that is at best a faint hope, and is misleading in many ways.  Science and technology are, in a sense, 'safe': you can always apply more tech and get at least some sorts of 'results'.  It is easy to build institutions like universities and Pharma around this belief system.  Unfortunately, with more complex things like the sociocultural environment, the problems are simply different, and far less yielding.

Friday, November 11, 2011

The light shines

Just 2 days ago we posted a lament about our university, Penn State.  We've been beset by a scandal of large proportions, in which brand-protection by safe administrators allowed unsafe juveniles to be branded for life by experiences inflicted on them by a very prominent member of our football coaching staff, apparently over many years.

The famous, even revered head coach Joe Paterno, and our President (along with some others) were summarily fired by our Board of Trustees, when they learned of this.  Then, protecting the brand, and the iconic JoePa, some of our dregs (every university has dregs) rowdied in the streets, adding a new ugliness.

But Penn Staters have long been known for high levels of social activism, and the top-enders here recognized the need to go beyond football and coaches when it comes to something much more important, in this case, pedophilia.

Our best are already experienced and organized for doing good things, and in almost no time plans were made for ways to honor the victims, and remember this is not about football.  They organized a candle-light vigil in respect for and for protection of abused children.  Thousands of blue t-shirts were made, using the color already designated for such anti-abuse activities, and thousands--many thousands--of our students assembled on the lawn in front of the President's office.

However, hundreds of thousands of the Penn State family don't live here.  Do they care?  Take a look at this immediate response on the web and even more poignantly, look at this on YouTube.

And then a pep rally for the game was cancelled and the candle-light vigil drew an estimated 10,000 (of our best) students:

So, while the media do their job and find the many more shoes that are clearly likely to drop, revealing how much deeper the cover-up by the University and Second Mile went, a Phoenix of good may come of this.   We've lost our legendary coach and our public-face President, which seemed like a heavy blow, but by the time the stories are all out, perhaps nobody will be left to regret that.  Now Penn State needs to reform, regroup, and reinvigorate ourselves as a serious university that happens to have a football team.  What a good example and pace we could set, if the moment is taken at its current flood!

Maybe we could start by not admitting any more goons, and reduce our class size so we could give the deserved attention to the students who actually want to be here to learn something.

Again, does science education matter?

Rachel Carson's Silent Spring, a book about the environmental consequences of widespread pesticide use, was published in the fall of 1962 -- 49 years ago. The book kept coming up, in talks I was hearing, in conversation, but I had never read it. I've just done so -- the chemical names may have changed (e.g., largely as a result of this book, DDT is no longer in widespread use in this country), but the issues and the fact of polarized corporate interest vs environmental concern remain the same.  In many ways, this book could have been written today.

Book-of-the-Month-Club selection
Silent Spring is about the consequences of widespread use of pesticides, yes, but it's more than that.  Carson describes the effects of DDT, the insecticide most people think of when they think of this book, but also of dieldrin, parathion, heptachlor, malathion and other compounds much stronger than DDT, on all forms of life, particularly when they are used with the kind of abandon they were being used with at the time. 

Carson documents the immediate effects of these chemicals, as well as their effects once they get into the water and the food chain.  She discusses the problem of growing pesticide resistance.  And I think significantly, she includes examples of how to deal with pest infestations either without pesticides at all (Dutch elm disease can be controlled with good sanitation and the elimination of infected limbs, e.g.) or by limited, targeted application.   She offers a view of life on earth as interconnected, and was the first to give widespread attention to the downstream consequences of disturbing an ecosystem.

The book was the subject of intense vilification by the chemical industry at the time of its publication.  Indeed, the author herself was vilified as well at the time -- and continues to be to this day.

As Peter Matthiessen wrote in an essay in Time magazine in 1999:
Silent Spring, serialized in the New Yorker in June 1962, gored corporate oxen all over the country. Even before publication, Carson was violently assailed by threats of lawsuits and derision, including suggestions that this meticulous scientist was a "hysterical woman" unqualified to write such a book. A huge counterattack was organized and led by Monsanto, Velsicol, American Cyanamid--indeed, the whole chemical industry--duly supported by the Agriculture Department as well as the more cautious in the media.
Matthiessen further writes that eventually the chemical companies realized that their response to the book, rather than discrediting Carson, was only bringing attention to the issues, so they stopped.  Indeed, Silent Spring became a runaway bestseller, and launched the environmental movement.

Norman Borlaug won the 1970 Nobel Peace Prize for his work on increasing wheat yield in Mexico, as well as horticultural techniques that eventually led to the Green Revolution.  He is credited with having saved 1 billion lives from starvation.  But the Green Revolution relied on widespread use of petroleum-based fertilizers and pesticides, and increased the practice of monocropping around the globe.  Thus, it has been criticized as unsustainable, and in fact as the cause of decreased food security in much of the world.  Borlaug himself was highly critical of Silent Spring, calling environmentalists hysterical, fear-provoking, and irresponsible, their movement having had its genesis in the 'half-science half-fiction novel' Silent Spring.  Borlaug is a hero of the American right, and Carson the devil incarnate, much of this because of a campaign by right-wing radio host Rush Limbaugh.  Al Gore, who also won the Nobel Peace Prize, on the other hand, credits Silent Spring with raising his awareness of environmental issues. 

Paul Krugman in the New York Times wrote a recent column about solar power, mentioning along the way the currently contentious issue of hydraulic fracturing (fracking), as newly adapted to the release of the methane gas now sequestered in the Marcellus Shale that extends across New York, Pennsylvania and Ohio. The industry insists that its methods are environmentally safe.  If you don't believe that, you're surely unAmerican.  As Krugman writes:
...the industry-backed organization declares that “there are only two sides in the debate: those who want our oil and natural resources developed in a safe and responsible way; and those who don’t want our oil and natural gas resources developed at all."
But if Silent Spring launched the environmental movement, and the fact of the interconnectedness of all of life is now almost a cliché, why is the book still so timely?  And what's with the widespread defense of DDT from the right wing?  Why is the issue of GM crops still so polarizing, with the likes of chemical companies such as Monsanto still on one side and environmentalists on the other, with little to no common ground in between?  Why the polarized debate over consequences of fracking?  Indeed, why is it that if you know someone's position on these kinds of questions, or on Rachel Carson herself, you can predict their politics?  Why isn't science resolving these issues?

Wednesday, November 9, 2011

What is Penn State?

Dear MT readers:
We have both been here at Penn State University for 26 years.  When we got here, Joe Paterno had already become legendary, having been our football coach for nearly 20 years.  He had long and publicly stood for doing the right thing: students who do not cheat, who honor themselves and the university, and who graduate, even if not all in high-pressure majors.  In many, many ways that ideal is just what he achieved, and with few precedents anywhere in the United States.

Legions of Penn State students (and faculty) have acted with pride and dignity, and for things that were right, and just. Our graduates were recently shown to be among the most active in service to their communities, and the like.

But it's well known, and sad, that another major part of our reputation, and what draws too many students here, are sports well out of control, and a culture of riotous drunken behavior. 

That is one of a number of issues that Penn State did not confront.  It has been an insular university, where 'loyalty' means promoting from within and, unfortunately, circling the wagons when it comes to controversy.  Loyalty to our image--our 'brand'--has taken far too much precedence when it comes to dealing with things like the drinking party culture, the excessive stress on sports and similarly superficial things, and the scandal that is now engulfing us.  Pleasing undergraduate students, and maintaining a fine Nittany Nation image, have taken precedence over academic rigor.

This is not just about hypocritical judgments in hindsight on our part.  Probably there are more facts to emerge, and it seems unlikely that they will be in any way exculpatory.  Those in charge went through a few of the motions, but did not press to see that reports of sexual abuse were followed up.  More people probably knew more, even long ago, about the sex abuse than has come to light so far.  Who knows--perhaps innocently, conveniently, administrators just hoped it would go away.  Perhaps no one wanted, or dared, to be the one who brought negative light on 'dear old State'.

Our President was always someone for whom it seemed that appearance and spin consistently took precedence over substance.  He was very good for us in many ways, raised a lot of money, and built many buildings.  But one can see how his over-riding need for smiling before cameras is what led in many ways to where we are now.  Had he been a person more about substance than appearance, even a whiff of child abuse would have, or should have, led to action.  Instead, a mentality of covering up rather than taking stands on difficult issues, of which there are many examples, allowed things to build to a point where they could no longer be contained.  We are, properly, paying the price for that today.

On the other hand, we can assure any MT readers who may not know it that there are many, many very fine, good, intelligent, delightful, thoughtful and decent people here--among our students, our faculty, and our staff.  We hope this will not be overlooked.

We are embarrassed, but more importantly, we are saddened for the boys who were abused by a person of power in the storied Penn State football program.  The abuses went on for so long, many of them here on campus, that many of these boys are now men.  The fact of their pain should not be forgotten.

But we will move onward.  Hopefully, a new administration will pay more attention to the real issues facing universities, ours as well as others, and less to football, partying, and photo-ops. 

It is a moment, and we hope we'll seize it!

We owe that to our many, many very fine, good, intelligent, delightful, thoughtful and decent people here--among our students, our faculty, and our staff.

Dead certain?

Here's an astonishing bit of major discovery, and one might say of assertion of certainty:  investigators have found that 'near death' experiences are 'all in the mind.'   The evidence shows, the investigators say, that normal brain activities--essentially, the imagination--can construct the imagery that people who recover from what they sometimes claim are near-death states report having seen.

The investigators seem very sure of themselves, but how can they be?  One can do all the brain testing one wants, but that simply cannot answer the question.

The real question is whether there even is a question here.  The evidence for claims of life after death has always been largely imaginary.  From claims that a Pope performed miracles, to sacred images performing miracles, to claims of answered prayers, there really is no serious evidence--from a scientific point of view.  From that viewpoint, such claims are beliefs rather than evidence.

One can put it another way: science is about things that are part of the material world as we know it.  Evolution may have taken place in the past, where we can't observe it directly, but we do have wholly consistent material evidence supporting ideas about the past and with predictive power about what we will find in future evidence in terms of general pattern and so on.  We can't see molecules in the usual sense, but instrumentation allows us to study them.  Though our usual instrumentation cannot detect it directly, even dark matter can be studied indirectly, through normal materially-derived means.  That is what 'counts' for science.

We can't see souls, angels, or God's will.  In a sense, if we could, they would not be souls or God, as defined in the western conception of a world of eternal immaterial spirits.  The afterlife is basically by definition beyond the reach (or purview?) of science.  But if claims that the spiritual universe affects our real one were true, they would necessarily involve a material intermediary.  A miraculous event has to occur in our physical world.  So, obviously, near-death visions have to occur in the brain, and could in principle be studied in terms of the brain activity observed when a subject reports talking with Jesus.  From a scientific point of view, one is likely to deny the literal truth of such reports, but observing brain activity is in no way whatever relevant to that truth.  It is compatible with the brain, in some way we don't understand, having the ability to receive and respond to 'signals' of some sort from the immaterial world.  It's just that we have no evidence supporting such a speculation that bears scientific scrutiny.

Scientists may be--indeed may be overwhelmingly likely to be--totally right to think that such reports are due to internally structured imagination rather than externally or immaterially caused.  But to claim that they know this from material evidence is a presumptive claim because the kind of evidence cited is simply irrelevant to the inference being drawn.

Tuesday, November 8, 2011

The Bug Eaters

A week or so ago we provided a much-needed explanation for the evolutionary morphology of the recently reported roughly 2 million year old human ancestral fossil, named Australopithecus erotimanis.  There, the key fact to explain was the human hand.  How did it evolve?  We noted that the explanation of the hand as an adaptation for tool use was not undermined by the absence of tools at the site.  Our alternative explanation is too risque to repeat here, but had a comparable amount of overwhelming evidence from the site.  For details, you'll have to read the post and its comments, and a follow-up a couple of posts later.

The point of that post was to note how free-lance speculation about what a trait evolved 'for' is all too rampant in evolutionary biology, and taken too seriously, often without support, and even when it is very easy to develop (conjure up?) comparably plausible alternative stories.

Now, recently in the journal PNAS, a similar kind of story has appeared, concerning the evolutionary consequences of cooking and otherwise preprocessing food.  Well, they say it better than we can, so here's the abstract to the paper:
Unique among animals, humans eat a diet rich in cooked and nonthermally processed food. The ancestors of modern humans who invented food processing (including cooking) gained critical advantages in survival and fitness through increased caloric intake. However, the time and manner in which food processing became biologically significant are uncertain. Here, we assess the inferred evolutionary consequences of food processing in the human lineage by applying a Bayesian phylogenetic outlier test to a comparative dataset of feeding time in humans and nonhuman primates. We find that modern humans spend an order of magnitude less time feeding than predicted by phylogeny and body mass (4.7% vs. predicted 48% of daily activity). This result suggests that a substantial evolutionary rate change in feeding time occurred along the human branch after the human–chimpanzee split. Along this same branch, Homo erectus shows a marked reduction in molar size that is followed by a gradual, although erratic, decline in H. sapiens. We show that reduction in molar size in early Homo (H. habilis and H. rudolfensis) is explicable by phylogeny and body size alone. By contrast, the change in molar size to H. erectus, H. neanderthalensis, and H. sapiens cannot be explained by the rate of craniodental and body size evolution. Together, our results indicate that the behaviorally driven adaptations of food processing (reduced feeding time and molar size) originated after the evolution of Homo but before or concurrent with the evolution of H. erectus, which was around 1.9 Mya.
Apparently we began keeping the home fires burning way back at the beginning.

Campfire, Wikimedia Commons
Now it is true that no Agas have been found at any Australopithecine camp-sites, which were established even before Sterno and Coleman camp-stoves were developed.  Indeed, little if any serious evidence of the controlled use of fire of any kind has been found that far back.  Still, the suggested scenario is that reduced molar (back tooth) size and presumed decreased time spent feeding combine to show that our ancestors must have been cooking, because to get as much energy as was needed they had to chow down a lot of high-octane food, such as meat which cooking makes more digestible.  Yet they took (and we their descendants take) precious little time doing so, assuming as the authors did that modern hunter-gatherer patterns are a good model for 2 million years ago.

If this is an adaptive explanation, that is, based on natural selection, for these findings, it implies that some individuals in the Australopithecine group caught and gobbled enough meat that they had a systematic reproductive fitness advantage because of their diet. We say 'some' individuals because if we had said the whole group we'd have crossed the line into the Never-Never Land of group selection.  So the scenario had to be that sometime in pre-history George and Georgina Goomp, lonely and childless, gazed enviously at Don and Donna Drmilg, surrounded by their bevy of plump-bellied children, chewing dreamily on medium-well WildeBurgers. 

Such stories conjure up the usual robust idea of reliance on meat, which presumably they could get because of their evolving mental powers, even though it is not clear that they had the wherewithal not just to hunt and scratch matches, but to hunt enough, regularly enough, to nourish on tasty viands often enough to have had such a reproductive advantage. Again, as with our previous commentary on the erotimanii, the argument seems to have been bolstered (somehow) by the absence of actual evidence for cooking at the time. The glamorous Man the Hunter is a hard canard to give up.  Never mind the details!

Now we're having a bit of sport here, because this is a blog, not a journal.  There is no doubt that humans do differ from other primates, and our dentition, body size, and other traits are clearly part of that.  The authors of the PNAS paper did a sophisticated job of analyzing the statistical strength and consistency of patterns of species relationships among us and our primate relatives, together with patterns of morphology and feeding times.  The data, methods, and analysis are clear.  But whether our morphology inherently had to do with feeding duration, whether even if it did it was the result of natural selection, and whether that selection had anything to do with cooking, are all less than obvious.  And that is where speculation, or even pure story-telling, begins.
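
For readers curious about the basic logic of an 'outlier test' of this general kind, here is a deliberately simplified sketch.  It is emphatically not the paper's Bayesian phylogenetic method: it ignores phylogenetic covariance entirely, and the nonhuman data points below are invented for illustration.  All it shows is the underlying idea, namely fitting the nonhuman primate trend of feeding time against body mass, then asking how far the human value (4.7% of daily activity, the figure quoted in the abstract) falls from that trend.

```python
import numpy as np

# Hypothetical (invented) nonhuman primate data:
# columns are (body mass in kg, % of daily activity spent feeding)
nonhuman = np.array([
    [3.0, 30.0], [7.0, 35.0], [15.0, 40.0],
    [30.0, 45.0], [45.0, 48.0], [80.0, 55.0],
])
log_mass = np.log(nonhuman[:, 0])
log_feed = np.log(nonhuman[:, 1])

# Ordinary least squares fit in log-log space (a crude stand-in for the
# paper's phylogenetically informed regression)
slope, intercept = np.polyfit(log_mass, log_feed, 1)
resid = log_feed - (slope * log_mass + intercept)
resid_sd = resid.std(ddof=2)  # two fitted parameters

# Humans: roughly 60 kg body mass, 4.7% of daily activity feeding
human_resid = np.log(4.7) - (slope * np.log(60.0) + intercept)
z = human_resid / resid_sd
print(f"human z-score relative to the nonhuman trend: {z:.1f}")
```

With any plausible numbers, the human point sits an enormous distance below the nonhuman trend, which is the sense in which humans are a statistical 'outlier' for feeding time.  Note that none of this says anything about *why*; that is exactly where the paper's inference (and the story-telling) comes in.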

Part of the evidence is the presence of small molar (back, grinding) teeth relative to the metabolic demands of a brain thinking away at fever pitch.  But it is not enough to say that the Australopithecines had small teeth because they did not need big teeth to mash up tough plant material.  There had to be an advantage to small molars.  But what could that have been?

The subtle underlying idea is that small teeth would not have sufficed for the slow grinding of tough, low-energy plant foods; the energy-hungry, hunting-capable, fire-discovering brain demanded richer fare.  But the Australopithecines' brains weren't all that different from the brains of the more or less contemporary Hominids, who had big teeth.  They did, it's true, eventually become extinct while we didn't.  The common evolutionary theory is that everything must have got here by natural selection, and something not used will eventually mutate away, because there would be no selective pressure to maintain the metabolic cost of keeping it.  Thus, as the frying pan heated up, the teeth became smaller.  That's the micromanagement view of natural selection, and it may seem sensible on the face of it.  We're not partial to that view, but even on its own terms there are problems.

Relaxation of selection would normally be expected to yield mutantly decrepit or lost structures, not the smaller but otherwise normal, fully functional, well-formed teeth that the Australopithecines had (and we have).  The strict selectionist idea is that maintaining such structures costs the individual, and if it doesn't need them, there's no selection pressure to keep them around, and mutations in tooth-building genes will eventually destroy them.  There's not much evidence for mutational chaos in hominin molars as they evolved smallness, and in fact there are developmental genetic reasons why the dentition can become smaller without being degraded.  Be that as it may, why wouldn't it be worth the trivial metabolic cost of making big molars, for the rainy day when you couldn't make a fire or the hunt was a bust?  At least you could still munch on tubers.  Once developed, teeth make very few energetic demands on the individual.  That fall-back ability might, in fact, seem like an advantage!
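
The 'use it or lose it' expectation of the strict selectionist view can be made concrete with a toy simulation.  All the parameters here (population size, mutation rate, number of generations) are invented, and real tooth genetics is of course nothing like this simple: but the sketch shows why, with selection fully relaxed, loss-of-function mutations should drift to high frequency and the structure should degrade, which is precisely the mutational chaos that hominin molars do not show.

```python
import random

# Toy sketch of relaxed selection: each individual carries one gene copy
# that is either functional (True) or broken (False).  Reproduction is
# neutral (no fitness difference), but functional copies can mutate to
# broken ones.  Every number here is invented for illustration.
random.seed(1)
N, MU, GENERATIONS = 1000, 1e-3, 5000

pop = [True] * N  # start with every copy functional
for _ in range(GENERATIONS):
    # Each offspring copies a random parent's allele; with probability MU
    # a functional copy is hit by a loss-of-function mutation.
    pop = [random.choice(pop) and (random.random() > MU) for _ in range(N)]

functional = sum(pop) / N
print(f"functional fraction after {GENERATIONS} generations: {functional:.2f}")
```

Under these (made-up) settings the functional fraction collapses toward zero, because nothing opposes the one-way traffic of broken copies.  Since that is not what small hominin molars look like, the simple relaxed-selection story has some explaining to do.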

Given these kinds of problems one can ask--one always should ask--whether some alternative hypothesis might provide as plausible an evolutionary explanation, and of course MT is right here, as always, with the answer:

Our ancestors evolved by eating insects!

We confess we've never been to Africa, but it is likely that even 2 million years ago arthropods were everywhere, all year long.  There are countless types of catchable insects for a varied diet.  They're a mighty source of protein, have a nice snappy crunch to them, and if you don't get grossed out by wing bits between your teeth, they can satisfy the hungriest appetite.   And, remarkably, while they are delicious fried and chocolate-covered, you don't have to cook them!

Our argument here is strongly bolstered, as in the case of our earlier explanation of the Australopithecine hand, by the complete lack of actual evidence for insectivory, which by precedent with other evolutionary stories seems to lock down our scenario tight as a moth's cocoon.  It's not as romantic as heroically bringing down a giraffe for dinner, and since we in the west forbear the eating of these hexapods and octopods (much less centipedes!), it's easy simply to assume that our forebears didn't either.  But there is little if any substance to such prejudicial judgments. And we haven't even mentioned nice, meaty worms!

Circumstantial evidence can certainly be used to formulate hypotheses in science, and there are precedents, such as the theoretical prediction of the existence of Neptune from oddities in the orbit of Uranus; one could argue that the kinds of Platonic shadow-evidence we have in the fossil record are analogous.  Fair enough in principle, and the cooking scenario is certainly a plausible one.  But it's not the only one, and we think it's safe to say that the theoretical basis from which to extrapolate from circumstantial evidence in evolutionary anthropology is not quite as rigorous as physics.

Again, the point here is not when our ancestors began to cook their bacon, which certainly happened long ago, but the widespread lack of restraint, not only in cooking up simple Darwinian explanations for complex traits, but also in marketing them with blaring trumpets to the hungry media, who are always ready to splash a simple story to the eager public.  This kind of evolutionary story-making is not new to our field by any means.  Perhaps it's typical.  It's a (legal) way to have fun and make a living.  But it shouldn't be portrayed as science.

But on this note, it's time for another episode of....


The Misadventures of P'Qeeb:  "Me smell bacon!"

Sniff.  Sniff.  P'Qeeb's ready nose (see earlier related comments here) caught his attention.
"Me smell bacon!"  he said.
"Can't be bacon, " replied Ugmup confidently, "pigs not yet evolved."
"Me smell smoke, too," retorted P'Qeeb, irritated.

The two hulking paranthropine Hominids and their slow-witted cousin Sthlmch crept carefully through the trees towards the enticing aroma.  Then they were stopped in their tracks by a strange, loud buzzing sound.

"Sound like a bee--a very big bee," said Ugmup.
"Youp!  Must have very big stinger!!  Me go home," said Sthlmch, horrified at the images it conjured in his limited mind. 

But the buzzing stopped, and all was quiet again.  Even Sthlmch felt safer, and they moved closer to the puffs of smoke they could see drifting through breaks in the trees.  With each step the savory aroma also grew stronger, drawing them forward.  Finally, keeping well hidden, they were able to see the Australopithecines' camp.  The males were circled around a fire, the females close behind them.

"That the smallest forest fire me ever see," said Sthlmch, "a forest fire on the ground!"  And they realized to their surprise that this forest fire was burning not whole standing trees but only smaller branches.  Who had ever heard of such a thing?

Soon Ugmup found the explanation: lying on the ground near the fire he spotted a ring of sharp small objects wrapped along an oblong piece of shiny material, all connected to a lumpy end that seemed to have a handle on it.  Ugmup nodded knowingly, and pointed to this object, but could do no more, because their primitive language did not yet have a word for chain-saw.  (But that at least explained the buzzing they'd heard.)

The males squatting closely round this small forest-fire-on-the-ground each held a stick on which were impaled a few greenish globular objects sizzling just above the flames.

"What they doing?" asked P'Qeeb.

As they gazed in curious amazement at the tiny forest-fire, Ugmup's attention was naturally drawn to the females--because of their big, beautiful pair of .... hands.  He saw that they were busily placing green leaves on something flat and whitish, and adding bright red juicy objects atop the leaves.

This previously unknown sight naturally confused Ugmup until he was struck by a flash of comprehension.  He looked at the sun blazing high in the African sky.  "It lunch time," he exclaimed, "and they making BLTs--beetle, lettuce, and tomato sandwiches!"*

"Right!" Sthlmch and P'Qeeb in whispered chorus. But although the sight and smell made them salivate desperately,  they knew they were not powerful enough to attack the more advanced Australopithecines and seize their savory meal.  So, rather than remain frustrated in hopeless temptation, they turned to wend their way back through the woodland, all as quiet, and as contemplatively as was possible with their rudimentary brains.

"Me still hungry....but very sad," remarked P'Qeeb after a while, as they walked along.  His hairy stomach gurgled loudly.
"Why you sad?" asked Ugmup.
"Why P'Qeeba no make me forest-fire-on-ground?  Why she no make me BLT?" asked P'Qeeb, a tear running down his  cheek.
"Because....because she....she no have...." started Ugmup sympathetically.  But then he stopped.  He understood the reason, but could explain it no further because, as we said, they had no word for 'chain-saw'.

So they trod somberly through the trees and out into the savannah.

*NOTE:  This is obviously only an allegory, not a true story.  Tomatoes didn't get to Africa for another 2 million years.

Monday, November 7, 2011

An only partly deserved bad name?

There's been another blockbuster event for Big Pharma.  From the story in the New York Times:
The British drug company GlaxoSmithKline said Thursday that it has agreed to pay $3 billion to settle United States government civil and criminal investigations into its sales practices. 
The cases against GlaxoSmithKline include illegal marketing of Avandia, a diabetes drug that was severely restricted last year after it was linked to heart risks. Company whistleblowers and federal prosecutors said the company had paid doctors and manipulated medical research to promote the drug.
Now, Big Pharma is avidly seeking blockbusters.  As patents end and competition grows for previous bonanzas like statins and Viagra, they want their next ticket to Easy Street.  One sublime object of their desires is male baldness, where a pill-based cure, if found, would put a top rather than a toupee on a huge market, so to speak.  Unfortunately, they seem to want to make it to Easy Street quite a bit too much: this story is of a $3 billion penalty for misconduct in marketing a particular drug, Avandia, that was aimed at diabetes but also seemed to take aim at the heart, in ways the company purportedly did not come clean about.

This blockbuster fine, rather than find, is not what GlaxoSmithKline wanted.  But it's not the first of its kind, and one can only hope that their avarice and profit margins are not such that they think this is a net-profitable trade-off.  The stories are becoming more numerous of conflicts of interest, incomplete revealing of data, and so on, associated with drugs, reports of drug trials, and other lucrative aspects of medical practice.  And then there is the publicity hype machine in which funders, researchers, journals, and public media are co-conspirators.

But there is a more serious side to this than the loss of Big Bucks by a few companies.  Big Fraud can give a Big Bad Name to all of science.  All of science is vulnerable to vanity, and should be monitored and held to account much more than has been the case.  But no matter how venal science can be, or how implicitly tied to constraining beliefs, and no matter the temptation to dissemble or massage results to favor the flow of fame and finances, science is on the whole a very clean game.  Outright cheating is very rare, especially regarding Big Findings: they may more often be wrong or over-stated than right, but very strong claims in major research quickly get the scrutiny they deserve.  That makes real fakery hard to get away with (though there was a story just last week about a prominent Dutch social psychologist who managed to get away with it for some time, until his grad students blew the whistle and he was fired).  And scientists can't withstand the fines, much less the censure, that would follow.

So no matter whether or not there is the kind of restraint and oversight that could liberate science from some of its problems, of which we often take note, the overall honorable nature of this profession should not be overshadowed by the genuine Big Crooks who do occasionally pop up, especially in the corporate sector (professor crooks are usually very small time).

Friday, November 4, 2011

GWASpirin! How GWASpirating!

There have been several studies of the beneficial effects of a regular aspirin dose on diseases nobody thought aspirin had any relationship to.  This is not about pain, but about heart disease and, in recent years, even cancer.  A new study furthers this, finding a preventive benefit of aspirin in people at risk of developing a collection of cancers, a trait called Lynch Syndrome after Henry Lynch, a very prominent expert in cancer at Creighton University in Omaha, Nebraska.

Sufferers from this syndrome are prone to forming cancers in various tissues, including the large bowel.  It affects various tissues because, at least in the major form of the syndrome, there is a mutation in a gene coding for a protein that protects DNA from being miscopied when it is replicated before cell division.  When the gene mis-fires, the DNA can accumulate mistakes.  Among the billions of cells, some of them rapidly dividing in organs like the intestine, all it takes is one cell lineage accumulating an unlucky set of mutations of this kind (perhaps on top of inherited risk variants) to turn that lineage into a tumor that can grow to become life-threatening.

Now, with all the GWAS in the world, it is ironic that the new findings--which replicated some earlier ones--show that it is good old fever-fighting, ache-reducing aspirin that provides the protection.  It is not clear why aspirin should have this effect, since aspirin in principle doesn't know which cell has got too many mutations in it.  The aspirin effect must be generic in relation to dividing cells, or to (say) inflammation in such tissues, slowing their growth or targeting them for destruction by the immune system.  Someone may know why this effect works, but we don't.

However, aspirin does seem to recognize cell components called platelets, which accumulate around damaged cells.  When there are too many of these, aspirin helps clear things.  Of course, when clotting might be important, if you've been taking aspirin you may not form clots when you want to.  Perhaps cancers--which are masses that grow somewhat chaotically, but need blood supply and send out signals that stimulate blood vessel growth to 'feed' the tumor cells--may do this irregularly enough that aspirin somehow causes local hemorrhage, or something of that sort, that starves or slows the tumor cells or tumor growth.  We're guessing, and doubtless some sources have better explanations.  But one thing is clear: the effect was unexpected until a few years ago.

So, if whole genome studies are too often GWASpirating, here's a case in point:  A little GWAspirin, rather than a multimillion-dollar study, probably does much more preventive good than the amount of risk associated with by far most of the genes identified by GWAS to date.

Now, interestingly, genomewide association studies are done because one has insufficient knowledge of the cause of some disease or normal trait, so we simply search the whole genome to find those genes that at least contribute.  Large studies have typically found few statistically significant effects, and those that are found usually have made only very small individual contributions to risk.  That's why many feel that by now we know the story and GWAS have little future potential to generate GWhiz! findings, and the funds should be spent in more genetically focused ways.

In this case, however, hypothesis-based or focused genetic science probably would not have found the aspirin effect.  That is the serendipity side of science, and we're all fortunate that it works from time to time.

Whether the aspirin effect turns out to be as it seems, or fades for some reason upon further knowledge, only time will tell.  There are cons as well as pros: the effect is small, if any, in those not already at high risk of these cancers, and aspirin is a risk factor for ulcer and stroke, and perhaps even for excessive bleeding in accidents and injuries.  So the balance needs to be evaluated, and it will be difficult to find a neutral party to do the evaluating.  But at least here, rather than a zillion-dollar study, the cheapest of medicines goes far further.  What a headache for those dedicated to GWAS!