Monday, October 8, 2018

Evolution, to Engels--and a kind of lesson for us all?

We tend to think of Friedrich Engels as Karl Marx's pal, co-author, supporter--and financial benefactor.  That's all true.  But he was also perhaps a better synthesizer of ideas, and certainly a more approachable author.  A core aspect of their economic idea was that, through historical processes, the nature of societies evolves, from the simple states of our early human forebears ultimately to come to rest as communism.  I am no expert, but I think that since there would then (they thought) no longer be opposition or competition, history would, so to speak, come to an end.  At least the uninterrupted history of unfairness, conflict, strife, and governments that oppressed their people.  A good discussion of Engels' life can be heard in a recent broadcast/podcast on BBC Radio.

That Beeb program went over many things about Engels that are familiar to anthropologists, among others.  But it ended by referring to a work I'd not known of, an unfinished book on science called The Dialectics of Nature, which is available online or as a PDF.  The latter has an Introduction by JBS Haldane, one of the early 20th century's founding evolutionary geneticists, and a political leftist.

Engels. (One of many versions on the web)

Engels discusses the various sciences as they existed at the time (1883).  Haldane points out some errors that had been recognized by his (Haldane's) time, but Engels' book is a surprisingly deep, broad review of the science of his day.  I do not know how Engels learned so much science, but apparently he did.

Although Engels never completed it, the book was written only about 25 years after Darwin's Origin of Species, which to Engels was highly relevant to his views on society.  But he went much further!  He viewed essentially everything, not just human society, as an evolving phenomenon.  Despite various errors based on what was known at the time, he recognized astronomical change, geological evolution, and biological evolution as manifestations of the fundamental idea that things cosmic were not Created and thereafter static, as prevailing biblically derived views generally held, but had beginnings, and then changed.  Engels applied his ideas to inanimate physical phenomena as they were then understood, as well as to life itself.  In essence, his view was that everything is about change, with human society as just another instance.

Engels was looking for what we might call universal 'laws', in this case concerning how systems change.  This would be a major challenge, by science, to the theologically based idea that once Created, worldly things were mainly constant.  Engels noted that the classical Greeks had had a more 'modern' and correct view of the dynamics of existence than western Europe had developed under the reign of the Church.

Engels' book shows how this grand thinking had led to, or could be made consistent with, the social thinking by which Marx and Engels could believe that sociocultural evolution was similarly non-static.  If so, they claimed to see how societal dynamics would lead to future states in which the rather cruel, relatively primitive nature of nation states in their time would evolve into a fairer, more egalitarian kind of society.  But Dialectics of Nature shows that Engels was thinking very broadly and 'scientifically', in the sense of trying to account for things not just in terms of opinions or wishes, but of natural forces and the resulting dynamics of change.  He wasn't the only one in his time who thought that the idea of an evolutionary process enabled one to predict its outcome--as seemed to be possible in physics and chemistry.

I am no Engels scholar, and I had no idea he was so knowledgeable about science as it stood in his time, nor that the idea of evolutionary change that he and Marx applied to society was, in a sense, based on the finding, in their view, of similar kinds of change in the physical cosmos.  This, in a sense, conveniently made the extension of the theory to society seem quite logical, or perhaps even obvious, and as noted above, many were speculating in similar ways.  Marx and Engels scholars must be aware of this, but when I was exposed to these theories as an anthropology graduate student decades ago, I did not know of this connection between social and physical dynamics and evolution.

These alleged connections or similarities do not make the Marxist conclusions 'true', in the sense of scientific truth.  The idea that geology and species evolve may seem similar to the idea that societal structures evolve.  But just because two areas have some sort of similarity, or change over time and space, does not mean they have the same causes.  Human culture involves the physical aspects of a society's environment, but culture is largely or mainly about human interactions, beliefs, kinship, and so on.  There is no necessary physically causal or deep connection between that and species evolution or the growth and erosion of mountain ranges.  A planetary orbit, a hula hoop, and an orange are all more or less 'round', but that does not establish connections between them.

At the same time, Engels worked at the height, one might say, of the idea that there were universal 'laws of Nature'.  Darwin informally likened evolution to planetary motion, with law-like properties, and in some of his writing (e.g., about barnacles) he seems to have believed in a kind of selective inevitability--some species being, essentially, on the way to a terminal end found in related species (terminal, at least, as Darwin saw them in his time).  This may not be as benighted as it may seem.  Biologists still debate the question of what would happen if you could 'rewind the tape' of evolution, and start over.  Some have argued that you'd get the same result.  Others vigorously oppose this sort of belief in predictable destiny.

Given the ambience of science in the 19th century, and the legacy of the 'Enlightenment' period in Europe only a century or two before, it is not surprising that Engels, wanting society also to be constrained by 'laws' or forces, and hence to be predictable, if not driven toward inevitable outcomes, would see parallels in the physical world.  Many others in that general time period in Europe had similar law-like ideas about societies.  It is, at the very least, interesting that Engels tried to make his social ideas as reflective of natural laws as are the orbits of planets.

What about us, today?
It is easy to look back and see what was 'in the air' in some past time, and how it influenced people, even across a spectrum of interest areas.  In this case, evolutionary concepts spanned the physical, biological, and social sciences.  We can see how very clever, insightful people were influenced by the ambient ideas.

So it's easy to look back and discern common themes, about which each person invoking them thought he was having specific, original insights.  But that's us looking back at them.  What about us in our own time?  How much might we, today, be affected by prevailing views--in scientific or societal affairs--that are 'in the air' but may not be as widely applicable as some argue that they are?  How many of our prevailing views, that we of course think of as modern and better than the more primitive ones of the past, are similarly just part of the ambience of our times, that will be viewed with patronizing smiles at our naiveté?  Does going with the flow, so to speak, of current tides make us see more deeply than our forebears--and how much is it just that we see things differently?

How can we know?

Saturday, October 6, 2018

And yet it moves....our GWAScopes and Galileo's lesson on reality

In 1633, Galileo Galilei was forced by the Inquisition to recant his ideas about the movement of the Earth, or else face the most awful penalty.  As I understand the story, he did recant....but after leaving the proceedings, he stomped his foot on the ground and declared "And yet it moves!"  For various reasons, usually reflecting their own selfish vested interests, the powers that be in human society frequently stifle unwelcome truths, truths that would threaten their privileged well-being.  It was nothing new in Galileo's time--and it's still prevalent today.


Galileo: see Wikipedia "And yet it moves"
All human endeavors are in some ways captives of current modes of thinking--world-views, beliefs, power and economic structures, levels of knowledge, and explanatory frameworks.  Religions and social systems often, or perhaps typically, constrain thinking.  They provide comforting answers and explanations, and people feel threatened by those who do not adhere, who are not like us in their views.  The rejection of heresy applies far beyond formal religion.  Dissenters or non-believers are part of 'them' rather than 'us', a potential threat, and it is thus common if not natural to distrust, exclude, or even persecute them.

At the same time, the world is as the world really is, especially when it comes to physical Nature.  And that is the subject of science and scientific knowledge.  We are always limited by current knowledge, of course, and history has shown how deeply that can depend on technology, as Galileo's experience with the telescope exemplifies.

When you look through a telescope . . . . 
In Galileo's time, it was generally thought--or perhaps 'believed' is a better word--that the cosmos was God's creation as known by biblical authority.  It was created in the proverbial Genesis way, and the earth--with us humans on it--was the special center of that creation.  The crystal spheres bearing the stars and planets circled around and ennobled us with their divine light.  In the west, at least, this was not just the view, it was what had (with few exceptions) seemed right since the ancients.

But knowledge is often, if not always, limited by our senses, and they in turn are limited by our sensory technology.  Here, the classic example is the invention of the telescope, and eventually, what that cranky thinker Galileo saw through it.  Before his time, we had only our naked eyes to see the sun move, and the stars seemed quite plausibly to be crystal spheres bearing twinkles of light, rotating around us.

If you don't know the story, Wikipedia or many other sources can be consulted. But it was dramatic!  Galileo's experience taught science a revolutionary lesson about reality vs myth and, very directly, about the importance of technology in our understanding of the world we live in.

The lesson from Galileo was that when you look through a telescope you are supposed to change your mind about what is out there in Nature.  The telescope lets you see what's really there--even if it's not what you wanted to see, or thought you'd see, or would be most convenient for you to see.


Galileo's telescope (imagined).  source: news.nationalgeographic.com
From Mendel's eyes to ours
Since antiquity, plant and animal breeders have known empirically about inheritance, that is, about the physical similarities between parents and offspring.  Choose parents with the most desirable traits, and their offspring will have those traits, at least, so to speak, on average.  But how does that work?

Mendel heard lectures in Vienna that gave him some notion of the particulate nature of matter.  When, in trying to improve agricultural yields, he noticed discrete differences, he decided to test their nature in pea plants, which he knew well and which were manageable subjects for experiments to understand the Molecular Laws of Life (my phrase, not his).

Analogies are never perfect, but we might say that Mendel's picking discrete, manageable traits was like pre-Newtonians looking at stars but not at what controlled their motion.  Mendel got an idea of how parents and offspring could resemble each other in distinct traits.  In the same way that the telescope was the instrument that allowed Galileo to see the cosmos better, and to do more observing than guessing, geneticists got their Galilean equivalent in genomewide association mapping (GWAS), which allowed us to do less guessing about inheritance and to see it better.  We got our GWAScope!

But what have we done with our new toy?   We have been mesmerized by gene-gazing.  Like Galileo's contemporaries who, finally accepting that what he saw really was there and not just an artifact of the new instrument, gazed through their telescopes and listed off this and that finding, we are on a grand scale just enumerating, enumerating, and enumerating.  We even boast about it.  We build our careers on it.

That me-too effort is neither surprising nor unprecedented.  But it has also become what Kuhn called 'normal science'.  It is butting our heads against a wall.  It is doing more and more of the same, without realizing that what we see is what's there, but we're not explaining it.  From early in the 20th century we had quantitative genetics theory--the theory that agricultural breeders have used in formal ways for that century, making the traditional breeding that had been around since the discovery of agriculture more formalized and empirically rigorous.  But we didn't have the direct genetic 'proof' that the theory was correct.  Now we do, and we have it in spades.
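
To make that point concrete, here is a minimal simulation sketch (my own illustration, not anything from the GWAS literature itself; every number in it is an arbitrary assumption) of the quantitative-genetics picture that GWAS keeps re-confirming: many loci of tiny additive effect, plus environment, produce a smoothly varying trait in which no single gene explains much.

```python
# A toy polygenic model: many loci, each of tiny additive effect, plus environmental
# noise.  Everything here (locus count, effect sizes, heritability) is an arbitrary
# assumption for illustration only.
import numpy as np

rng = np.random.default_rng(0)

n_individuals = 10_000
n_loci = 500
allele_freq = rng.uniform(0.05, 0.95, n_loci)
effect_size = rng.normal(0.0, 0.05, n_loci)      # each locus contributes very little

# Genotypes coded as 0/1/2 copies of the 'effect' allele at each locus
genotypes = rng.binomial(2, allele_freq, size=(n_individuals, n_loci))

genetic_value = genotypes @ effect_size
environment = rng.normal(0.0, genetic_value.std(), n_individuals)   # roughly 50% 'heritability'
phenotype = genetic_value + environment

# The trait comes out roughly normally distributed, and no single locus explains much
per_locus_r2 = [np.corrcoef(genotypes[:, j], phenotype)[0, 1] ** 2 for j in range(n_loci)]
print(f"trait mean {phenotype.mean():.2f}, sd {phenotype.std():.2f}")
print(f"largest single-locus r^2: {max(per_locus_r2):.4f}")
```

Run under these assumptions, no individual locus accounts for more than a sliver of the variation--which is essentially what the mountain of GWAS hits has been telling us.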

We are spinning wheels and spending wealth on simple gene-gazing.  It's time, it's high time, for some new insight to take us beyond what our GWAScopes can see, digesting and understanding what our gene-gazing has clearly shown.

Unfortunately, at present we have an 'omics Establishment that is as entrenched, for reasons we've often discussed here on MT, as the Church was for explanations of Truth in Galileo's time.  It is now time for us to go beyond gene-gazing.  GWAScopes have given us the view--but who will have the insight to lead the way?

Thursday, October 4, 2018

Processed meat? Really? How to process epidemiological news

So this week's Big Story in health is that processed meat is a risk factor for breast cancer.  A study has been published that finds it so.....so it must be true, right?  After all, it's on CNN and in some research report.  Well, read even CNN's headline story and you'll see the caveats, the admissions, softened of course: the excess risk isn't that great, and, at the least, past studies have been 'inconsistent'.
Yummy poison!!  source: from the web, at Static.zoonar.com
Of course, with this sort of 'research' the weak associations with some named risk factors can easily be correlated with who knows how many other behavioral or other factors, and even if researchers tried to winnow them out, it is obvious that it's a guessing game.  Too many aspects of our lives are unreported, unknown, or correlated.  This is why week after week, it seems, do-this or don't-do-that stories hit the headlines.  If you believe them, well, I guess you should stop eating bacon.....until next week when some story will say that bacon prevents some disease or other.

Why breast cancer, by the way?  Why not intestinal or many other cancers?  Why, if even the current story refers to past results as being 'inconsistent', do we assume this one's right and they, or some of them, were wrong?  Could it be that this is because investigators want attention, journalists need news stories, and so on?

Why, by the way, is it always things that are actually pleasurable to eat that end up in these stories?  Why is it never cauliflower, or rhubarb, or squash?  Why coffee and not hibiscus tea?  Could western notions of sin have anything to do with the design of the studies themselves?

But what about, say, protective effects?
Of course, the headlines are always about the nasty diseases to which anything fun--like a juicy bacon sandwich, not to mention alcohol, coffee, cookies, and so on--seems to condemn us.  This makes for 'news', even if the past studies have been 'inconsistent' and therefore (it seems) we can believe this new one.

However, maybe eating bacon sandwiches has beneficial effects that don't make the headlines.  Maybe they protect us from hives or from antisocial or even criminal behavior, raise our IQ, or give us fewer toothaches.  Who could look for all those things, when they're busy trying to find bad things that bacon sandwiches cause?  Have investigators of this sort of behavioral exposure asked whether bacon and, say, beer raise job performance, add to longevity, or (heavens!) improve one's sex life?  Are these studies, essentially, about bad outcomes from things we enjoy?  Is that, in fact, a subtle, indirect effect of the Protestant ethic or something like that?  Of the urge to find bad things in these studies because they're paid for by NIH and done by people in medical schools?

The serious question
There are the pragmatic, self-interested aspects to these stories, and indeed even to the publication of the papers in proper journals.  If they disagree with previous work on purportedly the same subject, they get new headlines, when they should perhaps not be published without explicitly addressing the disagreement in real detail, as the main point of the work--rather than with the subtle implication that now, finally, these new authors have got it right.  Or at least, they should not headline their findings.  Or something!

Instead, news sells, and thus we build a legacy of yes/yes/no/maybe/no/yes! studies.  These may generally be ignored by our baconophilic society, or they could make lots of people switch to spinach sandwiches, or have many other kinds of effects.  The latter is somewhat akin to the quantum mechanical notion that measurement gives only incomplete information but affects what's being measured.

Epidemiological studies of this sort have been funded, at large expense, for decades now, and if there is anything consistent about them, it's that they are not consistent.  There must be a reason!  Is it really that the previous studies weren't well done?  Is it that if you fish for enough items, you'll catch something--big questionnaire studies looking at too many things?  Is it changing behaviors in ways not being identified by the studies?
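
To illustrate the 'fishing' point, here is a small simulation sketch (entirely my own, with made-up numbers; it is not a reanalysis of any real study): when many unrelated dietary exposures are each tested against a single outcome, a handful will look 'significant' at the usual p < 0.05 threshold purely by chance.

```python
# Simulate a questionnaire study with no true effects at all, then count how many
# exposures nonetheless cross p < 0.05.  All parameters are arbitrary assumptions.
import numpy as np
from scipy import stats

rng = np.random.default_rng(42)

n_subjects = 2_000
n_exposures = 100   # hypothetical questionnaire items (bacon, coffee, rhubarb, ...)

outcome = rng.binomial(1, 0.1, n_subjects)              # disease yes/no, 10% prevalence
exposures = rng.normal(size=(n_subjects, n_exposures))  # reported intakes, independent of outcome

p_values = [
    stats.ttest_ind(exposures[outcome == 1, j], exposures[outcome == 0, j]).pvalue
    for j in range(n_exposures)
]

false_hits = sum(p < 0.05 for p in p_values)
print(f"'significant' exposures found by chance alone: {false_hits} of {n_exposures}")
```

On average about five of the hundred items come up 'significant' even though nothing is going on, and a different five in the next study--one mundane route to yes/no/maybe headlines.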

Or, perchance, is it that these investigators need projects to get funded?  This sort of yo-yo result is very, very common.  There must be some explanation, and that inconsistency itself is likely as fundamental and important as any given study's findings.  Maybe bacon-burgers are only bad for you in some cultural environments, and these change in unmeasured ways, so that the varying results are not 'inconsistent' at all--maybe the problem is the expectation that there's one relevant truth, such that inconsistency suggests flaws in study design.  Maybe the problem is simplistic thinking about risks.

Where do cynical possibilities meet serious epistemological ones, and how do we tell?

Wednesday, October 3, 2018

In order to be recognized, you have to be read: an impish truth?

Edgar Allan Poe was an American short story writer, a master of macabre horror--the 3 G's, one might say: Grim, Gruesome, and Ghastly.  Eeeeek!! If you don't know Poe, a BBC World Service podcast in the series The Forum (Sept 15, 2018) discusses his life and work.  If you haven't yet, you should read him (but not too late at night or in too dark a room!).  The Tell-tale Heart, The Murders in the Rue Morgue, The Pit and the Pendulum, and The Cask of Amontillado should be enough to scare the wits out of you! Eeeeek!!

Edgar Allan Poe (1809-49)
Ah, scare tactics--what a ploy for attention!  At a time when not many people were supporting themselves by writing alone, Poe apparently wrote that this going over the top was justified, or even necessary, if you wanted to make a living as a writer.  If you have to sell stories, somebody has to know about them, be intrigued by what they promise, and go out and buy them.

Is science also a fantasy horror?
Poe was referring to his use of extreme shock value in literature, stories of the unreal.  But a colleague in genetics once boasted that "anything worth saying is worth exaggerating, and worth repeating", and drum-beating essentially the same idea over and over is a common science publishing policy.   This attitude seems schemingly antithetical to the ideals of science which should, at least, be incompatible with showmanship for many reasons.

Explaining science and advocating one's view in responsible ways is part of education, and of course the public whose taxes support science has a right to know what scientists do with the money.  New ideas may need to be stressed against a recalcitrant public, or even scientific, community.  Nonetheless, pandering science to the public as a ploy to get attention or money from them is unworthy.  At the very least, it tempts exaggeration or other misrepresentations of what's actually known.  We regularly see the evidence of this in the outright fraud that is discovered, and also in yes-no-yes-again results (does coffee help or hurt you?).

This, I think, reflects a gradual, subtle, but--for someone paying attention--substantial dumbing-down of science reporting, even by the mainstream news media; even the covers and news 'n views headlines of the major science journals approach checkout-counter magazines in this, in my view.  Is this only crass but superficial pandering for readership and viewership--for subscription sales--or could it reflect a serious degeneration in the quality of education itself, on which our society so heavily relies?   Eeeeek!!

In fact, showman scientists aren't new.  In a way, Hippocrates (whoever he was, if any single individual) once wrote a defensive article (On the Sacred Disease) in explicit competition for 'control' of the business of treating epilepsy, an effort to maintain that territory for medicine against competition from religion.  Centuries later, Galen was apparently well-known for public demonstrations of vivisection and so on, to garner attention and presumably wealth.

Robert Boyle gave traveling demonstrations of his famous air-pump, doing cruel things to animals to show that he created a vacuum.  Gall hustled his phrenology theory about skull shape and mental traits.  In the age of sail, people returning from expeditions to the far unknown gave lurid reports (thrills for paying audiences) and brought back exotica (dead and stuffed).  The captain of the Beagle, the ship on which Darwin sailed, brought live, unstuffed Fuegians back to England for display, among other such examples.

Yes, showman science isn't new.  And perhaps because of the various facets of the profit motive (now perhaps especially attending biomedical research), we see what seem to be increasingly common reports of corruption even among prominent senior (not just desperate junior) academic scientists.  This presumably results from the irresistible lure of lucre or the pressure for attention and prominence.  Getting funding and attention means having a career, when promotion, salaries, tenure, and prestige depend on how much rather than on what.  Ah, well, human fallibility!

The daily press feeds on, perpetuates (and profits from) simplistic claims of discovery along with breathless announcements that are often basically and culpably exaggerated promises.  Universities, hungry for grants, overhead, and attention, are fully in the game.  Showboat science isn't new, but I think has palpably ballooned in recent decades.  Among other things, scientists intentionally, with self-interest, routinely sow a sense of urgency.  Eeeeek!!

So should there be pressure on scientists to quiet down and stop relentless lobbying in every conceivable way?  My personal (probably reactionary!) view is a definite 'yes!':  we should discourage, or even somehow penalize showmanship of this sort.  The public has a right to know what they're paying for, but we should fund science without forcing it to be such a competitive and entrepreneurial system that must be manipulated by 'going public', by advertising.  If we want science to be done--and we should--then we should support it properly.

In a more balanced world, if you're hired as a science professor, the university owes you a salary, a lab, and resources to do what they hired you to do.  A professor's job should not depend on being a sales agent for oneself and the university, as it very often is, sometimes rather explicitly today.  Eeeeek!!

The imp of the perverse--in science today
One of Poe's stories was The Imp of the Perverse.  The narrator remarks upon our apparent perverse drive to do just the opposite of what we think--or know--that we should do.

The Imp of the Perverse.  Drawing by Arthur Rackham (source: Wiki entry on the story)
I won't give any spoilers, since you can enjoy it for yourself.  (Eeeeek!!)  But I think it has relevance to today's attitudes in science.  Science should be--our self-mythology is that it is--a dispassionate search for the truth about Nature.  Self-interest, biased perspectives, and other subjective aspects of our nature are to be avoided as much as possible.  But the imp of our perverse is that it has become (quoth the raven) ever-more important that science be personally self-serving.  It is hard to prevent ourselves, our imp, from blurting out that truth (though it is often acknowledged quietly, in private).

On the good side, careers in science have become more accessible to those not from the societal elite.  The down side is that therefore we have to sing for our supper.  Darwin and most others of science lore were basically of independent means.  They didn't do science as a career, but as a calling.

Of course, as science has become more bureaucratic, bourgeois, and routine, Nature yields where mythology--lore, dogma, and religion--had held forth in the past.  So, it is not clear whose interest that imp is serving.  That's more than a bit unnerving!  Eeeeek!!

Science 'ethics': can they be mainly fictional, too?
Each human society does things in some way, and things do get done.  Indeed, having been trained as an anthropologist, perhaps I shouldn't be disturbed or surprised by the crass aspects of science--nor that this predictably includes increasingly frequent actual fraud egged on by the imp of the pressure of self-interest.  Eeeeek!!

Our mythology of 'science' is the dispassionate attempt to understand Nature.  But maybe that's really what it is: a myth.  It is our way of pursuing knowledge, which science, of course, does.  And in the process, as predecessors such as those I named above show, gaming science is not new.  So isn't this just how human societies are, imperfect because we're imperfect beings?  Is there reason to try, at least, to resist the accelerating self-promotion, and to put more resources not just to careers but to the substance of real problems that we ought to try to solve?

Or should we just admire how our scientists have learned to work the system--that we let costly projects become entrenched, train excess research personnel, scare the public about disease, or make glowing false promises to get them to put money in the plate every tax year?  In the process, perhaps real solutions to problems are delayed, and we produce many more scientists than there are jobs, because one criterion for a successful lab is its size.

Were he alive and a witness to this situation, Poe might have fun dramatizing how science has become, though wonderful for some, a horrible nightmare for many.  Eeeeek!!

Tuesday, October 2, 2018

Flirting with old-tyme racism. Is anyone paying attention?

The ability to extract DNA from archeological bone specimens has opened a new area for research to reconstruct the past, but in some senses, this is allowing the field of anthropology to recapitulate its sometimes questionable history.  Anthropology has always been the study of groups of people, often characterized categorically, that is, as if their members were all alike, and were quite different from other groups.

There's a fine line between this kind of typological thinking and the hierarchical ranking of groups, which has often been aided and abetted by the technologies of the day, from phrenology in the 19th century, which could be used to show, for example, that Africans were born to be slaves, and in need of masters, to the use of DNA markers today, which have been interpreted by some to confirm the existence of biological races, and the primacy of genes over environment in the determination of who we are.  In a time when social policy is too often based on this kind of categorical thinking, with, for example, spreading belief in the evils of immigration, or in the inherent right of some to more of society's goods, from education to health care to tax relief, etc., our generation's version of "scientific racism" can land on receptive ears.  We cannot assume that the gross evils of the past are gone, and the lessons learned.

There is a long line of examples of dangerously over-simplified but cute, dumbed-down categorical assertions about groups, often coming in the genetic era from prominent geneticists who are not anthropologically sophisticated.  One from years ago was the sequence of 'mitochondrial Eve', in which a set of mtDNA sequences was used to infer a common ancestral sequence, and that was then attributed to our founding first woman.  There was, of course, one woman in whom the imputed mtDNA sequence (or some sequence like it) occurred.  But the rest of that woman's genome, her dual sets of 23 chromosomes, had genetic variation that was also found in countless contemporary women (and men); each variant in each gene is found in a different set of those contemporaries, and each 'coalesces', as the term is, in some single ancestral individual at some time and in some place.  This 'Eve' was our common ancestor only in her mtDNA, not her other genes, and so she, as a person, was not our 'Eve'--our shared female progenitor a la Genesis.  Indeed, among all of our genes there was no single common ancestral time or place--probably not within hundreds of miles, or thousands of years.  Each DNA segment found today has its own ancestry.
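
For readers who like to see the logic, here is a bare-bones coalescent sketch (my own illustration; the simple haploid Wright-Fisher model and all the numbers are arbitrary assumptions) of the point that each independently inherited DNA segment traces back to its own most recent common ancestor, at its own time--so there is no single genomic 'Eve'.

```python
# For each of several independent, non-recombining segments, follow a sample of
# lineages backward in time, one generation per step: each lineage picks a random
# parent from a population of fixed size, and lineages that pick the same parent
# merge.  Count the generations until only one ancestral lineage remains.
import random

def generations_to_common_ancestor(sample_size, pop_size, rng):
    lineages = sample_size
    generations = 0
    while lineages > 1:
        parents = {rng.randrange(pop_size) for _ in range(lineages)}
        lineages = len(parents)
        generations += 1
    return generations

rng = random.Random(1)
for segment in range(5):   # e.g., mtDNA plus four other independent segments
    t = generations_to_common_ancestor(sample_size=50, pop_size=1_000, rng=rng)
    print(f"segment {segment}: most recent common ancestor ~{t} generations ago")
```

Each segment comes out with a different ancestral individual at a different time, which is the sense in which 'mitochondrial Eve' was an ancestor for one segment only, not a founding first woman.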

Using the 'Eve' phrase was a cute liberty that got the story widely circulated, and as a Barnum & Bailey tactic, it worked very well.  But its reference to the Biblical Eve, a woman from whom all of us are purportedly descended, was culpably misleading even if catchily attention-seeking.  And, of course, the purported common ancestral mtDNA sequence is only inferred in a statistical sense, from today's mtDNA variation.  This Eve-imagery came out of the Allan Wilson lab at UC Berkeley, a source of free-wheeling, glibly cute public claims.  That sort of thing gets picked up by a media hungry for cute stories, which gives it legs.  So the behavior is rewarded.

More serious abuses of stereotypes
The 'mitochondrial Eve' characterization was cute and catchy, but perhaps harmless.  But categorical oversimplifying by scientists isn't always just cute and harmless.  In my day as a graduate student, a prominent physical anthropologist, at Penn no less, Carleton Coon, said in one of his widely read books on racial variation, that  "No one can express anguish more convincingly by his facial expression than an Italian.  A Negro's facial expression, on the other hand, consists largely of exposing his eyeballs and his teeth.  There is good reason for this difference: the Italian's mobile and moving communication would be lost, under most lighting conditions, on a black face."  

Yet when I was in graduate school, at about the same time as this was published, I took human anatomy at the University of Michigan medical school.  When we got to the superficial facial muscles, here is the illustration of those muscles from my professor's own, prominent, anatomy text:
  


From Woodburne: Essentials of Human Anatomy (4th ed.),  1969

This drawing uses a black person as an exemplar of human facial muscles.  They are clear and clearly identified as functional; they are not degenerate or minimized, incapable of full expression.  They are not the muscles of but one category of people: they are the human muscles.

Rumors, at least, were that the eminent Professor Coon had argued, behind the scenes, against integrating schools in the US, on the grounds that 'Negroes' were of intellectually inferior ability.  Categorical thinking, with its typically concomitant value judgments, is nothing new, and it's never over, but sloppy scientific thinking shouldn't contribute to the problem.

Even without making qualitative value judgments, categorical thinking about humans, a form of racism, is historically dangerous, and everyone in science should know that.  Yet, recently, there has been a simple, dramatic story of past human 'breeding' habits that indicates that categorical scientific racism still has legs in our society and, indeed, our most prominent journals. If not intentionally, it's by a kind of convenient default.




Here are the cover, and one of the figures, from a recent issue of Nature.  The embracing hands of people of different 'colors' are shown as types who mated, thus producing a 'hybrid' with a Neanderthal and a Denisovan parent.  This is a splashy story because these are considered to be different species.  And the journal, naturally, used this as its lurid cover.  The cover figure refers to the 6 September story in that issue, from which we reproduce one figure that shows groups represented as regionally distributed people of different color.  Is it unfair to call this stereotyping, of the old-fashioned type, even if only subliminally?  Whatever the intent, the typological thinking is not subtle.

Thinking of this sort should have been long gone from Anthropology because DNA sequencing has clearly shown the internal variation and inter-group (or, better put, inter-geographic) overlap in variation.  But when the publicity engines and the sensationalistic adrenalin are at work in science, whatever sells seems OK.

Even with a very long history of racism, including of course intentional slavery and genocide, we cannot seem to give up on types and categories, even inadvertent habits with no value judgment intended.  But whether intentional and vicious, or merely inadvertent and de facto, this is essentially racism, and should be called out as such.  And racism is dangerous, especially when voiced by scientists who should know better, or even, as I presume in this case, who are not racists in the usual discriminatory sense (that may not apply to their readers!).  As a prominent colleague once said privately to me, he was not a 'personal racist' (he had African friends, after all)--he was just a typologist, a genetic determinist; i.e., a scientific racist.

Even if the authors of the human hybrid piece, happy enough for a cover story in a major journal, are not themselves "personal racists", they perpetuate classificatory thinking.  Countless people have lost their lives because of careless sloganeering.  No matter its more polite guise, and the carefully nonbiological group coloring in the figures, is this any different?

Is science heading back to those good ol' days again?

Monday, October 1, 2018

My sexed-up Jordan Peterson fantasy

Picture me dressed in "a new three-piece suit, shiny and brown with wide lapels with a decorative silver flourish" and my cranium and jaws draped luxuriously, top to bottom, in Jordan Peterson hair.

I even pronounce "so-ree" like he does because I got my first period while living in Ontario, so I earned the right. Also, I don't care if it offends him or you.

I am pulling this off beautifully. Trust me:

This is me being Axl Rose.

Jordan Peterson and I, his chaotic mirror, are sitting across from one another in comfy leather armchairs, with nothing in between to break the gaze of my crotchal region pointed at his crotchal region. I'm as cool as he is. I don't have to act like I've been here before because I have.

I'm leaning in, in the most specific way, towards Jordan Peterson, a model of human success.

I ask my first question.

"Could you please lay out the scientific logic linking lobsters to the patriarchy?"

He says many things, including that people obviously aren't lobsters and how we aren't even chimpanzees, which is reasonable because he is skilled at reason.

Then I say, "You're a man of science. So, what do you have to say about any evidence that contradicts your ideas about the natural ways of hierarchies and how they're particularly relevant to human society? Also, have you thought of any ways your ideas could be tested?"

He says many things, but they have little to do with taking contradicting evidence seriously or having thought through the difficulty of testing much of what's fundamental here. And this is largely because, despite the veneer of science, these ideas have breached the bounds of reasonable, feasible testing.

"What about this, eh?" I offer, "We take the top lobster from the west side of Prince Edward Island and move him to the east side of the island. Is he going to be the top lobster there too? And would this same experiment work for chimpanzees? Or for humans?"

He says many things but will make it seem like there is no point to what I asked. The idea that humans are not lobsters or chimpanzees will resurface--tethering him, once again, to reason and nuance yet not actually producing anything of the sort for us.

So I continue, "Because if this link to lobster hierarchies is supposed to go not just from lobsters to humans, but to individual humans and their natural strengths and weaknesses compared to other individuals, then context shouldn't matter and if we transplant a lobster or a human then they should each assume their natural position in the local social hierarchy, wherever we plop them."

He says many things that sound reasonable.

Then I add "It's not just the top lobster's lived experience that contributes to his place atop the hierarchy, it's everyone else's below him in that hierarchy too--right?"

He must agree with this and does. He says many things that sound reasonable.

"And so where do you change from lobster to human and acknowledge that bad luck and circumstances of birthplace and family and everything else stick people lower in the hierarchy than they could be in different circumstances?"

He says many things about how this is absolutely true for so many young white men in North America all of whom can improve their station if they just read 12 Rules For Life and fill out the worksheets.

"Could a lobster improve its station in any way comparable to what you suggest for your readers?"

He isn't having my silliness.

"Okay, let's back up. What is evolution?"

He's stunned but plays it off perfectly.

"How would you define evolution to your readers/viewers?"

He says many things straight out of Descent of Man and nods to some of the least huggable atheist superstars.

"It's not a single reader or viewer's fault that they don't know any better and just believe your evolutionary insinuations and assumptions, but don't you think you should know better? ... If I were to become globally influential and I wanted to share ideas about clinical psychology that would influence masses of people, being the Ph.D. and professor that I am, I would go to the cutting edge of the field and learn there first before going public. I would try to understand what is known and what is unknown in that field, your field, and appreciate how those things are known and why there still are unknowns. And, if my masses of followers were misunderstanding my take on clinical psychology or my takes on my own areas of expertise, being the Ph.D. and professor I am, I would go out of my way to clarify my ideas in hopes they'd understand me better. Do you do that with the ways that folks interpret your views, like how some take you literally about sex distribution and enforced monogamy?"

He says he thinks his readers/viewers understand him quite well on those topics. This is vague and elusive and I move on because I feel an odd mixture of disappointment, pity, and disgust and I'd like to leave it behind as soon as possible.

"I believe with confidence that there are fundamental cognitive differences between humans and other animals. Do you agree?"

He says he does.

"Do you agree that these cognitive differences have contributed significantly to our domination of the planet?"

He says sure, of course.

"So why be so limited in your vision for achieving equal freedom, equal opportunity? All I've heard from you is that men should act more masculine (and less feminine) and that so should women if men and women want to succeed. Do you have any other ideas?  Or is that the extent of your imagination? Because it feels like quite an underappreciation of humankind to me. Like, the opposite of a moonshot, eh?"

He says many things that sound epic, I guess, if you are already a fan of him.

"Lobsters don't go to the moon. I think humankind can do better than just man-up."

He just plays it cool in his chair, there. And we can hear his fans all over the world laughing at me. No amount of masculine dress, hair, or swagger can disguise my big powerful lady PhD. And that's not just hilarious, but it also proves that men hold disproportionate power because of simple lobster logic. I've been dominated, which makes Jordan Peterson even more right about everything.


Thursday, September 13, 2018

From Darwin's own thoughts. Part IV.

Here is the fourth and final installment of annotated quotes from Charles Darwin's autobiography (my comments in blue):

"As soon as I had become, in the year 1837 or 1838, convinced that species were mutable productions, I could not avoid the belief that man must come under the same law. Accordingly I collected notes on the subject for my own satisfaction, and not for a long time with any intention of publishing."

Darwin realizes that humans must have evolved, too.  There was no reason, from what he could observe, to except us.  But he knew the dangers of saying so!  We are still under some societal pressure to disavow evolutionary theory, not unlike in his day.  Denial is all around us....

"My strength will then probably be exhausted, and I shall be ready to exclaim "Nunc dimittis.""

That means something like 'Enough--time to go!'

"My mind seems to have become a kind of machine for grinding general laws out of large collections of facts, but why this should have caused the atrophy of that part of the brain alone, on which the higher tastes depend, I cannot conceive. A man with a mind more highly organised or better constituted than mine, would not, I suppose, have thus suffered; and if I had to live my life again, I would have made a rule to read some poetry and listen to some music at least once every week; for perhaps the parts of my brain now atrophied would thus have been kept active through use.  The loss of these tastes is a loss of happiness, and may possibly be injurious to the intellect, and more probably to the moral character, by enfeebling the emotional part of our nature."

Darwin bemoans his intellectual narrowness.  He elsewhere (as we've seen) refers to books or poetry he no longer reads or once liked.  How many of us are pressured to be technophilic workaholics, letting some of the deepest pleasures of life pass us by?  Take heed!

"....the 'Origin of Species  is one long argument from the beginning to the end, and it has convinced not a few able men."

"On the favourable side of the balance, I think that I am superior to the common run of men in noticing things which easily escape attention, and in observing them carefully."

A lesson might be not to fall into just accepting fads or what is selling these days, but to pay attention, for yourself, to the nature of what you are studying.

"This has naturally led me to distrust greatly deductive reasoning in the mixed sciences. On the other hand, I am not very sceptical,— a frame of mind which I believe to be injurious to the progress of science."

He means observe the world as it is, don't just accept theory or speculative ideas uncritically.  In not being skeptical he may mean that he believes there are truths out there, generalities or theories, that can be accepted if there is supporting evidence.

"So that here a belief— if indeed a statement with no definite idea attached to it can be called a belief— had spread over almost the whole of England without any vestige of evidence."

One can identify fads (like 'Big Data' or 'precision genomic medicine' or other 'omics) that feed the System.  But maybe closer, clearer, more patient thinking and observing could serve our impatient results-counting generation well instead.

"Lastly, I have had ample leisure from not having to earn my own bread."

Ah, to be wealthy!

.  .  .  .  .  .  .  .  .  .

My afterthoughts:
So, why did I extract and post these bits by Darwin?  I think he gives us a lot to think about.  This is not just about particular facts, or even the idea of evolution which was, in fact, 'in the air' despite Darwin's denial.

We might not have had the same exact theory today as Darwin proposed it.  We have vastly more data, an understanding of inheritance, and even much better ideas about species distributions (for example, we know about continental drift).  We have a more subtle understanding of the roles of natural selection and chance (genetic drift).  We have much more information on the complex nature of genetic control of our traits.  A century and a half of research framed by the notion of evolution has shaped our investigations themselves.  But the central idea of evolution stands without doubt among scientists.  Nothing diminishes Darwin's patience, observations, experiments, persistence, dedication to using detail to build a general picture, and, by no means least of all, his honor.

Careerist pressures are today often, if not largely, antithetical to the personal traits that worked so well for Darwin.  So, aside from seeing the thoughts of one of the most insightful of all scientists, there are lessons for our own time in Darwin's reprise of his life.

Wednesday, September 12, 2018

From Darwin's own thoughts. Part III.

This is the third installment of my annotated selection of pithy quotes from Darwin's autobiography (my comments in blue):

Darwin and Wallace both felt inspiration, or a vital explanatory link, in Thomas Malthus' book:
"In October 1838, that is, fifteen months after I had begun my systematic enquiry, I happened to read for amusement 'Malthus on Population,' and being well prepared to appreciate the struggle for existence which everywhere goes on from long-continued observation of the habits of animals and plants, it at once struck me that under these circumstances favourable variations would tend to be preserved, and unfavourable ones to be destroyed. The result of this would be the formation of new species."

Darwin's reflections on his theory of evolution and his most important work:
"The solution, as I believe, is that the modified offspring of all dominant and increasing forms tend to become adapted to many and highly diversified places in the economy of nature."

"This shows how necessary it is that any new view should be explained at considerable length in order to arouse public attention."

"Though considerably added to and corrected in the later editions, it has remained substantially the same book....It is no doubt the chief work of my life."

"It has sometimes been said that the success of the 'Origin' proved "that the subject was in the air," or "that men's minds were prepared for it." I do not think that this is strictly true, for I occasionally sounded not a few naturalists, and never happened to come across a single one who seemed to doubt about the permanence of species."

Darwin elaborated the idea most fully and saw most deeply, but there were in fact similar ideas floating around, as Wallace (and others, too) proves.  This undoubtedly affects ideas we think about today, too, but we can find our like-thinking contemporaries--or predecessors--by web-searching (if we want to).

"Another element in the success of the book was its moderate size; and this I owe to the appearance of Mr. Wallace's essay; had I published on the scale in which I began to write in 1856, the book would have been four or five times as large as the 'Origin,' and very few would have had the patience to read it.

I cared very little whether men attributed most originality to me or Wallace; and his essay no doubt aided in the reception of the theory."

Darwin, ever the generous man.  But his societal advantages guaranteed his preeminence, and he was more thorough.  Wallace's ideas were more group- than individual-oriented, and may have some merit although Darwinites take pleasure in scorning them.  And nobody can refer to most of Darwin's books as other than quite long!

"Hardly any point gave me so much satisfaction when I was at work on the 'Origin,' as the explanation of the wide difference in many classes between the embryo and the adult animal, and of the close resemblance of the embryos within the same class. No notice of this point was taken, as far as I remember, in the early reviews of the 'Origin,' and I recollect expressing my surprise on this head in a letter to Asa Gray.

Darwin relied on embryological data and ideas from his predecessors, particularly von Baer  in Germany.

"Towards the end of the work I give my well-abused hypothesis of Pangenesis. An unverified hypothesis is of little or no value; but if anyone should hereafter be led to make observations by which some such hypothesis could be established, I shall have done good service, as an astonishing number of isolated facts can be thus connected together and rendered intelligible."

Darwin is right about his idea of Pangenesis (the idea that each part of the body emits particles he called 'gemmules', which congregate in the gonads and contribute to the gametes) being of 'little or no value', and indeed it was very much like the Lamarckian view he sneered at.  But what is the role for free speculation in science?  Darwin's speculating was plainly labeled as such, since there were no data supporting 'gemmules' or 'pangenesis'.  Yet, as long as they are clearly marked as guesses, can intelligent guesses stimulate creative thought?  Is it better, or worse, than just saying "we have no idea how this works...."?

Tuesday, September 11, 2018

From Darwin's own thoughts. Part II.

This is part two, a continuation of my annotated selection of quotes from Charles Darwin's autobiography (my comments in blue):

"In July I opened my first note-book for facts in relation to the Origin of Species, about which I had long reflected, and never ceased working for the next twenty years."

Again, evidence of such patience!  And the result shows why one need not, perhaps should not, rush to publish.

(By Charles Lyell, the distinguished senior geologist and close Darwin friend): ""What a good thing it would be if every scientific man was to die when sixty years old, as afterwards he would be sure to oppose all new doctrines." But he hoped that now he might be allowed to live."

Keeping new and fresh while getting old?  Rarely!  Make your mark while you can....

"Our fixing ourselves here has answered admirably in one way, which we did not anticipate, namely, by being very convenient for frequent visits from our children."

Keeping priorities about what means most in life.

"I have therefore nothing to record during the rest of my life, except the publication of my several books."

On his very extensive many-years' work on barnacles:  
"Nevertheless,  I doubt whether the work was worth the consumption of so much time."

The question and how he mused about it:
"It was evident that such facts as these, as well as many others, could only be explained on the supposition that species gradually become modified; and the subject haunted me. But it was equally evident that neither the action of the surrounding conditions, nor the will of the organisms (especially in the case of plants) could account for the innumerable cases in which organisms of every kind are beautifully adapted to their habits of life— for instance, a woodpecker or a tree-frog to climb trees, or a seed for dispersal by hooks or plumes. I had always been much struck by such adaptations, and until these could be explained it seemed to me almost useless to endeavour to prove by indirect evidence that species have been modified.

I worked on true Baconian principles, and without any theory collected facts on a wholesale scale, more especially with respect to domesticated productions."

Darwin the observer, not the rusher to conclusions!

"I soon perceived that selection was the keystone of man's success in making useful races of animals and plants. But how selection could be applied to organisms living in a state of nature remained for some time a mystery to me."

Yes, agriculture has done it--changed species' traits--but how does it happen in Nature?

Monday, September 10, 2018

From Darwin's own thoughts. Part I.

I have just been re-reading Charles Darwin's autobiography.  He wrote it, with his son's encouragement, shortly before he died in 1882, and it was first published in 1887.  I think he wrote it to tell his children and their descendants about his famous life.  Yet as famous as he had become, he is as modest as one would expect from that exemplar of the best of humanity.

I encourage anyone in the life sciences, who doesn't presume to think s/he already knows everything, to read it, for reasons I'll suggest below.  There are various versions, as his son Francis edited it a bit, redacting some personal family-related comments (these were later restored, but are unimportant here).  You can find it here.

I thought that some of the things he said would be worth posting on a site like this.  Darwin was right about many things, and even he was wrong about others (as, indeed, he himself freely says).  But it is his thinking, his perspective, standards, reasons, and outlook that are important.  So what follows are some quotes that I chose (easy to find by searching the ebook), with my reflections separated in italics and in blue.  (Because there are many pithy quotes, I've split this into four successive posts):

"To my deep mortification my father once said to me, "You care for nothing but shooting, dogs, and rat-catching, and you will be a disgrace to yourself and all your family."

Darwin was an idler as a privileged young gentleman, but, to our great benefit, circumstances grabbed his attention and serious side. And he explains it thus:

"Looking back as well as I can at my character during my school life, the only qualities which at this period promised well for the future, were, that I had strong and diversified tastes, much zeal for whatever interested me, and a keen pleasure in understanding any complex subject or thing.


I mention this because later in life I wholly lost, to my great regret, all pleasure from poetry of any kind, including Shakespeare."

Darwin more than once admitted, or even bemoaned, his narrow focus and neglect of some of the finer things in life.  Yes, he was successful, but this could be a lesson for us all: keep a balance!

"I almost made up my mind to begin collecting all the insects which I could find dead, for on consulting my sister I concluded that it was not right to kill insects for the sake of making a collection."

Even Darwin saw the evil in killing other living things just to gawk at them.  We do it routinely, even including mammals (mice, etc.), but to salve our conscience (for those who have one) we get animal-use committee (IACUC) approval first, to keep their suffering under at least some constraint, and to prevent our own suffering from lack of a project to do.

"This was the best part of my education at school, for it showed me practically the meaning of experimental science."

He observed rather than simply conjectured, and his patience and eye for detail and for identifying the critical variables were at the root of his success.

"...but to my mind there are no advantages and many disadvantages in lectures compared with reading."

Ooops, professors!  Some of us do need to hear a message live and have it explained.  Darwin, though, had the drive, and patience, to study a subject in great detail.  How many of us have that?

"At this time I admired greatly the 'Zoonomia;' but on reading it a second time after an interval of ten or fifteen years, I was much disappointed; the proportion of speculation being so large to the facts given.

....in after years I have deeply regretted that I did not proceed far enough at least to understand something of the great leading principles of mathematics, for men thus endowed seem to have an extra sense."

He reasoned in his own way, and didn't really suffer for his lack of mathematical ability.  Maybe he was not misled by math's oversimplification and rigidity?  Maybe we rely far too much on math, as a safer and quicker route to 'results', than on patient, deeper thinking?

"During my last year at Cambridge, I read with care and profound interest Humboldt's 'Personal Narrative'."

"...science consists in grouping facts so that general laws or conclusions may be drawn from them."

If he was anything, it was a patient, careful sponge for detail.  And reading stimulated his original thinking.

"I heard that I had run a very narrow risk of being rejected, on account of the shape of my nose! He was an ardent disciple of Lavater, and was convinced that he could judge of a man's character by the outline of his features; and he doubted whether any one with my nose could possess sufficient energy and determination for the voyage."

Here he's talking about having almost been rejected by FitzRoy, the captain of the Beagle, for the voyage that became the basis of his life's work.  Beware of hoaxes even in science!  We see unwarranted speculative conclusions asserted almost every week in the news media, and even in journals (though there, couched in dense professorialized terms!).

"The investigation of the geology of all the places visited was far more important, as reasoning here comes into play."

"Everything about which I thought or read was made to bear directly on what I had seen or was likely to see; and this habit of mind was continued during the five years of the voyage."

Again, his integrative, detail-sponging patience and reasoning.  No rush to conclusions (or to print).  Indeed, he waited some two decades before publishing his ideas on evolution, and only did so then because he was prompted by Alfred Russel Wallace's independent discovery of the same ideas.

"The sight of a naked savage in his native land is an event which an never be forgotten."

(Here he's writing of being in Tierra del Fuego.  But he did not think of such people as inferior, as his later experience makes clear.)

"Nor must I pass over the discovery of the singular relations of the animals and plants inhabiting the several islands of the Galapagos archipelago, and of all of them to the inhabitants of South America."

We know how important that set of observations was!  The islands are still under close observation.

"But I was also ambitious to take a fair place among scientific men,— whether more ambitious or less so than most of my fellow-workers, I can form no opinion."

"...I am sure that I have never turned one inch out of my course to gain fame."

Ambition, yes--but egotism and show-boating, never: no rushing to the news media, no spin doctors!

Tuesday, September 4, 2018

What's causing the cataract epidemic?

Strangely, we seem to be in the midst of a cataract epidemic!  It seems that everyone I know is having to have their lenses replaced, and I have been experiencing glare and other issues that suggest the same may be looming for me.

It was personal experience that drew my attention to this epidemic.  That is, I have seen no stories about it in Science or the public news media.  That is very strange, since they seem to seize upon any even marginally interesting story.  Yet this epidemic has not hit the headlines, at least not yet.

So what could be causing it?  This is far, far from my area of expertise, so I can only speculate in very generic terms.  What has changed that might have epidemic consequences?  Here is a list of candidate factors that may have changed recently enough to be responsible (though I confess it is just a guess-list):

(1) Is it food?  With processed and GMO animal and plant foods becoming widely prevalent only recently, and with pollution of the seas from which our seafood comes, diet is an obvious suspect.  Junk food and other such habits could have effects that are subtle but cumulative, with delayed-onset consequences for vision.  Dietary factors are, after all, clearly responsible for many 'modern' diseases.
(2) Is it viral or infectious?  So many of us move around the country and, indeed, the world that any virus arising even in a remote part of the globe can rapidly spread.  Air transport from tropical to temperate zones would be a major suspect, as would international transport of goods and so on.  But what virus or infectious agent might be involved is unclear, nor do we know where a similar epidemic might be occurring, unreported. 
(3) Is it recent environmental contamination, in air or water?  It is impossible to ignore global air and water pollution as potential cataract-causing factors--even if the mechanism itself is not known.  After all, has it even been examined, given that the epidemic hasn't yet really been recognized?  Air and water currents circulate widely around the world, which could place the causal source of the epidemic very far from its consequences.  It might not even be suspected.  After all, the lens is a very special kind of tissue, and any connection may look strange relative to what the usual exposure-consequence studies, and their statistical methods, are designed to detect.
(4) Or, being very cynical, could it be marketing or profiteering by the companies that make the gear and supplies involved?  In that case, the idea that there is a real epidemic may be a mistake--we would just be being told that there was.  This subjective, cultural sort of factor would be very difficult to document.  After all, gear-makers do need to make sales of their gear.
Overall, I am stunned by what seems to me an obvious but almost wholly unreported major epidemic that seriously threatens quality of life.  As I must acknowledge in what is a speculative blog post, the whole story could be my own ignorance of the literature.  I am a geneticist, but I see no reason to think this could be genetic, since our population's genotypes have not changed in any substantial way in the past decade or so.  But if not genetics, then what?  One must at least ask the question!  Yet, strangely, the professional epidemiological literature seems to have ignored, or even been unaware of, what looks to me like a major epidemic; such neglect is very hard to understand, since (again, to be rather cynical) epidemiologists are naturally eager for any Big Problem they can find to justify Big Studies.

This blog site is usually about genetic causation and its associated scientific issues, but once the question of the seeming neglect of a cataract epidemic struck me, I decided I should at least use Mermaid's Tale to air it.  I would welcome any comments offering original or even partly plausible explanations.

However, and finally, I must end by acknowledging that I have done no rigorous study of the cataract problem.  Indeed, at my age, I should long ago have learned to avoid the temptation to talk to so many of my peers about it.  It could frighten them.  Before suggesting that it is a major epidemic, I should think about who I'm talking to.

Monday, September 3, 2018

Luigi Luca Cavalli-Sforza (1922-2018), worth remembering

Luigi Luca Cavalli-Sforza has died, at age 96.  Who?  I wonder how many readers of this blog--or, in general, how many anthropologists or human geneticists younger than middle age--know who he was.  We are not, these days, in the habit of crediting the past.  But Luca, at least, is worth remembering--for those who remember him.

Source: www2.unipr.it

Born in Italy, Luca received his MD there and then, when WWII ended, studied population genetics with RA Fisher in England.  He spent most of his career at Stanford, where he taught and did his creatively integrative and theoretical work on human variation and its evolution.  He collaborated with many colleagues from around the world.  He was, from my graduate-student years on, probably the leading human population geneticist in the world.  He developed numerical methods for analyzing human allele-frequency variation, relating its patterns to global population history and to other kinds of data.  In particular, he was interested in the relationship of language patterns to genetics, and in the causal relationship between cultural dynamics--such as the spread of agriculture--and genetic diversity, and he developed ways to analyze how the latter could be used to help reconstruct the former.
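
For readers who have never seen what 'numerical methods for allele-frequency variation' look like in practice, here is a tiny Python sketch of the chord distance commonly attributed to Cavalli-Sforza and Edwards (1967).  The populations and frequencies below are invented, and I am using one common textbook formulation of the measure purely as an illustration of the general idea, not as a claim about exactly how Luca's own analyses were coded.

# A hedged illustration of allele-frequency comparison between populations.
# Frequencies are invented; the constant and normalization follow one common
# formulation of the Cavalli-Sforza & Edwards (1967) chord distance.
from math import sqrt, pi

def chord_distance(pop1, pop2):
    """pop1, pop2: lists of loci; each locus is a list of allele frequencies."""
    total = 0.0
    for freqs1, freqs2 in zip(pop1, pop2):
        similarity = sum(sqrt(p * q) for p, q in zip(freqs1, freqs2))
        total += (2.0 / pi) * sqrt(2.0 * (1.0 - similarity))
    return total / len(pop1)   # average over loci

# Two hypothetical populations typed at two loci (frequencies sum to 1 per locus)
pop_a = [[0.70, 0.30], [0.10, 0.50, 0.40]]
pop_b = [[0.55, 0.45], [0.25, 0.45, 0.30]]
print(f"chord distance: {chord_distance(pop_a, pop_b):.4f}")
# Distances of this general kind, computed over many blood-group and protein loci
# and many populations, fed the tree-building and mapping of that era.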

I was incredibly fortunate to have been able to spend a sabbatical in Luca's lab at Stanford, in the late '70s.  I had no particular 'project' to work on but, typically for him, he hosted me anyway.  It was enough for me to know that he and his associates were leaders in human population genetics as a science per se, but also that they were so original and creative in relating genetic variation to cultural, language, and technological history.  His was a synthetic view.  His work was technically original and advanced, but grounded in innovative and integrative thinking.

Luca wrote much, but his two most memorable and durable books, which encapsulate much or most of his interests, are (1) The Genetics of Human Populations, with co-author Walter Bodmer (W.H. Freeman, 1970; a subsequent watered-down version appeared with the author order reversed), and (2) the massive History and Geography of Human Genes, with P. Menozzi and A. Piazza (Princeton University Press, 1994).  Both are still available, I think, the former in a Dover reprint.  The Genetics of Human Populations was a digestible but sophisticated presentation of population genetics theory and method, suitable for understanding human origins, allele-frequency variation, and evolution.  Many anthropologists and others learned their trade from this book.  The second book was the final word on traditional allele-frequency (rather than DNA sequence) based reconstructions of human global variation.

Luca worked on topics too numerous to go over here.  But they are very well described in John Hawks' fine summary of Luca's work, and Wikipedia: Luigi Luca Cavalli-Sforza provides references.  For anyone even remotely interested in the history of anthropological genetics and its contribution to human evolution and culture history, it is worth the effort to become familiar with these foundational contributions.

Luca was a central figure in the attempt to organize a worldwide, systematic sampling of human variation, called the Human Genome Diversity Project (HGDP).  That project never took place as such: economically, it came into funding conflict with the Human Genome Project, whose aim was to generate the sequence of a representative complete human genome; and politically, scurrilous accusations were leveled against the HGDP by those who saw it as a project to categorize people exploitatively, much as racism does--accusations that were grotesquely false and opportunistically culpable on the part of jealous or ignorant critics and scandal-thirsty journalists.  However, the stir provided NIH with a safe excuse not to fund the HGDP.  Instead of a formal, globally systematic project, Luca used the heterogeneous blood-group and protein variation data already collected over many years by various investigators around the world to show global patterns of human gene frequencies.  His tome (#2 above) presented these data, many of which are still available.  Luca and colleagues developed methods for analyzing the patterns in relation to historical or prehistorical (assumed) human demographic behavior.

As science history often goes, this approach was soon pre-empted by DNA sequencing technology, and individual genome sequences supplanted protein- and antibody-based allele frequencies as the primary data for studying human variation and evolution (Ken Kidd at Yale, a lifelong colleague of Luca's and an organizer of the HGDP effort, has maintained a very useful site for allele frequency and other data).  In this sense, historically, Luca's Big Volume was the last word on the earlier technology.  But of course similar attempts--to reconstruct not just history itself but to integrate it with other aspects of human existence--are actively being pursued by many people, and they can now be extended far back in time thanks to the ability to extract DNA from fossils.  Nonetheless, in terms of the history of anatomically modern humans, the basic outlines in Luca et al.'s book, based on sample allele frequencies, I think still generally hold.

Ephemerality's children
I'm not sure how long Luca will be remembered.  What I write here is a paean to a wonderful person and terrific scientist.  But there is no single 'discovery' or Cavalli-Sforza 'theorem' that, by bearing his name, will serve as his lasting legacy.  He was not a grandstander, didn't play to the media, and his students and colleagues are now very senior.  The present formula in science has little interest in crediting the past (it's not good for careerism), and that generally also means not reading its lore.  As happens in science, one technology supplants prior ones, and DNA sequence and other 'Big Data' and 'omics have clearly and rightly co-opted the less informative data types of Luca's era.  In some sense this does vitiate earlier methods as well as data.  The new data have also enabled publications that are highly technical but, in part because of the public glamor of technology, less closely tied to deeper, integrative thought.  Nor was Luca the first to use population genetic data to look at human local or global history--studies of blood group variation, for example, well antedate his work, even if generally cruder in method and detail.

In that sense, Luca helped set the stage, but his timing was all wrong.  Ah, well.  Had he been starting now, with the technologies and database resources currently available, and with the much more direct data of DNA sequence (and other 'omics) rather than allele frequencies from population samples, he would perhaps have made a more durable mark--and, I suggest, a deeper one than was possible from the earlier data to which history limited his attention, and without the hasty rush to print that is now so prevalent.

I fear these may be the hard realities of history.  But thoughtfulness, intelligence and, not least, personal grace only come along sporadically.

Cavalli was a gem of his time.

(This post has been edited to make minor typographical corrections)

Friday, August 17, 2018

I'm still mad about the Google Memo and David Brooks's column about it


In 2017 there was the Google Memo (When your memo's bad theories give girls heebie jeebies ... That's Damore) and then David Brooks supported Damore in his New York Times column. So I  pitched a reply to the New York Times (rejected by silence) but never posted it here because I was department Chair and [fill in the blank with your wildest dreams].

But it’s not too late to post my thoughts here, and they’re still fresh in my frontal because I’m in the midst of some writing projects where I’m happily channeling my rage against the misuse of my beloved evolutionary thinking. 

So, mermaids, here’s that response to Brooks.  P.S.  I’m on sabbatical, so pardon my fucking French …

***
Dear Editor,

I write to you regarding David Brooks’ column about the firing of ‘Google memo’ author Damore, titled “Sundar Pichai Should Resign as Google’s C.E.O.” I offer some corrections and context for Brooks’ innumerable readers.

There is no debate about human nature being either, on one side, a blank slate or, on the other, evolutionary psychology. The debate pitting nature against nurture is long over and I tell all my students that anyone who says it's still a thing is mistaken. Everyone by 2017 agrees that genes + environment  shape an individual human's behavior over their lifetime (if one must boil biological complexity down to two vague, enormously complicated variables and simple arithmetic).

What is more, the description of evolutionary psychology Brooks provides (genes + environment), while it may describe the perspective of many evolutionary psychologists, is not a description of the field as he implies. It describes what experts think across *many many* fields, including evolutionary biology, anthropology, and genetics, and even the humanities, where many researchers and scholars are not terribly fond of evolutionary psychology, at least not of the simple, deterministic, overly confident brand that folks like Damore and Brooks wave about.

By giving this particular brand of evolutionary psychology credit for what most experts in many fields already believe, Brooks has elevated it to the status that Damore did in the memo. Both Brooks and Damore are misleading their audiences about the state of the science itself, and it's ingenious because it helps them perpetuate the image they want to project: that science is on their side. It is not.

And Brooks does it again when he quotes evolutionary psychologist Geoffrey Miller as a sort of fact-check of Damore’s claims in the memo. Brooks' presentation of Miller's validation leads readers to believe that the empirical support for sex differences is the product of evolutionary psychology. But these data are the products of numerous fields, psychology being one of them, with evolutionary framing being the theoretical prerogative of only some. I’m sure that every one of the scientists and scholars who produced the empirical data establishing sex differences in behavior and personality accepts the reality of evolution, but evolutionary psychology, especially this particular brand, is something different.

Most people who have really grappled with how evolution works appreciate its complexity. Unfortunately these usually do not include people with tremendous influence, like Brooks. And Brooks is smitten with some problematic takes on the evolution of sex and gender differences in behavior.  

Could this ignorance, manipulation, or flat-out dishonesty--all with negative consequences for women and people of color--be what was so offensive about the ‘Google memo’ and Brooks’s column to the minds of many academics, instead of it being just some knee-jerk liberal reaction by leftist elites with weak, unscientific cognitive skills? Absolutely.

Evolution is true but it’s complicated, and sticking to overly simplistic and outdated thinking makes it easy to bend it to fit and justify one’s worldview. This is why racists think white people are the pinnacle of evolution. Darwin might have thought so in the nineteenth century, but evolutionary science in 2017 does not.

Lest readers assume that because I am a female anthropology professor I am diametrically opposed to the entire enterprise of evolutionary psychology: I am not. But I am critical of its over-zealous application to conceptions of 'human nature', and that's because (1) I regularly take scientific issue with the logic behind the claims, and (2) I understand the history of science and how many mistaken evolutionary claims have harmed human beings, and still do.

And it is really a shame that I have to add something like this but it's *because* intelligent people like Brooks and Damore don’t give enough fucks to think deeply about evolutionary biology, what it is and isn't, that they're able to empower their opinions with old, bad, weak, even untestable 'science.'

I wish I could say that in 2017 people, even the very learned ones, were cautious about what they can and cannot claim about complex phenomena. Here’s to a more humble, more fun future where we can actually figure cool shit out.

Evolution is everyone’s origin story. But takes like Brooks’ and Damore’s drive people away from the thing that gives me so much meaning and the thing I find so beautiful. So here I am.

Sincerely,
Holly Dunsworth

Thursday, August 16, 2018

The Litella Factor: Changing the claimspace of science

We may be starting to see rationalizations and wiggle-words as investigators gradually inch away from many of the genomics-based claims, such as last year's slogan du jour that we're going to deliver 'precision' genomic medicine, or this year's that we'll find genomic causes of disease for 'All of Us'.  Science, of all human endeavors, should be objective about the world, not engage in sloganeering, even to wangle ever more funding from the public.  Many are by now quietly realizing not only that environments are important--which is nothing new, though minimized by geneticists for a generation--but also that genomics itself is more complex, more variable, and less predictively powerful than has been so widely and often touted in recent years.

We've known the likely nature of complex genomic causation for literally a century (RA Fisher's 1918 paper is the landmark).  The idea was a reasoned way to resolve what appeared to be a fundamental difference between classically discrete Mendelian traits that took on only one or two states (yellow or green peas) and classically quantitative, 'heritability'-based traits that seemed to vary continuously (like height) and that, as a result, were presumed to be the main basis of Darwinian evolution.  The former seemed never to change, and hence never to evolve, while selection could move the average values of continuous traits.

The resolution of these two seemingly incompatible views--the idea that complex traits are produced by many individual 'Mendelian' genes, each with a very small effect--was a major advance in our understanding of both heritable causation and the evolution of life: agricultural and experimental breeding confirmed this 'modern synthesis' of evolutionary genetics, extensively and consistently if implicitly, for a century.
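
To make that reconciliation concrete, here is a minimal toy simulation in Python (my own sketch; the locus count, allele frequency, and effect size are arbitrary illustrative values, not anything taken from Fisher): many discrete Mendelian loci, each of tiny effect, sum to a smoothly continuous, roughly normal trait.

# Toy illustration of Fisher's 1918 insight: discrete loci, continuous trait.
import random

N_LOCI = 500        # number of contributing 'Mendelian' loci (illustrative)
FREQ = 0.5          # '+' allele frequency at each locus (illustrative)
EFFECT = 0.1        # tiny additive effect per '+' allele copy
N_PEOPLE = 5000

def trait_value():
    # each person carries two allele copies per locus; sum the '+' contributions
    return sum(EFFECT * (random.random() < FREQ) for _ in range(2 * N_LOCI))

traits = [trait_value() for _ in range(N_PEOPLE)]
mean = sum(traits) / N_PEOPLE
sd = (sum((t - mean) ** 2 for t in traits) / N_PEOPLE) ** 0.5
print(f"mean = {mean:.1f}, sd = {sd:.2f}")
# Every locus is discretely Mendelian, yet the summed trait varies continuously
# and is approximately normally distributed--the reconciliation the 'modern
# synthesis' was built on.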

However, the specific genes responsible were largely implicit, assumed, or unknown.  There was no way to identify them until large-scale DNA sequencing technology became available.  What genomewide mapping (GWAS and other statistical ways to identify associations between genetic variants and trait variation) has shown is (1) that the century-old model was basically right, and (2) that we can identify many of the myriad genome regions whose variation is responsible for trait variation.  This work was given a real boost in public support by the fact that many diseases are familial and, even more, by the promise that if our diseases and other traits are genetic, we can identify the responsible genes (and, hopefully, do something to correct harmful variants).
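
As a rough sketch of what 'identifying associations' means in practice, here is a toy per-site test in Python (my own illustration, with an invented allele frequency and effect size): regress the trait on the number of copies of an allele and see whether the estimated slope differs from zero.  A real GWAS does essentially this at millions of sites, with multiple-testing corrections and covariates.

# Toy single-site association test of the kind a GWAS runs genome-wide.
import random, statistics

random.seed(2)
N = 3000
FREQ = 0.3      # allele frequency at the tested site (illustrative)
BETA = 0.4      # simulated true per-allele effect on the trait (illustrative)

genotypes = [sum(random.random() < FREQ for _ in range(2)) for _ in range(N)]   # 0, 1, or 2 copies
traits = [BETA * g + random.gauss(0, 1) for g in genotypes]                     # genetic signal + noise

# Least-squares slope of trait on allele count: the per-allele effect a GWAS regression estimates
mg, mt = statistics.mean(genotypes), statistics.mean(traits)
slope = sum((g - mg) * (t - mt) for g, t in zip(genotypes, traits)) / sum((g - mg) ** 2 for g in genotypes)
print(f"estimated per-allele effect: {slope:.2f} (simulated true value {BETA})")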

Phenotypes and their evolution (effects on health and reproductive success) are in this context usually properties of the individual as a whole, not of individual genes--say, your blood pressure's effect on you as a whole person.  That is, the combinations of polygenic effects that GWAS has identified typically differ for each person, even among people with the same trait measure.  We have also found something entirely consistent with the nature of evolution as a population phenomenon: much of the contributing genomescape for a given trait (like blood pressure) involves genome sites whose relevant variants have very low frequency, or effects too small to measure with statistical 'significance', so that only a fraction of the estimated overall genetic contribution in the population (measured as the trait's 'heritability') is accounted for by mapping.  All of this has been a discovery success, consistent with the basic formal genetic theory of evolution developed over the 20th century.
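
A hedged back-of-the-envelope sketch of that last point, with numbers invented purely for illustration: under a simple additive model, a mapped locus with allele frequency p and per-allele effect b (in trait standard-deviation units) contributes roughly 2p(1-p)b^2 to the trait's genetic variance, and summing such contributions over the detected hits typically falls well short of family-based heritability estimates.

# All numbers here are hypothetical, chosen only to illustrate the arithmetic.
detected_loci = [
    (0.30, 0.03),   # (allele frequency, per-allele effect in trait SD units)
    (0.10, 0.05),
    (0.45, 0.02),
] * 100             # pretend a GWAS found 300 such 'significant' hits

var_explained = sum(2 * p * (1 - p) * b ** 2 for p, b in detected_loci)
heritability = 0.5  # e.g., an assumed family-based estimate for the trait

print(f"variance explained by mapped loci: {var_explained:.2f}")
print(f"fraction of heritability accounted for: {var_explained / heritability:.0%}")
# The rest hides in rare variants and effects too small to reach 'significance'--
# the population-level nature of genetic causation described above.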

Great success--but.....
The very same work, however, has led to a problem.  This is the convenient practice of equating induction with deduction.  That is, we estimate the risk effects of genomic sites from samples of individuals whose current trait state reflects their genotype and their past lifestyle exposures.  For example, we estimate the average blood pressure among sampled individuals with some particular genotype.  That is induction.  But then we promise that from a new person's genotype we can, with 'precision', predict his or her future state.  That is, we use the estimate deductively, assuming that the average from past samples is a future parameter--say, a probability p of getting some disease.  That is essentially what a genotype-specific risk is.

But this is based on the achieved effects of individuals' genotypes, at the test site and across the rest of the genome, together with their lifestyle exposures (mainly unmeasured and unknowable).  We assume that similar factors will apply in the future, so that we can predict traits from genome sequence.  That is what (by assumption) converts induction into deduction.  It rests on many untested or even untestable assumptions.  It is a dubious port of convenience, because future mutations and lifestyle exposures, which we know are crucial to trait causation, are unpredictable--even in principle.  We know this from clearly documented epidemiological history: disease prevalences change in unpredictable ways, so that the same genotype a century ago would not have the same phenotypic consequences today.
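
Here is a minimal simulation of that leap, again with invented numbers (the baseline risk and the environmental multiplier are assumptions for illustration only): a genotype-specific risk is estimated from a past cohort by induction, then quoted as a fixed future probability, even though a changed environment changes the realized risk of the very same genotype.

# Induction-to-deduction sketch: past averages quietly treated as future parameters.
import random

random.seed(1)

def observed_risk(n, genotype_risk, env_multiplier):
    # disease risk = genotype effect scaled by the environment of the era
    return sum(random.random() < genotype_risk * env_multiplier for _ in range(n)) / n

# Induction: estimate the risk for genotype G from a past sample, in its era's environment
past = observed_risk(n=50_000, genotype_risk=0.05, env_multiplier=1.0)
print(f"risk for genotype G estimated from past data: {past:.3f}")

# The dubious deductive step: quote that estimate as a new person's future probability,
# even though future exposures (here, a doubled one) are unknowable in advance
future = observed_risk(n=50_000, genotype_risk=0.05, env_multiplier=2.0)
print(f"realized risk for the same genotype in a changed environment: {future:.3f}")
# The genotype hasn't changed; its 'risk' has.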

So, while genetic variation is manifestly important, its results are complexly interactive--not nearly the simple, replicable, additive, specific causal phenomena that NIH has been promising so fervently to identify in order to produce wonders of improved health.  It's been a very good strategy for securing large budgets, and hence very good for lots of scientists, and perhaps as such--its real purpose?--it is a booming success.  It did, one must acknowledge, document the largely theoretical ideas about complex genotypic causation from the early 20th century.  But the casual equating of induction with deduction has also fed a convenient ideology, and that has not been good for science, because science should shun ideology: in this case, the idea of enumerable, essentially parametric causation, which is wrong, or at least far too narrowly focused.

Perhaps some realization is afoot
But now we're seeing, here and there, various qualifiers and caveats and soft, not fully acknowledged retreats from genomics promises.  Some light is being shone on the problems and the practices that are common today.  Few if any are admitting they've been too strident, or wrong, or whatever; instead they assert their view as either what we all already knew, or as a kind of new insight of their own.  That is, claiming that things aren't so genomically caused becomes a claim of original insight, and hence grounds for new or continued funding.  No apologies, and no acknowledgments of the critics of the current NIH-promoted Belief System, who have been pointing these things out for many years--no offer of Emily Litella's quiet and humble recognition of a mistake:  "Oh.....Never mind!"


"Oh.....Never mind!"  YouTube from NBC's SaturdayNightLive
How seriously should this quiet backtracking be challenged?  Is it even fair to call the revisionists 'hypocrites'?  We live and learn via science, so perhaps the change in the claimscape, though quiet and implicit, is a reflection of good science, not just expediency.  Perhaps that is how science should be: reacting, even if slowly, to new knowledge and giving up on cherished paradigms.

One underlying problem of modern science is not that we sometimes accept wrong notions, but the hasty, excessive claims rushed to the public, the journals, and the funders.  In a sense, this isn't entirely a fault of vanity but of the system we've built for supporting science.  A toning down of claims, a shunning of those who claim too much too quickly, and a much higher threshold for 'going public' would improve science and indeed be more honest to the public.  A stone-age suggestion I've made (almost seriously) is that journals should stop publishing any figures or graphs (in the pages or on the cover) in color--that is, make science papers really, really boring!  Then only serious and knowledgeable scientists would read, much less shout about, research reports (maybe some black-and-white TV science reporting could be allowed, too).  At least, we are due some serious reforms in science funding itself, so that scientists are not pressured, for their very career survival, into the excessive claimscape of recent years.

In specific terms, I personally think that by far the most important reforms would be to limit the funding available to any single laboratory or project, to stop paying faculty salaries from grants, to provide base funding for faculty hired with research as part of their responsibilities, and to decouple the relentless hustling for money from the research itself, so that the science rather than the money would be in the driver's seat.  Universities, lusting after countable credit and grant overheads, would have to quiet down and reform as well.

The infrastructure is broad, and altering it would not be easy.  But things were once more sane and responsible (even if always with some venal or show-boating exceptions, humans being humans).  If such reforms were implemented, young investigators could apply their fresh minds to science rather than to science hustling.  And that would be good for science.