We see study after study of genes 'for' behavioral traits considered to be driven by selection: intelligence, athletic ability, criminality, recklessness, drug abuse, aggression, even being a caring grandmother. The list goes on and on and on. Simplistically stated, the idea is that behavioral traits have a genetic basis, usually a simple 'genetic' one, and that during human evolution, those genetically bestowed with the 'best' version of the trait outcompeted those unlucky enough to be less intelligent, less of a risk taker, a more fearful warrior, and so on. That is pure Darwinian determinism: the bearers of the 'optimal' version of a trait systematically had more offspring, and thus the gene(s) for that version were selected for, and therefore increased in frequency.
This is why, for example, the basis of homosexuality is so curious to evolutionary biologists. How could a behavioral trait that means its bearer does not have offspring ever have evolved? How could a gene persist if it codes for something that interferes with reproduction, so that the gene isn't passed on? The most common explanation is that during the long millennia of human evolution, homosexuals mated and reproduced anyway, because homosexuality was culturally proscribed in the small groups in which humans lived. Maybe that's so, but it's certainly no longer true in many cultures, where being gay doesn't have to be hidden anymore -- so should we now expect the frequency of homosexuality to fall? Another post hoc account is that homosexuals helped care for their relatives' children, enhancing the reproductive success of kin who carry many of the same genes, and hence consistent with kin selection -- a technically plausible but basically forced, speculative explanation by those who want Darwinian determinism to be as universal as gravity.
In any case, the "cause" of homosexuality is certainly an interesting evolutionary puzzle, if it's assumed to be genetic. It may well not be, of course -- perhaps sexual orientation is influenced by environmental exposures in utero or in infancy. But let's go with the genetic assumption. Let's even assume that looking for genes for IQ, aggression, and so many other behaviors is reasonable, because all these traits, like all traits, must be here because of natural selection.
In that case, it's very curious that so many traits defy Darwinian explanation yet their genetic basis isn't being explored. Where are the searches for genes for, say, voluntary celibacy, for using birth control, for choosing not to have children, for suicide, child-beating, infanticide, or abortion, or for young men volunteering to be soldiers? These are all traits that make no evolutionary sense and shouldn't have evolved, if such traits have a genetic basis. We should be just as perplexed by the evolutionary history of these behaviors as we are by homosexuality. Why aren't we looking for genetic explanations?
I think it's a reflection of cultural values. It's rather akin to environmental epidemiologists never looking for the harmful effects of cauliflower, broccoli or Brussels sprouts -- instead it's the things we like, our indulgences: alcohol, fatty foods, sugar -- a focus that reflects our Puritan scorn for pleasure. I think we notice and think about what seem to us to be unacceptable aberrations, and give much less thought to what seems normal. It's ordinary to us that nuns and priests choose not to reproduce, even though that is completely non-Darwinian, or that suicide bombers are generally of reproductive age and are foregoing having children. Abortion may not be personally acceptable to you, but it's a societal norm. Indeed, artificial birth control itself is highly problematic in a Darwinian world -- and, even worse for Darwinian theory, it sends women into the work force, away from their children.
Apparently we don't generally notice that these 'normal' behaviors are non-Darwinian -- yet our primary drive, conscious or unconscious, is supposed to be inherent: to perpetuate our genes. If behaviors are genetically driven and selected for, then it's not just homosexuality -- which, until recently, was not socially acceptable -- that doesn't make evolutionary sense; it's any behavior whose primary ramification is not to send our genes into the next generation.
So, don't we have the same issue explaining the evolutionary origin of all these behaviors as we do explaining homosexuality? Perhaps. But let's consider an explanation that's not generally proffered: perhaps this is all just statistical 'noise' around a weak, rather than precise and strongly deterministic, natural selection -- Nature is just sloppier than the strictly Darwinian view would expect. No species' success requires that every individual reproduce, so long as enough do. Culture is a powerful force -- once we respond to cultural dictates and norms, the simple evolutionary explanation of selection for optimal (in fitness terms) traits is much less convincing. And perhaps we didn't evolve to reproduce, just to have orgasms.
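The 'noise around weak selection' point can be made concrete with a toy Wright-Fisher simulation. All parameters here are hypothetical, chosen purely for illustration, and no claim is made about any real trait: the point is simply that when the selection coefficient s against a variant is smaller than about 1/N, the random sampling between generations (drift) dominates, and a fitness-reducing variant can linger for many generations.

```python
import random

def wright_fisher(pop_size, s, p0, generations, seed=0):
    """Track the frequency of a variant with relative fitness 1 - s
    in a constant-size Wright-Fisher population; return the trajectory."""
    rng = random.Random(seed)
    p = p0
    traj = [p]
    for _ in range(generations):
        # Deterministic selection step: reweight the frequency by fitness.
        p_sel = p * (1 - s) / (p * (1 - s) + (1 - p))
        # Stochastic drift step: binomial sampling of the next generation.
        count = sum(rng.random() < p_sel for _ in range(pop_size))
        p = count / pop_size
        traj.append(p)
        if p == 0.0:  # variant lost from the population
            break
    return traj

# A weakly selected-against variant (s = 0.005) in a small group (N = 200):
# here s < 1/N, so drift swamps selection and the variant can persist.
traj = wright_fisher(pop_size=200, s=0.005, p0=0.1, generations=500, seed=42)
print(f"generations simulated: {len(traj) - 1}, final frequency: {traj[-1]:.3f}")
```

Run with different seeds, the same mildly deleterious variant is sometimes lost quickly, sometimes drifts around for the full 500 generations, and occasionally even rises in frequency -- exactly the sloppiness the strict determinist view leaves out.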
And, is there a gene for being dogmatic?
Friday, January 11, 2019
Thursday, January 10, 2019
Too many post-docs! (I wonder why.....)
A recent report in Nature discusses a glut of post-docs, that is, PhDs who went on to post-doctoral research positions with the idea that they were preparation for an actual permanent faculty job, but who then can't find that pot of gold at the end of the promised rainbow.
This has been a growing problem, and it has been neither a surprise nor a closely guarded secret. Nor is the reason any sort of secret. It is due to the academic system that rewards more: more grants, more publications, more citations, more graduate students, more lab activity..... It is the score-countable itemization of faculty worth that has taken over our universities, gradually, almost without our being aware of it, during the past few decades. Universities and their faculty gain their career rewards by satisfying the More-manic criterion that we have allowed to crowd out actual substance from our university culture.
Score-counting has perhaps always been with us to some extent, but nowhere near what it has become. In part, this was the result of convenient computer-countability, as well as the push to oust the Old Boy system and to open up university careers, promotions, tenure, and so on, to make them more fair. This was done, and it was good. However, it was also obvious to administrators (including promotion committees, chairs, deans, and so on) that their careers could be advanced if their bullying of those beneath them on the status totem pole could be seen as 'objective'. This, of course, opened up careers for the for-profit publishing of annual citation-count books, etc., turning academic life into a score-based kind of game (this started 30 or more years ago, if slowly....).
Now we're in Objectivity's safe, politically correct full swing, and it's everywhere, polluting the properly more contemplative and abstract nature of serious, high-quality academic careers. It is, of course, very, very, very good for administrators (as can be seen from their proliferation over recent decades), suppliers of gear, and so on.
Colleges and universities are now deeply into a Malthusian pattern of growth which is reaching, or has reached, the inevitable saturation point. It was foreseeable. Paired with the inability to enforce a mandatory retirement age (also nicely serving the alpha baboons in the system), and universities' need for Teaching Assistants (i.e., a bevy of minimally paid graduate students) so the Professors don't have to sully themselves in classrooms, we seem to be exceeding our academic ecology's capacity, at least if we think in terms of what is fair. We have seen it coming, of course, because we, the faculty, have made it so. Maybe, perhaps hopefully, inevitable retirements and attrition will help, but by how much?
Yes, it is we, the academic System, who have only ourselves to blame.....but why do that!? Reform could harm our cushy careers, after all. Let graduate students beware.....
Tuesday, January 8, 2019
Susumu Ohno: Accounting for Why Gene Counting Doesn't Account for Things
The promise that for nearly two decades has been the main course on the 'omicists' menu is that by counting -- adding up the contributions of a list of enumerated genome locations -- all our woes will be gone! The idea is simple: genes are fundamental to life because they code for proteins and stuff like that, which are the basis of life. This, in a nutshell, is the justification for much of the Big Data endeavor being sponsored by the NIH these days, long driven for historical reasons by an obsession with genes.
But, at least in part, this obsession has revealed what we should -- and could -- already have known. Genes are clearly fundamental to life, coding for proteins and other functions. But the reason we're seeing increasing weariness with GWAS and other fiscally high- but scientifically low-yield approaches is not new. It's no secret. And it is not a surprise. All we needed to do was ask: where do genes come from? It is not a new question, the genome has been intensively studied, and indeed the answer has been known for nearly 50 (that is, fifty) years.
Susumu Ohno (1928-2000); image from Google Images
So, what did Ohno say?
Where do 'genes' come from?
In his time, we didn't have much in the way of DNA sequencing. We knew that genes coded for proteins and were located on chromosomes. We had learned a lot about how the code works, much of it from experiments, for example with bacteria. We knew proteins were fundamental building blocks of life, and were strings of amino acids. Watson and Crick and others had shown how DNA carries the relevant code, and so on.
But that did not answer the question: where do all these genes come from? I'm not a historian, and cannot claim to know the many threads leading to the answer. But in essence, the point Ohno is credited with noting, and whose importance he stressed, is that new genes largely arise from duplication events affecting existing genes. He had noticed amino acid similarities among some known proteins (hemoglobins); this and other evidence suggested that chromosomal or individual gene duplication was a mechanism, if not the mechanism, for the origin of new genes. Expecting random mutations in stretches of DNA not already coding for RNA or protein to generate, from scratch, a working code for a new and useful protein was too far-fetched. Indeed, nowadays one can be skeptical when an 'orphan' gene is claimed -- that is, a gene belonging to no family of related genes elsewhere in the genome.
Instead, if occasionally a stretch of DNA or even a whole chromosome duplicates, the individual inheriting that expanded genome gains two potentially important attributes. First, s/he has a redundant code: mutational errors in one gene that lead to a non-functional protein can be compensated for by the fact that a separate, duplicate copy of the gene exists and codes for the same protein.
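That buffering effect is easy to quantify: if each copy of an essential gene is independently knocked out with probability mu, a single-copy individual loses function with probability mu, but a two-copy individual only with probability mu squared. A short Monte Carlo sketch makes the point (the mutation rate here is hypothetical, chosen only so the effect is visible):

```python
import random

def survives(copies, mu, rng):
    """True if at least one copy of an essential gene escapes a
    function-destroying mutation (each copy hit independently with prob. mu)."""
    return any(rng.random() >= mu for _ in range(copies))

rng = random.Random(1)
mu = 0.05           # per-copy chance of a knockout mutation (illustrative)
trials = 100_000
survival = {}
for copies in (1, 2):
    alive = sum(survives(copies, mu, rng) for _ in range(trials))
    survival[copies] = alive / trials
    print(f"{copies} copy genome: survival fraction ~ {survival[copies]:.4f}")

# Analytically: 1 copy -> 1 - mu = 0.95; 2 copies -> 1 - mu**2 = 0.9975.
```

The simulated fractions converge on the analytic values, showing how even a single duplicate copy sharply reduces the chance that a mutation is fatal.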
Second, duplication is the basis of a much deeper, indeed fundamental aspect of life, going even farther than genes: redundancy.
Evolution depends on redundancy: genomes are family affairs
By having redundant genes, the initial result of duplication, an individual is more likely to survive mutations. And over the long haul, with lots of duplication, the additional copies of a needed gene can mutate and over time take on new function, without threat to the individual, who will still have one or more healthy versions of the gene.
Indeed, perhaps one of the most under-appreciated yet fundamental axioms of life is that it is built on redundancy: not only do genomes consist almost exclusively of members of gene families whose individual genes arose by duplication events, but our tissues themselves are constructed from repeating fundamental units: multicellular organization generally; bilateral or radial symmetry; blood cells, intestinal villi, lobes and alveoli in lungs, nephrons in kidneys, and so on.
I think it is not easy to imagine a different evolutionary way for our very simple biochemical beginnings to generate the kinds of complex organisms that populate the Earth. And this has deep consequences for those for whom dreams of omical sugar plums dance in their heads.
Why the 'omics' promises were always doomed to fail, or at least to pale
From the cell theory to Ohno to the very data that our 'omical dreams have yielded in such extensive amounts, we have found that life relies on the protection of redundancy. From genes on up, if one thing goes wrong, there's an ally to pick up the slack. Redundancy means back-ups and alternatives. It also provides individual uniqueness, which is also fundamental to the dynamics of evolution.
Together, these facts (and they're facts, not just wild speculations) show that, and why, we can't expect to predict everything from individual genes or even gene scores. There are many roads to the Promised Land.
It is important, I think, and entirely fair to assert that nothing I've said here has ever been secret, known only to a small Masonic Lodge of biologists exchanging secret handshakes. Indeed, these basic facts have been at the heart of our science since the advent of the cell theory nearly two centuries ago. Genomics has largely just added to what was already known as a generalization about life.
The implicit lesson, of Ohno not Homer, is to Beware of Geneticists Bearing Gifts.
(updated to correct a spelling error in Prof. Ohno's name)