Saturday, September 21, 2019

Ten Years Writing Mermaid's Tales

It's been a decade since I first posted here. 

It's difficult to know what to write to commemorate my blogoversary. Being a mermaid has been so big, so crucial to my intellectual and creative life for these ten years. And I'm a mushy mess of love and admiration for Ken and Anne. It just can't be boiled down to a blog post.

I've been saying that a lot lately. I said it when my friends and I published this paper over the summer and a few sites asked us to write blog posts about it. I encouraged those sites to write them. Me? I can't boil that down to a blog post. Some things need more space than that; refutations of culturally ingrained racist analogies certainly need more space than that. And would anyone actually read it? Do people read blog posts anymore? I don't. 

For years we were posting every single Monday, Tuesday, Wednesday, Thursday, and Friday, sometimes twofers. Now people are recording podcasts. People are writing twitter threads. Who knows what else they're doing but it's less blogging. Maybe they found paying gigs. Maybe all the lapsed bloggers had babies.  

Having my son in 2014 meant I couldn't fart around on my laptop at night or on weekends as much as I used to, and certainly not to produce anything coherent out of the inner workings of my own brain. Children teach you how to be efficient during normal work hours at your office desk, while you're paying wonderful people to keep your adorable bossbaby away from you, an addict.

And then when the stars align and you do post, like this time last fall, and you get kindly reblogged on some high traffic sites, you spend days dealing with the reactions, mostly negative, to the point where Twitter chimes in to say they noticed how overwhelming everything got and here's how to change your notifications settings for the better so maybe you won't delete your account? 

And then, despite the public criticisms of my cognitive capacity, my as-imagined genitalia, and my writing ability, there were my vivid book dreams. I saw a sabbatical coming, thought I could finally do this no-big-deal lifelong fantasy, and decided to channel my writing energy less into the blog and more into the book. I made some progress. 

But looking back at some of my posts since our heyday here, like the one where I used a hot glue gun (wtf), and like the one where I baited for clicks with the title, "My sexed up Jordan Peterson fantasy," you might guess it was the kind of progress on the kind of book that miiiiiiiiiight be a tough sell not just to a publisher (most of whom want normal with only a spark of creativity so that it can sit on the shelf next to proven successful normal-with-sparks)... but that it might be a tough sell to literally anyone not inhabiting the space between my ears. And you'd be right. 

I lost my agent over it. Scared her away. 

Putting my book project in my Twitter bio actually worked to attract publishers' attention. About a year ago an editor asked to chat with me. How cool! Though I told her I wasn't yet looking for a publisher (not until I'd written the whole thing) and that I seriously doubted a university press would be a good fit for my book, she said I might be surprised. Over the phone I read my book's manifesto, just a few hundred words that I pulled together to capture the spirit of the thing, you know, in case anyone called. 
Somewhere along the way I got fed up with evolution—the version that we’ve been telling for over 150 years now. Maybe it was when it declared my life’s purpose to be survival and reproduction. Maybe it was when it said my body is inferior to a man’s. It was definitely by the time some professor pals of mine were monkeying around and a colleague assumed they were mocking black students. We fucked up. Can we please stop fucking this up?
Evolution is so beautiful and weird and wonderful. Everyone, even if they don't feel like I do, deserves the chance to feel that way. But for too long we've alienated people from their own natural history. We're long overdue for a human origins that's fit for all humankind. So, I set out to write one.
At first I assumed I’d have to remove myself entirely from this book. For a most inclusive human evolution, I imagined my author photo as just a handprint. I even toyed with the eponym “Anonymous.” But that’s the dead wrong approach.
Not letting on about how our origins story is written by people, real people, with hearts and souls, sugar and spice, piss and vinegar, people who can’t possibly represent all people is why we still don’t have a story that all people can claim for themselves. The crux of a most inclusive tale is the revelation that diverse approaches to science spawn new stories. Humanity affects how the science is done and how it’s interpreted, how it’s made human. I’ve got to be me to demonstrate that. So, I dropped that arrogant, unrealistic, sciencey urge to be superpersonal and instead got super personal.
It’s time to tell it like it truly is. Human evolution is all about me. And you. We may be strangers but you and I have a story of us. And whether or not we agree on what to emphasize or how to integrate that story into our views of life, our views of love, it is still ours together. And just like you, I am the hero of this tale, one of the inimitable billions and trillions.
The editor's first words were, "You can't curse that early in the book." And then she reminded me that I hadn't actually told her what the book *was*, and could I do that please?

I get it. I'm a Ph.D., I'm an evolution person, and I'm a feminist. You see me and you want the same old human evolution book you've come to expect of Ph.D.s but with a feminist twist. My book is human evolution. My book is feminism. But that thing publishers are thinking is not my book. I had already decided to write the whole thing before actively seeking a publisher, and so this painful conversation encouraged me to keep doing that. 

I'm not writing a book to be sold. I'm writing a book to write a book that is a book I want to write.

But when is there time? You stop blogging and there's suddenly zero additional free time for book writing. You wake up early to write when your brain is on fire and your kid hears you and follows you downstairs and wants to cuddle. You do not write. You cuddle.

The thing about the book dream is ... the writing is the part I relish. The writing itself is the dream. How can I explain. I bawled my eyes out, smiling, while writing my book this summer. This happens to me all the time:
Google: "Joan Wilder crying"
Secondary to that dream is the potential to connect with readers. So dear reader, a book is the most selfish dream in the world for a writer who just loves to write. It will happen, eventually. I'll find the time. I used to have it. I spent it practicing in this lab, right here! And in lots of unposted writing as well. And it will happen sooner rather than later if I can figure out how to stop putting everything I've ever wanted to do into this one book. 

Just because I'll finally have the space to get big, like I don't on this blog, doesn't mean it's big enough.  Like, maybe in this book I won't do that whole teleportation angle/thread/surreal flourish I thought I was kind of pioneering until I read more novels. Maybe I won't make the table of contents a mix tape of tracks that perfectly represent each chapter. Maybe I'll just write what I find myself mucking around in regularly with my students and see how that turns out. And then maybe I'll let that more normal, acceptable book open the publishing door to increasing weirdness, you know, after I've earned it by being less publicly weird first. I think that's how it's done. 

Maybe it surprises you (if you've read this far, which surprises me) that I've got such a writerly perspective on my blogoversary. A big part of my presence here has been related to my research. But so much of my research is driven by my teaching, and teaching is writing. Is that the case for others? It doesn't seem to be a common story. Maybe it is if you consider mentoring grad students to be "teaching" and count the projects and directions you create together. But I mean teaching the fundamentals, teaching textbook stuff, and seeing how it plays out in the public (your students' minds, newspapers, commentaries, cultural movements, medical practices, social media, etc.). These experiences have sparked curiosity, consternation, or both, which manifests itself in writing, research, and more writing. 

My research has been, and continues to be, driven by questions like, What the heck is a Homo and is it even a scientific question to ask which fossil is the earliest Homo? What the heck is an ape and what were the ape circumstances that preceded the evolution of hominins? Are human babies born early? Why are we born when we’re born? Why are babies helpless? Are women’s hips really “compromised”? What do sex differences in hips have to do with the evolution of bipedalism? Could anybody but a human know that sex makes babies or hold a concept of paternity? Did our ability to reason abstractly about reproduction contribute meaningfully to human evolution? Sex differences in stature and the pelvis occur because of developing reproductive organs, so why are the dominant explanations so narrowly focused on male competition and childbirth? Why is evolutionary thinking perceived to have negative societal implications even by those who accept evolution? What can we do to stop this negative association? How do we untangle racism and sexism from evolutionary thinking?  

Because I think best when writing and even better when I imagine Anne and Ken reading it, I've struggled with those questions here as I struggle with them in writing for "real" publications. So much of my science is scholarship. I have to learn about what's published already just to get some clarity on those questions above before I can even think about endeavoring to contribute my own more science-y, data collection and analysis-style research towards answering these questions. This is how my career evolved out of my Ph.D. and postdocs as I took positions in one (NEIU) and then another (URI) fully undergraduate, teaching focused program. This is how my career evolved as someone who loves to learn by reading and writing. And this is how my career evolved as someone who loves to ask questions (often inspired by students in her classes) that she can't easily answer by doing hands-on science herself.

I think a lot about the time I took a position as coach of the Penn State women's crew team during my first year of grad school. I still had a year of eligibility left from rowing at UF, and a good friend and elite rower told me, "They're better off having you as a coach." He had no idea, but at the time this really stung! All I heard was that I wasn't a talented rower. But I was! What he meant was that my coaching had the potential to do more for the team than my rowing ever could. He was right about me. Plus, one rower does not make a boat. A coach of all people should know that. 

This memory resurfaces now and again, not just because of how badly I mangled my friend's wise encouragement, but because it feels like a metaphor for my academic life. I love data collection and analysis. Have you seen me in the field? Have you seen me sprint up and down the department halls when I get a statistics thingie to run and to create an accompanying figure that looks half-decent *and* shows support or not for an hypothesis? 

But that's not been my career for a long time. I've been reading all your science and learning so much and it still makes me feel like running. And I run! But I can't help but feel like I'm not on the team anymore. And, worse, I can't help but feel like I'm a bit exasperating for, among other annoying habits, taking the words people publish literally, and for holding irritating and seemingly ignorant conceptions of "evolution," for example. These feelings, like I'm not part of the science team anymore, are a bit like the feelings that spurred me to quit coaching after just one year; I wasn't mature enough to handle not being a rower. With research, though, I am mature enough to handle not doing the grant-fueled primary stuff. 

Maybe it's because I'm 42 and I have a small child and I have a 3-3 undergraduate teaching load (with at least four different preps per year) and am facing the same for the next 23 years, so I've not just settled into that reality but I have embraced it. Maybe it's because I realize that I have this not-sane jealousy of computers, that they're having all the fun while we just type on them, and that I not-sanely want little to do with that future which I will begrudgingly admit is now, no now, no now, now, now. Stop it. 

In that context, am I ever going to launch the more science-heavy follow-up to any of my scholarly deep-dives with any new-to-me science-y methods, many if not all requiring some form of futuristic computer whispering? No. I know this about myself. I appreciate the hell out of my collaborators and I can only ask of them so much. And I can't go back to grad school and learn, instead, primate physiology or developmental neuroscience or creative writing (the one that got away... for now). 

Right now I'm doing the things that I think, in my capacity and given my context, have the most potential to make the best contribution: Teaching and writing, and (for both) learning constantly and sharing that with others. If it weren't for the Mermaid's Tale Blog and the connections I've made because of it, I don't know if I could have evolved to be comfortable with this unplanned outcome of my Ph.D. in fossil ape feet, let alone to cherish it like I do. 

I have often wondered, am I a scientist or a biological anthropologist even though I don't do much of the primary data-driven original research anymore that I was trained to do? Am I either of those -ists if I prefer to ask questions that are best, at least for me and for now, answered by reading and writing? Will I morph even further away from those -ists if I ramp up my creative writing?

Like many questions, this is not one that science can answer. Literature, however, always has the answer to a question like that, if you're lucky enough to find it. 

In The Seas by Samantha Hunt, the protagonist, a fellow mermaid and fellow word lover, ponders the many dictionary definitions of "blue" and says,

"If one word can mean so many things at one time. I don't see why I can't."

Friday, September 6, 2019

Epidemiology, biomedical 'causation', and my widow-maker: how to make sense of it, and what terms to use

As I've posted previously, I have been recovering from multiple-bypass heart surgery.  I had some angina, the vague superficial chest pain that is a symptom of impending heart attack (or, more properly, coronary artery blockage).  Fortunately, I was educated enough to recognize that this chest pain wasn't from my doing a new kind of exercise, so I went to the doc and--to make a long story short--was sent right off to the hospital for heart bypass surgery (grafting around the clogged coronary arteries).

The radiography showed that at least some of my heart arteries were clogged--with whatever radio-opaque goop, presumably including cholesterol, and by whatever clogging mechanism.  These causal facts are, as I understand things, complex and not completely understood, but the upshot was clear: surgery.....or else!

Now, the doctors would say that, given this evidence, I was at high risk of potentially lethal heart disease.  I'm sure that, had the opportunity been there (and it may come in some future doctor's appointment), I would be chided--or scolded--for my bad diet, too much cholesterol, etc.  It will be assumed that my voluntary lifestyle choices caused my blockage and my need for preventive bypass surgery.  Bad boy!  Bad diet!  Tsk, tsk, tsk....

But is that right, or might it be the opposite of a more serious truth?

What is bad behavior, health-wise?
I am 77.  This is beyond the usual 76-ish life expectancy for US males (searching the ad-laden web to find data has become mainly a challenge to wade through the relentless commercialism).  So, my lifestyle cannot be viewed as bad behavior in this respect.  Indeed, I have already lived longer than half my birth cohort!  So perhaps my diet and whatever else can, or should be viewed as having been protective.  After all, I was symptom-free until after my expected lifespan.

It is very difficult to understand what 'risk' means in such regards.  If my lifestyle led to my artery becoming clogged up, but it didn't happen until after I'd out-lived my average peer, can I legitimately think of that lifestyle as having been protective rather than risky?  We all have to get some final disorder at some point, so is the absolute cause the relevant fact, or the relative one?  How can we decide such questions, if indeed they are meaningful ones that can even have meaningful answers?
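These competing readings of 'risk' can at least be made arithmetic.  Here is a toy life-table sketch--every number in it is invented, not real actuarial data--illustrating why outliving the at-birth life expectancy doesn't mean one's time is up: the expected age at death *conditional* on having already reached 77 is always higher than the unconditional at-birth figure.

```python
# Toy sketch (all numbers hypothetical, not real actuarial data):
# life expectancy at birth vs. expected age at death *given* survival to 77.

def expected_age_at_death(qx, from_age=0):
    """qx[a] = chance of dying during age a, given alive at age a."""
    alive, expectation = 1.0, 0.0
    for age in range(from_age, len(qx)):
        deaths = alive * qx[age]
        expectation += deaths * (age + 0.5)  # assume death mid-year
        alive -= deaths
    expectation += alive * (len(qx) + 0.5)   # anyone outliving the table
    return expectation

# A crude, made-up hazard that rises with age:
qx = [min(0.7, 0.0005 * 1.08 ** age) for age in range(110)]

e_birth = expected_age_at_death(qx)       # the unconditional, "76-ish" figure
e_at_77 = expected_age_at_death(qx, 77)   # conditional on reaching 77
# e_at_77 exceeds e_birth: the conditional expectation always does.
```

The point is not the particular numbers (they are fabricated) but the shape of the argument: having survived to 77 already screens out every earlier death, so the same lifestyle can look 'risky' against one baseline and 'protective' against another.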

If my behavior (for whatever reason, including just plain luck) led to my surviving in very good health, except for one weakest link, then does that link suggest I've behaved badly, or does my overall great state of health suggest the opposite?  More to the point, how can such questions even be answered in a meaningful sense?  They seem meaningful....until you think a bit more carefully about them.

The philosophical quick-sand doesn't stop there.  If my arterial clog would have led to a relatively quick death--not a 'premature' death at my age!--but saved me from some worse, more prolonged or debilitating fate, can we seriously view that as preventive or protective, with me now facing those dreadful fates?

When we have competing causes and inevitable mortality, we have to view the causes, and what causes them, in a rather different light.  That doesn't mean there are consensus, much less easy, answers.  But it may mean that rules for 'healthy' behavior are not so obvious as they seem to be.
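The competing-causes point can also be sketched numerically.  In this toy two-cause model--all hazard rates invented--eliminating the cardiac cause entirely buys only a modest gain in expected lifespan, because the remaining cause absorbs much of the 'saved' risk.  This is the textbook competing-risks effect.

```python
# Toy two-cause competing-risks model (all rates hypothetical).
# Each year there are independent chances of dying from cause A
# (cardiac) or cause B (everything else). Remove cause A entirely
# and see how much expected lifespan actually changes.

def life_expectancy(hazards, horizon=110):
    """Expected age at death, given per-cause annual hazards h(age)."""
    alive, total = 1.0, 0.0
    for age in range(horizon):
        survive_year = 1.0
        for h in hazards:
            survive_year *= 1.0 - h(age)   # survive every cause this year
        deaths = alive * (1.0 - survive_year)
        total += deaths * (age + 0.5)      # assume death mid-year
        alive -= deaths
    return total + alive * (horizon + 0.5)

cardiac = lambda age: min(0.5, 0.0002 * 1.09 ** age)  # invented rate
other   = lambda age: min(0.5, 0.0004 * 1.08 ** age)  # invented rate

both_causes  = life_expectancy([cardiac, other])
cardiac_gone = life_expectancy([other])
gain = cardiac_gone - both_causes  # modest: deaths shift to the other cause
```

Whether that shift is 'protection' or merely a trade of one final disorder for another is exactly the question the paragraph above leaves open.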

Friday, August 23, 2019

Worse than The Bridge of San Luis Rey!

Thornton Wilder's 1927 book The Bridge of San Luis Rey is a classic with a lesson for today that too many know but are in denial about.  Wilder's book follows the lives of 5 different, generally unrelated people, who for a variety of reasons all end up at a rope bridge over a huge gully in the mountains of Peru.  Unbeknownst to them, they will be on the bridge when it fails, dropping them to their deaths.

Why were they there, a monk asks.  Was there some inevitability about their paths?  Were they connected by some unknown factors?  What, if anything, is the message of, or the lesson from, their fates?  Even investigating the possibility of fateful forces cost the monk his life in the dour Church of his time.

But what about our time?  Are we, in a symbolic sense, headed for doom at a bridge that will fail, but to which we seem inevitably driven?  There are reasons to think this could be the case.....

Ecological self-made disaster
The world's natural resources are fragile, destructible, and limited.  In particular, good arable land and other similar resources can be exhausted or destroyed by over-use and mistreatment.  And these factors can be brought on by the demands of an ever-growing population.  Furthermore, human short-term selfish thinking makes it difficult if not impossible for us to restrain ourselves.

In this case, the problem is global warming because of our relentless burning of fossil fuels.  This is made worse by our unconstrained destruction of the Amazon forest and so on, that are the 'lungs' of the earth in this regard.  We pollute the seas, over-fish, and have agricultural habits that destroy the long-term viability of the soils.

Glaciers melt, seas rise, and this inevitably will drive people away from major population centers that, historically, were located by transport media--seas, rivers, and lakes.  If these overflow their banks, massive numbers of people will head inland....but 'inland' is already densely occupied!  And these climate-related changes are well under way and, given the nature of things, their momentum is perhaps unstoppable.

Careless resource consumption and culpable over-population are responsible.  The warning signs have been around for decades (e.g., the widely read book The Population Bomb by Paul Ehrlich in 1968--now 51 years ago).  We now commonly have climate-change warners going around trying to tell people of the dangers (I'll note that even these authors fly all over, contributing needlessly to the problem, to give talks at meetings, autograph their books at booksellers, and so on, though Greta Thunberg is a notable exception).  The hypocrisy, or the failure to take the problem seriously even by those who know better, is rather striking.  Or is 'frightening' the better word?  We simply can't seem to restrain our individual greed and short-term pleasures, even when faced with an impending catastrophe.

To me, this is like passengers enjoying a leisurely, fancy meal, with fine wine, in the dining car of a train that is speeding toward a bridge that's out--as in San Luis Rey.  But unlike the book's characters, we know what the story is.  Yet, even so, we cannot seem to impose the obviously required self-restraint in consumption on ourselves.  It's due not only to relentlessly selfish, insatiable commercialism or capitalism, but to our own individual, short-term selfishness as well, which we just seem unable to restrain.  Maybe our economic system makes that hard or impossible.  But it doesn't make it not a problem!  We know the issues, but can't seem to react properly.

Can we do anything about it--in time?  Will we?
Your guess is as good as mine.....and perhaps you're more optimistic, less a student of history, and you'll be right.  But based on human history, the odds are against that.  Disastrous societal strain seems inevitable.

Are we fated to meet at San Luis Rey, and plummet together into the rocky abyss?  Wilder's book is worth reading.....and thinking about.  But in our case, the chasm we're headed so rapidly to won't just take out 5 unlucky travelers: it will destroy billions of lives!  And it will do that with slow trauma: famines, wars, displacement, and other disasters as an essential aspect of the process.

Of course, if it pares back the human population to some few million worldwide, the rest of Nature will rejoice, because it will, once again, have a chance to evolve in a more properly orderly way.

Thursday, August 22, 2019

Beauty is bad for physics---and a blemish on life sciences, too!

We naturally like tidy answers to our more fundamental, or even troubling, questions about Nature and existence.  In genetics we want to find the genes 'for' important traits so we can engineer the nasties away--even if we know some other nasty will arise, even if later in life.  We want the cosmos to follow neat, universal rules--to make a universe, after all, that makes sense.  Tidy.  Understandable.

One thoughtful physicist (who now describes herself as an 'ex-physicist') is Sabine Hossenfelder, author of a very good book called Lost in Math, and many good articles.  Here is one from a recent online magazine issue.  In this she asks whether the universe is 'beautiful', or whether physicists should expect it to be that way.  She basically says 'no': the universe is the way the universe is, and as scientists we should try to understand it that way--on its own terms, not those of our wishful thinking.  It need not be 'beautiful', as she puts it.

In the life sciences, we are enamored of projects, like physicists' colliders, that are too big and expensive to stop once they have been started--good for the science business, whether or not they are really as good and efficient a way of investing in science itself.  And, like physicists, we want the solutions to be neatly mathematical and useful for prediction--indeed as orderly and 'beautiful' (and universal) as physics sometimes claims to be.  In the human life-science case, we are promised that our DNA sequences can predict disease or other traits, including intelligence and so on.  We want to believe that these traits are somehow built into our genomes, as unfolding (evolving) inevitabilities during our lifetimes, and hence predictable with 'precision' genomics.

It sounds like religious promises of eternal life, does it not?  Indeed, that we know the promises are patently false--at best inaccurate to an unknowable extent, since every individual is genomically as well as environmentally unique--doesn't seem to slow down the juggernaut of vested interests exploiting that point of view.  If you think about it, it closely resembles the threats some churches make when passing the plate among those wanting their souls saved for eternal bliss.

Evolution is in a sense and by its nature very non-beautiful: local, sloppy, inconsistent, and so on.  It has to be that way.  It can't be too predictable.  It depends for its progress on mistakes and flukes--we can think of them as beauty marks some of the time.  But mostly they aren't.  Nonetheless, that is, ironically, the beauty of it all.  That is how complex organisms can evolve in complex environments, and those successes include us!

What beauty!  A Remarkable Unbroken Path
The lack of general predictability of disorders, especially late in life, that we know largely depend on living conditions as well as inherited (and somatically mutating) DNA, is what makes life what it is.  That we must come to an end is not a cheery thought, but were it not for that we'd have had no beginning!

And, if you think of it correctly, we--you, I, and all whom we know--are the so-far eternal descendants of the very origin of life.  There is, for each of us visiting the earth today, an unbroken path back to the origin of life.  What could be more beautiful than that?

So there is a place for beauty.  It is, literally, in the eye of the living beholder, not in a test tube.

Wednesday, August 21, 2019

Locality: life's Newtonian equivalent of 'universality'

In 1687, in his groundbreaking Principia, Isaac Newton made a fundamental assertion--a fundamental assumption--about reality: whatever you observe in some location, such as your lab, will apply everywhere else in the cosmos.  He was talking about the nature of the physical universe, following similar views of Galileo, who quipped that the laws of nature were written in the language of mathematics.


If the universe was God's creation, its vastness would be of a kind, similar everywhere.  That notion is what stands behind the idea of universal 'laws' of Nature.  The 'laws' were not passed by an Angelic Congress, but are simply God's way of organizing physical existence--all of it!

These laws, discovered by classical physics, are deterministic and universal, not probabilistic, not haphazard.  I don't recall anything about what Newton thought of probabilities other than a debate with Pepys about gambling, but that was not central to his cosmic worldview, which was essentially deterministic--which is what you might expect from an all-knowing God.

But subsequent science has revealed at least two key additions.  First, some of Nature seems inherently probabilistic.  Here we are not restricted to measurement issues, because probabilities seem inherent in quantum-level phenomena.   Secondly,  probabilism means that conditions are at least to some extent contingent on what did happen, relative to what might have happened.  This means that predictions can only be probabilistic at some scale; whether that means the cosmos' states and future are unknowable or not is not for someone like me to opine about!  Whether the probability values are, somehow, fixed, and if so how that is, are interesting and, I think, important questions.

Life is part of the universe, but its fundamental laws are of locality, not universality
Life is a chemical and hence physical phenomenon.  But life in the deeper sense is about how these molecules are organized, and this seems fundamentally different from the kind of molecular evolution seen in stars.  Each star seems, from what one reads, to be similar at the chemical or atomic/subatomic level.

But life evolves by local conditions.  Mutations arise locally and are filtered by selection and chance locally.  Successful mutations and genotypes spread by migration from their source, but are then always subject to genetic and environmental conditions in their new localities.

Locality also is the central fact of organismal organ and system diversity.  In this case, it is local conditions determining gene activation and expression.

But, in a sense prisoners of 19th century physics success, the long shadow of Newton and Galileo, we seem to hunger for laws of life that apply, as the slogan goes, to 'All of us' and that leads to genomic medicine that we are told can be known with some kind of presumably universal 'precision'.  These are, to me, irresponsible misrepresentations made by Francis Collins and NIH, perhaps as their strategy to pry mega-bucks from Congress--because they are naive relative to the biology they promise to understand.

But if we overlook the truth and live in slogans, does this undermine the very science we want to nurture?  The assumption of universality, to me, is fundamentally misleading about how life works and how our genotypes got here.  Whether or how we can use this understanding to develop appropriate medications and so on, is a separate set of very important questions, about which I'm not capable of making judgments....

Tuesday, August 13, 2019

Big Data, Big 'omics-everything....and the sorry state of biomedical research support

People on the federal grant take often speak of 'Big Data' with a nearly lascivious joy.  Big Data is code for biomedical research projects that once begun are too large and costly to terminate, whether or not they did, or do, deliver seriously important truths--truths worthy of their cost.

But when careers depend on it, Big Data and the associated Everything-'omics are as much a fad and ploy for job security for medical school researchers as they have anything to do with science.  Huge open-ended projects need claim nothing that can be rigorously tested, and such fishing expeditions are likely to find at least something that can be blared to the public as major discoveries.  We see it every day, it seems.

One can criticize the Big Hunger for Big Data projects, but they are survival tactics for biomedical researchers, many of whom only get their salaries from external funds, and for those without any actual ideas or means to actually find something profoundly new.  Those 'means' should include freedom from regular bottom-line accountability: hard problems can't be solved on some material-based schedule.  That is because the biological world is complex!

There is, as a rule, not one gene, or even a consistent few genes, that 'cause' traits we care to understand.  Environment--a complex and vague term--interacts with organisms to yield their traits. This is how, via evolutionary processes, we got here in the first place.  Interactions, redundancies, and other complexities have been built into our biology for countless millions of years. Indeed, complexity is protection against vulnerability for survival, and organisms have obviously evolved complexity for that among other reasons (including that redundancy and complexity make room for new innovations to evolve without lethally threatening current systems).  So we should be surprised when we don't find complex redundancy--indeed we have clearly been documenting its universality. And we should stop promising that we'll find the 'genes for' complex traits, like heart disease or obesity etc.

But complexity is not good for the research system
The current research funding system started roughly after WWII, when funds were plentiful and the army of investigators and their staffs and administrators was far smaller.  This is an historical, sociological fact rather than one about the causative nature of the world that we want to understand.

Research used to be much more about solving scientific questions by forming and testing hypotheses, experiment, and so on, than about salaries, overhead, and the status of having a Big research group.  But many biological problems are complex--really complex, not just as a self-promoting adjective--and addressing them should involve patience as well as funding that is stable and doesn't rest on rushing stories to the news media and other means of hyping findings.  Careerism as we see it today is not compatible with this, and the current gross waste on the fads for Big Fishing Expeditions shows it.

Science today has become pretty much a self-sustaining System, from bureaucrats at all levels to investigators who must sing (i.e., claim Big Results) for their supper.  It sows dishonor and misrepresentation (as well as the occasional desperate scientific fraud).  We all know this but somehow seem powerless to redress things and restore research to the realm of science, where the science is the substance, not the window-dressing for fiscal needs, as it so often is today.  When the system is hyper-competitive, who can afford to confront it?

Things can be changed--even if it won't be easy
Reform is never easy, as established conditions are very inertial.  Vested interests, like grant-dependent salaries, must be faded out or even, sometimes, uprooted.  But over recent decades we've clearly allowed those conditions and empires to be built, indeed we often were part of their being built.  But it is time for deep, difficult reform.  We need to fund Big Ideas, Big Efforts at Hard Problems, not just facile sloganized Big Data or 'omics-everything.

But is there the will--and the bravery--for reform?  Where is it, or where can it be nurtured?  The unrest must be stirred to action.

Sunday, August 11, 2019

Who, me?? Why did I clog my 'widow maker'? [on medical cause and effect...and how we know, if we know]

So, having just returned from and now recuperating from coronary bypass surgery, I have to ask the 'complexity' question--a very personal one in this case: Why me? I've lived a physically and physiologically vigorous life.  My diet may not have always been the very best for cardio health (though, for reasons we've discussed here many times over the years, it's not completely clear what that diet should actually be), but it wasn't particularly bad, given what's thought these days to be a "healthy" diet.  

The surgeon who remodeled me at Penn State's fine medical complex in Hershey said he knows the risk factors in a population but couldn't know why any given individual developed clogged coronary arteries, nor which artery would be affected.  His job was to replace arteries, not explain them, one might say.  So, he didn't even attempt to tell me why I was now in need of bypass surgery.  

As he said, there are five known major risk factors: obesity, unhealthy diet, high cholesterol, genetic predisposition, and smoking.  Yes, diabetes and high blood pressure are risk factors as well, but they are correlated enough with obesity that perhaps he considers these two conditions to be side effects of it.  In any case, these risk factors have been determined by looking at associations between possible causal variables and heart disease in populations.  The resulting statistics describe the population; they do not identify specific high-risk individuals within it.  Indeed, some people with heart disease have all the risk factors, some have a combination of a few, and some have none.  And even then, it's not possible to say which was the cause of the disease in most individual cases.   
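The gap between population statistics and individual prediction can be made concrete with a toy simulation (all numbers are invented for illustration, not clinical estimates): give each simulated person up to three binary risk factors, let each factor double a small baseline disease probability, and then look at who actually falls ill.

```python
import random

random.seed(42)

# Toy model, illustrative only: three binary "risk factors", each of
# which doubles a small baseline disease probability.  The factors are
# real at the population level, yet cannot say which individual will
# fall ill.
N = 100_000
BASELINE = 0.02        # disease probability with zero risk factors
RELATIVE_RISK = 2.0    # each factor multiplies the probability by this

population = []
for _ in range(N):
    n_factors = sum(random.random() < 0.3 for _ in range(3))
    p_disease = BASELINE * RELATIVE_RISK ** n_factors
    diseased = random.random() < p_disease
    population.append((n_factors, diseased))

cases = [f for f, d in population if d]

# Population-level signal: cases carry more risk factors on average...
mean_all = sum(f for f, _ in population) / N
mean_cases = sum(cases) / len(cases)
print(f"mean factors overall: {mean_all:.2f}, among cases: {mean_cases:.2f}")

# ...but the averages say little about individuals: some cases carry
# no risk factors at all, and most people with all three stay healthy.
zero_factor_cases = sum(1 for f in cases if f == 0)
triple = [d for f, d in population if f == 3]
healthy_triple = sum(1 for d in triple if not d) / len(triple)
print(f"cases with zero risk factors: {zero_factor_cases}")
print(f"triple-factor people who stay healthy: {healthy_triple:.0%}")
```

The direction of the result doesn't depend on the made-up numbers: the averages separate cleanly while individual outcomes scatter across every factor count, which is exactly the surgeon's point.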

I have none of these risk factors--though I could make up a story.  I smoked when I was young; my father had a pacemaker when he was old, but he lived to 99.  Still, I have done vigorous exercise my whole life, thinking that was my "get cancer program" since it meant, I thought, that I would not go out with a coronary.  What caused my artery to clog?  Indeed, why in my case was the clog in an unstentable artery location, and hence required major surgery? 

This brings up, again, the question of whether one's individual risk can even be known with any sort of 'precision'. Or is that an illusion? Is it a culpably false promise made by the calculating Dr Collins at NIH, to get NIH funding, rather than to give the public a realistic understanding of what we know and what we can hope to know based on research investment of the type he favors?

How, based on current methods of science, can it really be individual? What kind of information would that require, just considering actual, i.e., past effects, assuming they could really be ascertained to any reasonable measurement standard?  What would you need to consider?  Diet, exercise, personality (temperament, for example).  Climate?  Profession? The effects of war, drought, epidemic?  Genes, even?

Of course, the gross and inexcusable BS of promising 'precision genomic medicine' based on very costly, open-ended genomic (and other 'omic) data collection enterprises is culpable.  It is an often openly acknowledged way of getting, and keeping, mega-funding without having any real ideas (and understandable since medical schools culpably don't pay faculty salaries or basic research costs as part of their jobs).  Focused science has chances of finding things out; blind data enumeration, far less so--and what we've done of that so far shows this quite clearly.

We often say 'family history', and clinically this may be the most useful piece of predictive information, but what does that actually explain?  Did Dad or Aunt Jane have the same trait because of genes, or because of their shared family habits and lifestyles?  How could you really tell?  A surgeon need not care, as their job is to fix the clogged pipes, and if heart disease runs in a family the physician will treat his or her patient as high risk.  Still, to prevent this sort of thing, we need to know what causes it. 

This is a central biomedical question!  It is hard enough to know, much less accurately measure, all the factors in life that might in this or that way be a 'risk' factor for a given disease, like clogged coronary plumbing.  Is it a delusion to think we could identify, much less measure, all the factors?  If, as seems obvious, there isn't just a single factor, and probably everyone's exposure set is different (and their effects need not be 'additive'), how on earth can we even know how well we are measuring, or ascertaining, such factors?

And, if we can do this, it only applies directly to current cases and their past lifestyle exposures.  But what we would like to do, for individuals and for public health, is to predict the future in order to lower risks.  However, there is no way, not even in principle, of knowing what future exposures will be, not even for populations.  Diets and lifestyles change in ways we cannot predict, nor can we predict major future events--climate, war, pestilence, food types and availability, etc.--that would be highly relevant.

So what should we do with our understanding of these unpredictable factors?  Perhaps just level with patients and the public, and stop using the public to endow a particular, and particularly costly, part of the university research empire.  Maybe a return to focused, hypothesis-based research--actual science--in my view.

Friday, August 9, 2019

Laura! Laura! Please don't do this!

I was, for the first time ever, a hospital inpatient a couple of weeks ago.  As it turned out, because I knew what angina pectoris is, I went to the docs, and they sent me to the hospital where, to make a long story short, I had bypass surgery.  It's not been easy, but at least I had not waited too long!

I seem to be doing fine, given all that.  I hope so; but not everyone in the hospital is as lucky.

In one part of my odyssey, I shared a room with an older man who, annoyingly, had Fox "News" on at loud volume, though he was relatively insensate and not at all watching that bigoted claptrap.  He moaned a lot, and, in his confusion, repeatedly called for his daughter: "Laura!  Laura!  Don't do this!  Please, oh, please, don't do this!"

In my time sharing that room, I overheard the doctor telling the man's relatives, quietly, that his leukemia was not responding to treatment.  The man himself clearly was not being told this, and/or was far beyond comprehension.  At one point, the ashen old man was wheeled out by a nurse, to another room to leave me some peace and quiet (I think the nurses realized the impact of his relentless moaning on me).  Then, a few hours later, I was off to my own adventures--heart bypass surgery.

These kinds of experiences are, I guess, in store for most of us, as we meander through the savage realities of aging, even if it is because of the medical care system, our well-off lifestyles, hospitals and treatments that we can have those experiences at all.  It is not easy to think about, especially when one's self is involved, not just relatives or neighbors or somebody else, or newspaper stories or science-journal statistics.

I am not sure what to write about in this context.  One wants treatments, and hospitals are fine locations where the treatments can be obtained.  Does one also want to know the whole truth about what the doctors find?  Who should be told?  Who should just be given various sorts of comforting (even when or if false) words?  Who decides?

I hope that, this time at least, I am on the road to some sort of comfort and success, even though at my age (late 70s), it is obvious that the future can't only hold good experiences.  In the hospital, one is forced to see the future, and the approaching end.  This is not a new thought, and we all know life, indeed, life on earth and even the earth itself, are temporary.  But it is a thought to keep in mind, today, and each day.

And the lesson, or my message, for readers of this post is: value today, value those you care about and who care about you.  All are precious!  All are temporary.....

Thursday, July 11, 2019

Human races are not like dog breeds

If you googled for information comparing race in humans to dog breeds, we wrote this open access, peer-reviewed article specifically for you. There is even a glossary at the end of the main text, and some of the sources we cite are also open access. Thanks for reading.

Are human races like dog breeds? No.
Are human races the same as dog breeds? No.
Are human races just like dog breeds? No.
Are human races basically dog breeds? No.


Friday, May 10, 2019

The music of life--more than a collection of notes

My composer friend wants to be quite modern about creating beautiful music.  He doesn't like to use computer programs for composing, but he has devised another 'modern' way to compose, given that, in writing a piece, he often changes his mind.  Scratching out notes on paper to replace them with 'better' ones makes for a real mess on the working pages, and he'd then have to transcribe his work onto new pages, which itself introduces room for mistakes.  So he had an idea.

He purchased a set of notes and musical symbols, printed individually on a kind of flexible plastic.  Copies of each possible note and notation element were in boxes in a little tray.  As he composed, he merely took each required note from its place in the tray, and used its static electricity to place it on a page with printed staff-lines.  If he changed his mind, it was easy to remove or replace a given note, and put it back in its box in the tray without generating an inky mess on the page and having to keep starting over to make his work-in-progress legible.  

But there turned out to be a serious, indeed even tragic, problem.  He liked working in his studio, right in front of a window giving him an inspiring view of his garden.  But, after days of work composing an ethereal and beautiful piece, a gust blew through the window, riffled the pages, and shook all the notes off the page and onto the table!  What a scattered mess!  And what a heartbreaking loss of all that work!

Of course, you could say that the composition with all its beauty was in some sense still there, right before him: all the required notes were indeed still there--every one.  But they were in a pile, no longer with any order from which he could reconstruct the composition just by picking the notes up and placing them back on the page.  So, it was literally all there--but none of what mattered was!

As my composer friend told me this story, it occurred to me that this was analogous to the 'pile' of DNA letters (As, Cs, Gs, and Ts) that is found by sequencing people with and without some trait, like a disease.  The letters differ greatly among individuals with the 'same' trait, because they don't have the trait for the same genetic reason.  And the sampled individuals' genomes vary in literally countless ways that have nothing to do with the disease.  Unlike the score, the 'letters' are still in their original order, but genes don't make a score as far as we are concerned because, unlike an orchestra, we don't know how to 'play' them!

In a sense, each person we see who is playing the same tune, so to speak, is doing so from a different score.  Some shared notes may be involved, but they are all jumbled up with shared, and not-shared, notes that have nothing to do with the tune.
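The many-scores-one-tune idea can be sketched in a toy model (my own construction, purely illustrative): suppose a trait appears whenever any one of three redundant two-gene pathways is intact, and count how many distinct six-gene 'scores' play that same tune.

```python
from itertools import product

# Toy redundancy model, illustrative only: a trait appears if ANY of
# three redundant pathways is intact, and a pathway is intact only if
# both of its two genes work (1 = working, 0 = broken).
def has_trait(genome):
    pathways = [genome[0] and genome[1],
                genome[2] and genome[3],
                genome[4] and genome[5]]
    return any(pathways)

# Enumerate every possible six-gene genotype.
genomes = list(product([0, 1], repeat=6))
with_trait = [g for g in genomes if has_trait(g)]

print(f"{len(with_trait)} of {len(genomes)} genotypes show the trait")
# Two trait-bearers can share no working gene at all: for example,
# (1,1,0,0,0,0) and (0,0,0,0,1,1) both show the trait, yet every gene
# that matters to one is broken in the other.
```

No single gene is 'the' cause here: knocking out any one gene leaves the 'tune' playable from another pathway, and that is precisely the jumble a genome-sequencing comparison then has to sort through.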

And yet we are widely promised, and widely being trephined to pay for, the idea that looking through the jumble of genetic 'notes' we can predict just about anything you can name about each individual's traits.

Indeed, unlike the composer's problem, there are all sorts of notes that are not even visible to us (they are called 'somatic mutations').  We yearn for a health-giving genomic 'tune', which is a very natural way to feel, but we are unable (or, at least, unwilling) to face the music of genomic reality.

And, of course, this mega-scale 'omics 'research' is all justified with great vigor by NIH, as if it is on the very verge of discovering fundamental findings that will lead to miraculous cures, indeed cures for 'All of us'.  At what point is it justified to refer to it as a kind of culpable fraud, a public con job?

By our bigger, bigger, bigger approach, we have entrenched 'composers' trying to read scores that are to a great extent unreadable in the way being attempted.  We are so intent on this, like rows of monks transcribing sacred manuscripts in a remote monastery, that we stay committed to something we have every legitimate reason to know isn't the way things are.

Thursday, April 18, 2019

Brains, not brawn, for college!

It has long been a secret--not!--that American football is not compatible with having any brains left to do college work.  Now there is yet another story, in the New York Times, this time about the University of Colorado's brain-injured football players.  This sport is as savage as the Roman Coliseum 'sports' were two thousand years ago, and, yes, humans may be slow learners, but that is far too long for us to get the message.

We here at Penn State have the world's third largest football stadium, a grand stage on which to observe the brain damage (not to mention various other breaks and bruises) of our 'students'.  Of course, some of these players actually are students in a serious rather than merely technical sense of the term.  How many leave here with fewer IQ points than when they came is not known.  At least some do major in actual college-level subjects, and many are very fine students (as I can say from direct personal experience).

But it is time to change, NFL or not.  Let those who want to gladiate for money in the NFL get their brain-damaging preparatory experience elsewhere.  We are supposed to be universities, places of classroom and lab learning, not brute brain bashing.  Football may have been safer decades ago, before training methods improved to make these guys huge monsters in size and strength.  It's not their fault, of course, but ours--the adults at universities.  We brought this about, and there is one reason: we wanted money from attendees, alums, TV networks, and so on.  But universities should not operate on the greed metric; they should stand for something higher, something better.

Indeed, we can have it both ways:  If we moved soccer 'football' to the stadiums, there would be a lot of grumbling from alumni, and maybe a few years of lower donations (mainly to the athletic department, one can surmise) and lower beer and hot dog sales, but eventually they'd all be back, cheering their lungs out for the Nittany Lion soccer team.  And they could have many more games--for both men and women--in a season.  It would eventually pay off.  Well, TV revenues might drop a lot for a while, but if other actual 'universities' followed suit, everything would recover, except the players.  They would not have to recover, since they'd have far fewer injuries (and protective headbands could be used to guard against damage during headers).  And they could take more, and more substantial, college courses while doing this.

It's worth thinking about, for those readers who still have their brains intact to do such a thing.

Wednesday, April 3, 2019

Copy cat!! How 'bourgeois' we've become!

In the sad way that science has become ever more bourgeois, Nature, itself now largely a checkout-counter mag, has a feature editorial on plagiarism (p 435, 28 March 2019).  The author, Debora Weber-Wulff, seems to specialize in sleuthing academic verbal cheaters, as if it were a new profession in itself.  She goes over the various software developed to detect plagiarism in professional and student papers, and evaluates the programs and the detection problem itself.  Commercial, profiteering, competing software--more than one package--to detect academic cheaters!

The commentary mentions strategies that authors use to get multiple pubs on the same subject, and even seems to suggest that publishing an article from, or part of, your doctoral dissertation is a kind of plagiarism (who in recent memory has searched for or found, much less read, doctoral theses after the defense?).

And now, the most bourgeois thing of all, in my opinion: there are conferences on academic integrity, and even they have their own plagiarism, as the author relates!  And as part of the new class system even in esoteric academia, she notes that those who were detected were "demoted" to mere posters.  Surely I've mis-read this commentary.  Surely!

Somehow, this seems just another routine story about academic life.  Since it's basically gossipy, it takes a place of honor in Nature.  It's a kind of 'how-it's-done' review, as if cheating were as common as, say, making espresso.  Can you imagine that such a thing would largely have been unthought of not many decades ago?  It's true.  I was there (and I didn't plagiarize!).

There must always have been some plagiarism, since there are always rogues.  There have long been rewards for publishing much the same paper in several different places, to reach different audiences in the days before web-searching.  But it was likely much easier to detect real plagiarism, which was doubtless far less prevalent in the old days.  At least that was my experience in my particular old days.  There was no need for competing companies to profiteer by selling plagiarism-detecting programs!  That almost institutionalizes cheating as a cat-and-mouse part of modern careerism, and a commentary like Weber-Wulff's that describes plagiaristic ploys almost helps one do it!

One reason for this situation, if not the main one, is that far less was being published, less often, in fewer journals, and by far fewer players in a given academic arena.  The players were much better known to each other, far fewer published more than an article now and then, and most readers knew the relevant literature (and each other).  The pace was slower.  The Malthusian academic overpopulation didn't exist, so the competition was less intense (even if there were of course Big Egos competing).  Publishers were mainly non-profit, careers less intensely grant-dependent (and grants were easier to get).  The competition was more about ideas and actual substantive impact, and far less about academic score-counting (citations, publication counts, impact factors).  Less pressure to survive, and less pressure to cheat.  No first-semester graduate seminars on 'grantsmanship'.

....and no need for Nature to have a feature commentary on how to catch academic cheaters.

Just after this was posted, a commentary appeared in Nature on a major academic fraud case:

If we really want to encourage honor and honesty in science, we need to look not at the science but at the science culture--the money-driven, competitive, frenetic arena--and Nature and its proliferating for-profit satellite publications are a culpable part of the problem.  We need to cool down the temperature of the research industry.  But to me that requires reducing the amount of selfish gain available--to investigators, journals, universities, equipment suppliers--the academic-industrial complex, to pick up on Dwight Eisenhower's long-ago warning about the military's similar excesses.

But where is the will to do this?

Monday, March 25, 2019

Human Genome Diversity: important to recognize, but not a new issue

A couple of decades ago, several of us, led by Luca Cavalli-Sforza, Marc Feldman, Ken Kidd, and others (including yours truly), got together to suggest a worldwide sampling of human genetic diversity that would specifically include the diverse 'anthropological' populations (traditional tribal groups who still existed but were being surrounded or incorporated -- or worse -- by the growing, large agricultural/industrial civilizations).  The idea, called the Human Genome Diversity Project (HGDP), was to collect DNA samples from hundreds of populations worldwide who would otherwise be un- or under-represented in the available data on human genomic variation.  The large agricultural/industrial populations are swamping (if not literally exterminating) these more ethnically aboriginal peoples.  Yet their pattern of genomic diversity is the one from which the dense populations derived, and its variation may tell us about the origins and nature, and perhaps the adaptive fitness, interactions, and so on, of the larger pan-human population into which 'we' grew.

The idea of a global HGDP was stifled by two things.  One was attacks by political opportunists (played up culpably by the media) who felt this global sampling was demeaning to the aboriginal populations, or that it was designed imperialistically to profit from those peoples by patenting findings.  The other was the hungry economic maw of the human genome sequencing project, then in progress and preemptive.

The upshot was that the HGDP was never funded.  Luca donated the set of global samples then available to him to the France-based CEPH, where they were given the HGDP name (that is still the case, though I think it wrong, because the data set was not a systematically global design-first-then-sample project, so it rather co-opted the HGDP name).  Nonetheless, and to the good, the DNA, along with analytic results from those samples, is freely available to qualified researchers.

Another HGDP organizer, Ken Kidd at Yale (along with his wife, Judy, and other collaborators), has produced an excellent, publicly accessible website called ALFRED, which provides allele frequency data from populations around the world, plus documentation of the sampled populations and a variety of other user-friendly features.  Among other things, this is a fine tool for teaching global human diversity.

Now, a new paper by Sarah Tishkoff and others (Sirugo et al., "The Missing Diversity in Human Genetic Studies", Cell 177, March 21, 2019) makes the case for sampling human genomic diversity, of a sort, pointing out various reasons why it would be good to address the current bias in genetics toward Europeans with global sampling of human variation.  Obviously, I agree with that, although many technical points could be raised about whether the inevitably smaller samples from scattered small populations could possibly be analyzed as effectively as the very large samples required to identify the risk variants that are being over-peddled to us via the various 'omics and Big Data advocates.

What are the 'populations' and what does 'diversity' properly include?
The value, potential and humane importance of properly sampling humans beyond the major large populations in Europe and North America is obvious, but the new paper makes the case mainly for the larger 'mainline' populations other than Europeans.  Unfortunately, though they may be numerous in the census sense, these populations are heterogeneous, and it is unclear whom, exactly, and how, current data represent.  Can we just blithely say we need to include 'Africans' to address the representativeness problem?  Are, for example, African-Americans, not to mention 'Hispanic-Americans', all the same among possible samples?  And the same regarding Asians.  The current paper deals with these issues at least to some extent.  But then what about, say, New Zealand natives, or Cherokees, or other small populations--or which castes, and from which parts of India, must we collect data?  How exhaustively should we sample, and how can complex genomes effectively be parsed in this way (not to mention environments--a topic at least acknowledged by Sirugo et al.)?

Francis Collins' current 'All of Us' sloganeering is, to me, a culpable mis-representation to the public: a strategy to pry huge funds out of Congress in open-ended ways, too big to terminate, a welfare project for university research and its various supporting industries and interests.  The implicit, though unjustified, idea seems to be that any sort of open-ended Big Data 'omical project can be fair to small sub-groups (indeed, I would argue from various aspects of what we know already, it can't be for the major ethnic groups either).  So what does the promise that this is for 'All of us' actually mean, beyond a transparent strategy to pry open-ended funding from Congress?

Problems with the promise in the first place
Now while I agree that increasing sampling of human diversity is important for many reasons, not least being fairness, the paper promises that it will increase or improve 'precision' medicine.  To me, that is sloganeering, and avoids facing up to what Big Data 'omics have already shown us about causal complexity of the important non-Mendelian traits--complexity not only in the genomic but also environmental senses.

There are several obvious, but obviously conveniently ignored, reasons for this.  First, 'genetic' causation involves more than inherited genomic variation.  Important variation arises during life, when cells divide.  This somatic variation is genetic, but not sampled in the usual genome-sequencing way.  Yet somatic variation clearly has important consequences, because a cell doesn't 'know' whether its genome sequences were inherited from the individual's parents or arose during the individual's life.

Secondly, the whole enterprise assumes that induction can lead to deduction, that is, that what we've observed in the past leads us to predict the future.  It is not just inherited and somatic mutations whose future is literally unpredictable, but the same is true for lifestyle exposures.  Yet lifestyle exposures are vital components of complex disease risks.  They cannot be predicted, even in principle.  That means past exposures do not predict future ones (to environments or mutations).  This is not a dark secret, no matter how inconvenient for the 'omics prediction industries.  Unlike many areas in chemistry and physics, induction does not lead to deduction in life.

What we need is deep re-thinking of the problem of genomic effects on disease and other traits.  But that is not easy to arrange when careers and institutions depend on very large, very predictable, basically permanent funding for the persons involved.  To improve these aspects of our science, we need a different way to support it--new economics, not bigger data or more sequencing.  And a side benefit of such reform, were it ever possible, would be to free up investigators' minds from surviving to surmising--to new ideas.

Our "I'm first!!" era in science
I do have to note that the tendency to ignore, or be ignorant of, prior work is manifest in this paper, which does not mention the HGDP.  We are in an "I'm first!" era in science.  I think Shakespeare understood the clearer truth: 'What's past is prologue'.

Good ideas need to be followed up, and properly sampling the world is one such good idea.  But this paper doesn't really deal with the small, traditional aboriginal populations.  In the case of the HGDP effort, there was simply a lack of support for sampling small, relatively isolated populations to build a picture of human genomic diversity out of the context from which it actually arose.  But it was an effort that explicitly recognized the issues, as they stood at that time.  So it is not excusable that the new paper fails to acknowledge the precedent advocating worldwide population sampling.  The senior author was very familiar with that effort.

A good idea, one that should not seem novel, would be for scientists to read, and cite, their predecessors who recognized an issue or problem earlier and who inevitably, even if indirectly, lead to stimulating subsequent work.  But crediting others doesn't help one's career score-counting, and it takes at least a tad of effort to find out what an idea's ancestors may have thought, not to mention to credit them.  In this case, the senior author had every reason indeed to know directly about this history.  Indeed, she did her doctoral and post-doctoral work in places deeply involved in the HGDP!

Anyway, this griping aside, it is at least worth discussing in a serious way whether and how a global sampling of worldwide populations, beyond the main 'racial' groups, would be a good thing to do.  I think it would.  We are, after all, throwing away countless millions (or is it billions?) on proudly hypothesis-free Big Data 'omical enumerations, projects too big to stop (no matter how largely pointless they have by now become).  We now know the basic landscape, and it is not nearly as encouraging as its self-interested press regularly blares.  Its valuable results should stimulate hard, new thinking, but as long as business as usual pays and absorbs careers, who knows when that will happen?

Even if reform is difficult because of vested interests that we've allowed to develop, it is proper to acknowledge one's intellectual ancestors.

Tuesday, March 5, 2019

Tales for children (and lessons for scientists, of all ages)

How the Gene got its Family
Reported by Ken Weiss, Penn State University

NOTE:  The following “Just So” story was found in the posthumous papers of the late Rudyard Kipling, apparently intended to explain to his young readers how genomes got their repetitive structure and why that protects us.

Now, O Best Beloved, I’ll tell how Snake Gene came all spotted, safe from Mongoose Mutant’s fangs, like Leopard in the dappled shadows of the forest floor!  Once, ever so long, long ago, Gene lay alone in the deep dark dense nuclear forest. Fearing Mutant, Gene longed for a family to keep him safe in the wild woods. He looked at himself, so long, long, and lithe, and had an idea!  “What I need to do is duplicate!”

Bending and twisting, snaky Gene coiled so snuggly that when he uncoiled he saw he had made another of his kind!  And this he did again, and again, ‘til he exclaimed “O My! We’re a family--the Genomes!”  The new family nestled warmly together, curling and coiling, curling in the deep dark forest! 

And they took heart:  When Mongoose next came hunting, hungry, Beloved, he saw a wriggling ‘scape of dazzling spots, each a Gene, as elusive as the morning mist.  Mutant kept snapping, snapping, but his prey seemed always here and there: if he bit one, others took its place, and yet others.  ‘Aaah!’, cried Mutant, ‘I hunger for my prey, but my bite can’t bring it down.’

And Lo!, seeing this from his perch on a nearby tree, sage Owl passed the word of Genome’s victory all forest-wide, and each who heard it followed suit.  They duplicate and duplicate and protect themselves from O so per’lous Mutant’s fangs that seek their end!  One day, even People heard the news, and learned how Mutant met his match.

The Law of Life’s dense, deep-dark, dank dang’rous jungle is: Safety rests in duplication’s many paths to the same end.  We call that Ree-dundancy!

But then, you may wonder, "If they are so protected, why does any beast of the forest ever take ill?”  Ah, Beloved, it is good that you ask!  Each time Mutant snaps, he can bite one or even more of the Genes.  Such a small meal from so large a family, so that usually nothing bad happens.  But sometimes, after many bites hurt ever so deeply, they may even kill!  Yes, a law of the tricky dark jungle is that each time, different Genes are bitten. There isn’t just one way Mutant gains his meals!  The Genomes are a big family, and most bites don’t hurt much.  But, when Mutant is lucky, sometimes, so sadly, he bites enough to bring the victim down.  The heavy weight of guilt can't fall on one poor Gene and say he is the cause.  It is a failure of the family.  That is a law of the Jungle.