
Tuesday, August 13, 2019

Big Data, Big 'omics-everything....and the sorry state of biomedical research support

People on the federal grant take often speak of 'Big Data' with a nearly lascivious joy.  Big Data is code for biomedical research projects that once begun are too large and costly to terminate, whether or not they did, or do, deliver seriously important truths--truths worthy of their cost.

But when careers depend on it, Big Data and the associated Everything-'omics are as much a fad and ploy for job security for medical school researchers as they have anything to do with science.  Huge open-ended projects need claim nothing that can be rigorously tested, and such fishing expeditions are likely to find at least something that can be blared to the public as major discoveries.  We see it every day, it seems.

One can criticize the Big Hunger for Big Data projects, but they are survival tactics for biomedical researchers, many of whom get their salaries only from external funds, and for those without actual ideas, or the means, to find something profoundly new.  Those 'means' should include freedom from regular bottom-line accountability: hard problems can't be solved on some fixed, deliverables-based schedule.  That is because the biological world is complex!

There is, as a rule, not one gene, or even a consistent few genes, that 'cause' the traits we care to understand.  Environment--a complex and vague term--interacts with organisms to yield their traits.  This is how, via evolutionary processes, we got here in the first place.  Interactions, redundancies, and other complexities have been built into our biology for countless millions of years.  Indeed, complexity protects against vulnerability, and organisms have obviously evolved it for that among other reasons (including that redundancy and complexity make room for new innovations to evolve without lethally threatening current systems).  So we should be surprised when we don't find complex redundancy--indeed we have clearly been documenting its universality.  And we should stop promising that we'll find the 'genes for' complex traits, like heart disease, obesity, and so on.

But complexity is not good for the research system
The current research funding system started roughly after WWII, when funds were plentiful and the army of investigators and their staffs and administrators was far smaller.  This is an historical, sociological fact rather than one about the causative nature of the world that we want to understand.

Research used to be much more about solving scientific questions by forming and testing hypotheses, by experiment, and so on, than it was about salaries, overhead, and the status of having a Big research group.  But many biological problems are complex--really complex, not just 'complex' as a self-promoting adjective--and addressing them should involve patience as well as funding that is stable and doesn't rest on rushing stories to the news media or on other means of hyping findings.  Careerism as we see it today is not compatible with this, and the current gross waste on the fad for Big Fishing Expeditions shows it.

Science today has become pretty much a self-sustaining System, from bureaucrats at all levels to investigators who must sing (i.e., claim Big Results) for their supper.  It sows dishonor and misrepresentation (as well as the occasional desperate scientific fraud).  We all know this but somehow seem powerless to redress things and restore research to the realm of science, where the science is the substance, not the window-dressing for fiscal needs, as it so often is today.  When the system is hyper-competitive, who can afford to confront it?

Things can be changed--even if it won't be easy
Reform is never easy, as established conditions are very inertial.  Vested interests, like grant-dependent salaries, must be faded out or even, sometimes, uprooted.  But over recent decades we've clearly allowed those conditions and empires to be built, indeed we often were part of their being built.  But it is time for deep, difficult reform.  We need to fund Big Ideas, Big Efforts at Hard Problems, not just facile sloganized Big Data or 'omics-everything.

But is there the will--and the bravery--for reform?  Where is it, or where can it be nurtured?  The unrest must be stirred to action.

Monday, February 11, 2019

Mea culpa, mice!

Animal research is purportedly protected by university and broader review and approval criteria.  The relevant IACUC (Institutional Animal Care and Use Committee) approval committees meet regularly, go over research and/or grant applications their faculty wish to submit for funding to the feds, and so on, and approve them for humane handling and other criteria; approval is required before the work can go on. 

Approval?  Well, what I have seen over many years is not that 'anything goes', but lots of bending to allow scientists to act like Frankensteins so long as the victims--and that's what they are--can't protest.  We define what is 'humane' for them.  That makes it 'ethical' research.  And, no surprise, it does not require that whatever is approved be something we would voluntarily undergo ourselves (for knowledge's sake!).  Well, maybe the Nazis did that....

I am writing this to express my own very deep regrets for my many years of research on mice.  Mice were almost my only victims (no guinea pigs, etc., baboons once or twice briefly), but they were many.  And they did not have to sign any informed consent!  We did to them what our purportedly protecting IACUC approved, and deemed 'humane' and 'justified' for the new knowledge we would gain.  Note the 'we' here.  This was entirely selfish.

A lab mouse (source: cdn9wn.com)

And, dare one ask: what fraction of the knowledge gained by animal research is really of any substantial value to humans (none, of course, to the animals' own species)?  By what 'right' do we enslave and torment (if not, often, horrify and torture) innocent animals, to build our careers?  Because it is score-counting for our own selfish interest that is part of the story, even if, of course, that story also includes the desire for genuine new knowledge, which we hope will lead to improvements of some kind (for us).

Well, they're 'just animals', we rationalize (I think, a rationale conveniently provided by Descartes, who judged all but us soul-bearers to be no more than automata).  IACUC restrictions mean that we don't torture them (well, we pretend so, at least, using our, not their, definition; and we certainly do do things to them we'd be jailed, or even executed, for doing to humans).  Animal research is, after all, for our own good (note, again, the 'our' in this excuse).

What is the reality of the IACUC protection system?  It may avert the worst tortures, but it rationalizes much that is so gruesome that if the public knew of it......  For example, we can 'humanely' make transgenic mice who suffer--even from conception--some serious physical, health, or behavioral defect, by endowing them with, say, some known serious gene defect so that, hopefully, 'we' can figure out how to fix it (in humans, not in the victims from whom we learned the tactic).  I have been present at a meeting purportedly about research ethics, where a prominent university official bemoaned the care (already minimal, really, from the victims' viewpoint) that his IACUC committee exercised.  Why?  Because, he said, their Committee sometimes turned down faculty proposals that might, if funded, have brought in significant overhead funds to the University.  I mean, really!

A clear statement of the problem, for those humane enough to listen
There is a poignant, indeed deeply disquieting article in The Atlantic ("Scientists Are Totally Rethinking Animal Cognition", Ross Andersen, March, 2019).  Animals of all sorts, including even invertebrates, have self-awareness of one kind or another, essentially a sense of 'me'.  They presumably generally don't know about the inevitable finiteness of life, including their lives, so are saved from at least some of the abstract fears and fearful knowledge we humans may uniquely understand.  But they are not just things, cellular machines, and they do have fears, experience pain, and so on.  And yet.....

And yet, we noble university professors, not to mention those working in industry and agriculture, do to animals what we would not do to ourselves or each other (well, about what we do to each other one can say much that is just as sad as my reflections on animal research here).

So, to the mice (and those insects I've knowingly destroyed, and countless animals that I've eaten), I offer my mea culpa!  Sometimes, one must kill.  Life is an evolutionary phenomenon most of whose actors must, because of their evolutionary history, dine on objects of similar makeup.  Meat is one form, but do we too easily dismiss plants?  There are recent reports showing that plants are far more social, sentient, and aware than is convenient for us to think about with ease, as we munch away on our daily salad or fruit and veggies.

Darwin saw in his way what is to blame.  Life evolved as a self-renewing chemical phenomenon, and species evolved to dine on each other because, in a sense, we're all made of the same stuff.  It is a cruel truth of living existence, and that is one reason Darwin's work was controversial and is still resisted by those who wish for comforting theological accounts of reality and a joyful forever-after.  But we know--even they know--some of the harsh realities of life here and now.

We researchers always have some sort of justifying rationale for what we do to animals.  We have the approval of a committee, after all!  We're doing it for the good of humanity, to understand life, or for some other self-advancing careerist reason, including bringing in money to the university.  But, the bottom line is: we do, in fact, do it.

Fare thee well...
So, to you countless mice, whose lives I terminated so I could get ahead, even if I did so as 'humanely' as possible while selfishly using you as I did, here's my apology.  I can't take back what I did to you, all within our acceptable research standards (note, mice, I said 'our', not 'your', acceptable standards).  Even where I could find, or construct, some rationale for my work, such as health-related discovery, basic knowledge, and so on, the payoff for us is small, and the cost to you, mice, was total and involuntary.  And one can debate how valuable the knowledge really was in the grand scheme of things.  How many of you must suffer before even one serious benefit, even one just for humankind, is gained?  I, and my lab group over the years, didn't really 'need' to do it, except mainly for our careers.  Was it enough that I did, after all, give you life, some sort of short life at least?

I so wish there were some way I could make it up to you.

Wednesday, November 28, 2018

Induction-deduction, and replicability: is there any difference?

In what sense--what scientific sense--does the future resemble the past?  Or perhaps, to what extent does it?  Can we know?  If we can't, then what credence for future prediction can we give to results of studies today, which necessarily come from the past experience of current samples?  Similarly, in what sense can we extrapolate findings on this sample to some other sample or population?  If these questions are not easily answerable (indeed, if they are answerable at all!), then much of current science--very widespread and very expensive science--is at best of unclear, questionable value.

We can look at these issues in terms of a couple of standard aspects of science: the relationship between induction and deduction, and the idea of replicability.  Induction and deduction, in their modern scientific sense, basically come from the Enlightenment era in western history, when it was found, in a formal sense, that the world of western science--which at that time meant physical science--followed universal 'laws' of Nature.  At that time, life itself was generally excluded from this view, not least because it was believed to be the result of ad hoc creation events by God.

The induction--deduction problem
-----------------
Some terminology:  I will make an important distinction between two terms.  By induction I mean drawing a conclusion from specific observed data (e.g., estimating some presumed causal parameter's value).  Essentially, this means inferring a conclusion from the past, from events that have already occurred. But often what we want to do is to predict the future.  We do that, often implicitly, by equating observed past values as estimates of causal parameters, that apply generally and therefore to the future; I refer to that predictive process, derived from observed data, as deduction.  So, for example, if I flip a coin 10 times and get 5 Heads, I assume that this is somehow built into the very nature of coin-flipping so that the probability of Heads on any future flip is 0.5 (50%).
-----------------
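
As a concrete (if trivial) illustration of that terminology, here is a minimal toy sketch in Python--my own invented example, not anything from the literature: induction estimates p from past flips; deduction treats that estimate as a fixed parameter of future flips.  Note how different 'past' samples of the same size yield different 'parameters'.

-----------------
# Toy sketch (Python), purely illustrative: induction vs. deduction with coins.
import random

random.seed(1)

def induce_p(flips):
    """Induction: estimate the probability of Heads from observed (past) flips."""
    return sum(flips) / len(flips)

def deduce_future_heads(p_hat, n_future):
    """Deduction: treat the estimate as a fixed parameter and predict the future."""
    return p_hat * n_future  # expected number of Heads in n_future flips

past = [random.randint(0, 1) for _ in range(10)]   # ten past flips, 1 = Heads
p_hat = induce_p(past)
print(f"past flips: {past}  ->  estimated p = {p_hat:.2f}")
print(f"predicted Heads in the next 100 flips: {deduce_future_heads(p_hat, 100):.0f}")

# The catch: other past samples of the same size give different 'parameters'.
for _ in range(3):
    another = [random.randint(0, 1) for _ in range(10)]
    print(f"another past sample -> estimated p = {induce_p(another):.2f}")
-----------------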

If we can assume that induction implies deduction, then what we observe in our present or past observations will persist, so that we can predict it in the future.  In a law-like universe, if we are sampling properly, this will occur, and we generally assume it would hold with complete precision if we had perfect measurement (here I speculate, but I think quantum phenomena, at the appropriate scale, have the same universally parametric properties).

Promises like 'precision genomic medicine', which I think amount to culpably public deceptions, effectively equate induction with deduction: we observe some genomic elements associated in some statistical way with some outcome, and assume that the same genome scores will similarly predict the future of people decades from now.  There is no serious justification for this assumption at present, nor any quantification of how large the errors in assuming the predictive power of past observations might be, in part because mutations and lifestyle clearly have major effects, but especially because these are unpredictable--even in principle.  Indeed, there is another, much deeper problem of a similar kind, that has gotten recent--but, to me, often quite naive--attention: replicability.

The replicability problem
Studies, perhaps especially in social and behavioral fields, report findings that others cannot replicate.  This is being interpreted as suggesting that (ignoring the rare outright fraud) there is some problem with our decision-making criteria, other forms of bias, or poor study designs.  Otherwise, shouldn't studies of the same question agree?  There has been a call for the investigators involved to improve their statistical analysis (i.e., keep buying the same software!! but use it better), report negative results, and so on.

But this is potentially, and I think fundamentally, naive.  It assumes that such study results should be replicable.  It assumes, as I would put it, that at the level of interest, life = physics.  This is, I believe, not just wrong but fundamentally so.

The assumption of replicability is not really different from equating induction to deduction, except in some subtle way applied to a more diverse set of conditions.  Induction of genomic-based disease risk is done on a population like, say, case-control samples, and then applied to the same population in terms of its current members' future disease risks.  But we know very well that different genotypes are found in different populations, so it is not clear what degree of predictability we should, or can, assume.

Replicability is similar except that in general a result is assumed to apply across populations or samples, not just to the same sample's future.  That is, I think, an even broader assumption than the genomics-precision promise that does, at least nominally, now recognize population differences.

The real, the deeper problem is that we have absolutely no reason to expect any particular degree of replicability between samples for these kinds of things.  Evolution is about variation, locally responsive and temporary, and that applies to social behavior as well.  We know that 'distance' or difference accumulates (generally) gradually over time and separation as a property of cultural as well as biological evolution.  The same obviously applies even more to psychological and sociological samples and inferences from them.

It is simply naive to think that samples of, say, this year's college seniors at X University will respond to questionnaires in the same way as samples from some other class, or university, or beyond.  Of course, college students come cheap to researchers, and they're convenient.  But they are not 'representative' in the replicability sense except by some sort of rather profound assumption.  This is obvious, yet it is a tacit assumption of very much research (biological, psychological, and sociological).

Even social scientists acknowledge the local and temporary nature of many of the things they investigate, because the latter are affected by cultural and historical patterns, fads, fashions, and so much more.  Indeed, the idea of replicability is to me curious to begin with.  Thus, a study that fails to replicate some other study may not reflect failings in either, and the idea that we should replicate in this kind of way is a carryover of physics envy.  Perhaps in many situations, a replicated result is what should be examined most closely!  The social and even biological realms are simply not as 'Newtonian', or law-like, as the physical realm in which our notions of science--especially the very idea of law-like replicability--arose.  Not only is failure to replicate not necessarily suspect at all, but replicability should not generally be assumed.  Or, put another way, a claim that replicability is to be expected is a strong claim about Nature that requires very strong evidence!

This raises the very deep problem that in the absence of replicability assumptions, we don't know what to expect of the next study, after we've done the first.....or is this a justification for just keeping the same studies going (and funded) indefinitely?  That's of course the very rewarding game being played in genomics.

Friday, October 19, 2018

Nyah, nyah! My study's bigger than your study!!

It looks like a food-fight at the Precision Corral!  Maybe the Big Data era is over!  That's because what we really seem to need (of course) is even bigger GWAS or other sorts of enumerative (or 'EnumerOmics') studies, because then (and only then) will we really realize how complex traits are caused, so that we can produce 'precision' genomic medicine to cure all that ails us.  After all, there is no such thing as enough 'data' or a big (and open-ended) enough study.  Of course, because so much knowledge....er, money....is at stake, such a food-fight is not just children in a sandbox, but purported adults, scientists even, wanting more money from you, the taxpayer (what else?).  The contest will never end on its own.  It will have to be ended from the outside, in one way or another, because it is predatory: it takes resources away from what might be focused, limited, but actually successful problem-solving research.

The idea that we need larger and larger GWAS studies, not to mention almost any other kind of 'omics enumerative study, reflects the deeper idea that we have no idea what to do with what we've got.  The easiest word to say is "more", because that keeps the fiscal flood gates open.  Just as preachers keep the plate full by promising redemption in the future--a future that, like an oasis to desert trekkers, can be a mirage never reached--scientists are modern preachers who've learned the tricks of the trade.  And, of course, since each group wants its flood gates to stay wide open, it must resist any even faint suggestion that somebody else's gates might open wider.

There is a kind of desperate defense, as well as food fight, over the situation.  This, at least, is one way to view a recent exchange in Cell.  Boyle et al. (Cell 169(7):1177-86, 2017**) assert that a few key genes, perhaps with rare alleles, are the 'core' genes responsible for complex diseases, while lesser, often indirect or incidental, genes scattered across the genome provide other pathways that affect the trait and are what GWAS largely detect.  If a focus on this model were to take hold, it might threaten the gravy train of more traditional, more mindless, Big Data chasing.  As a plea to avoid that, Wray et al.'s falsely polite spitball in return (Cell 173:1573-80, 2018**) urges that things really are spread all over the genome, differently so in everyone; thus, of course, the really true answer is some statistical prediction method, after we have more and even larger studies.

Could it be, possibly, that this is at its root merely a defense of large statistical data bases and Big Data per se, expressed as if it were a legitimate debate about biological causation?  Could it be that for vested interests, if you have a well-funded hammer everything can be presented as if it were a nail (or, rather, a bucket's worth of nails, scattered all over the place)?

Am I being snide here? 
Yes, of course. I'm not the Ultimate Authority to adjudicate about who's right, or what metric to use, or how many genome sites, in which individuals, can dance on the head of the same 'omics trait.  But I'm not just being snide.  One reason is that both the Boyle and Wray papers are right, as I'll explain.

The arguments seem in essence to assert that complex traits are due either to many genetic variants strewn across the genome, or to a few rare larger-effect alleles here and there, complemented by nearby variants that may involve indirect pathways to the 'main' genes, with these scattered across the genome ('omnigenic').  Or is the claim that we can tinker with GWAS results, and various technical measurements derived from them, to get at the real truth?

We are chasing our tails these days in an endless-seeming circle to see who can do the biggest and most detailed enumerative study, to find the most and tiniest of effects, with the most open-ended largesse, while Rome burns.  Rome, here, stands for the victims of the many diseases which might be studied, with actual positive therapeutic results, by more focused, if smaller, studies.  Or, in many cases, by a real effort at revealing and ameliorating the lifestyle exposures that typically, one might say overwhelmingly, are responsible for common diseases.

If, sadly, it were to turn out that there is no more integrative way, other than add-'em-up, by which genetic variants cause or predispose to disease, then at least we should know that and spend our research resources elsewhere, where they might do good for someone other than universities.  I actually happen to think that life is more integratively orderly than simply enumeratively additive in its effects, and that more thoughtful approaches, indeed approaches reflecting the findings of decades of GWAS data, might lead to better understanding of complex traits.  But this seemingly can't be achieved just by sampling extensively enough to estimate 'interactions'.  The interactions may, and I think probably do, have higher-level structure that can be addressed in other ways.

But if not--if these traits are as they seem, and there is no such simplifying understanding to be had--then let's come clean with the public and invest our resources in other ways to improve our lives, before these additive trivia add up to our end, when those supporting the work tire of the exaggerated promises.

Our scientific system, which we collectively let grow like mushrooms because it was good for our self-interests, puts us in a situation where we must sing for our supper (often literally, if investigators' salaries depend on grants).  No one can be surprised at the cacophony of top-of-the-voice arias ("Me-me-meeeee!").  Human systems can't be perfect, but they can be perfected.  At some point, perhaps we'll start doing that.  If it happens, it will only partly reflect the particular scientific questions at issue, because it's mainly about the underlying system itself.


**NOTE: We provide links to sources, but, yep, they are paywalled--unless you just want to see the abstract or have access to an academic library.  If you have the loony idea that as a taxpayer you have already paid for this research, so private selling of its results should be illegal--sorry!--that's not our society.

Thursday, October 18, 2018

When is a consistent account in science good enough?

We often want our accounts in science to be consistent with the facts.  Even if we can't explain all the current facts, we can always hope to say, truthfully, that our knowledge is imperfect but our current theory is at least largely true....or something close to that....until some new 'paradigm' replaces it.

It is also only natural to sneer at our forebears' primitive ideas, of which we, naturally, now know much better.  Flat earth?  Garden of Eden?  Phlebotomy?  Phlogiston?  Four humors?  Prester John, the mysterious Eastern Emperor who will come to our rescue?  I mean, really!  Who could ever have believed such nonsense?

Prester John to the rescue (from the British Library--see Wikipedia entry)
In fact, leaders among our forebears accepted these and much else like them, took them as real, and sought in them solace from life's cares, not just as promises (as from religious figures) but as earthly answers.  Or, to seem impressively knowledgeable, found arcane ways to say "I dunno" without admitting it.  And, similarly, many used ad hoc 'explanations' for personal gain--as self-proclaimed gurus, promisers of relief from life's sorrows or medical woes (usually, if you cross their palms with silver first).

Even in my lifetime in science, I've seen forced after-the-fact 'explanations' of facts, and the way a genuine new insight can show how wrong those explanations were, because the new insight accounts for them more naturally or in terms of some other new facts, forces, or ideas.  Continental drift was one that had just come along in my graduate school days.  Evolution, relativity, and quantum mechanics are archetypes of really new ideas that transformed how our forebears had explained what is now our field of endeavor.

Such lore, and our broader lionizing of leading political, artistic, or other similarly transformative figures, organizes how we think.  In many ways it gives us a mythology, or ethnology, that leads us to order success into a hierarchy of brilliant insights.  This, in turn, and in our careerist society, provides an image to yearn for, a paradigm to justify our jobs, indeed our lives--to make them meaningful, make them important in some cosmic sense, and really worth living.

Indeed, even ordinary figures from our parents, to the police, generals, teachers, and politicians have various levels of aura as idols or savior figures, who provide comforting answers to life's discomfiting questions.  It is natural for those burdened by worrisome questions to seek soothing answers.

But of course, all is temporary (unless you believe in eternal heavenly bliss).  Even if we truly believe we've made transformative discoveries or something like that during our lives, we know all is eventually dust.  In the bluntest possible sense, we know that the Earth will someday be destroyed and all our atoms scattered to form other cosmic structures.

But we live here and now and perhaps because we know all is temporary, many want to get theirs now, and we all must get at least some now--a salary to put food on the table at the very least.  And in an imperfect and sometimes frightening world, we want the comfort of experts who promise relief from life's material ills as much as preachers promise ultimate relief.  This is the mystique often given to, or taken by, medical professionals and other authority figures.  This is what 'precision genomic medicine' was designed, consciously or possibly just otherwise, to serve.

And we are in the age of science, the one True field (we seem to claim) that delivers only objectively true goods; but are we really very different from those in similar positions of other sorts of lore?  Is 'omics any different from other omnibus beliefs-du-jour?  Or do today's various 'omical incantations and promises of perfection (called 'precision') reveal that we are, after all, even in the age of science, only human and not much different from our typically patronized benighted forebears?

Suppose we acknowledge that the latter is, at least to a considerable extent, part of our truth.  Is there a way that we can better use, or better allocate, resources to make them more objectively dedicated to solving the actually soluble problems of life--for the public everyday good--and perhaps less used, as they have been from past to present, to gild the thrones of those making the promises of eternal bliss?

Or does sociology, of science or any other aspect of human life, tell us that this is, simply, the way things are?

Wednesday, October 17, 2018

The maelstrom of science publishing: once you've read it, when should you shred it?

There is so much being published in the science literature--a veritable tsunami of results.  New journals are being started almost monthly, it seems, and mainly or only by for-profit companies.  There seems to be a Malthusian growth of the number of scientists, which has certainly produced a genuine explosion of research and knowledge, but the intense pressure on scientists to publish has perhaps changed the relative value of every paper.

And as I look at the ancient papers (that is, ones from 2016-17) that I've saved in my Must-Read folder, I see all sorts of things that, if they had actually been widely read, much less heeded, would mean that many papers being published today might not seem so original.  At least, new work might better reflect what we already know--or should know if we cared about or read that ancient literature.

At least I think that, satire aside, in the rush to publish what's truly new, as well as for professional score-counting and so on, and with the proliferating plethora of journals, the past is no longer prologue (sorry, Shakespeare!) as it once was and, one can argue, should still be.  The past is just the past; it doesn't seem to pay to recognize, much less to heed it, except for strategic citation-in-passing reasons and because bibliography software can be used to winnow out citable papers so that reviewers of papers or grant applications won't be negative because their work wasn't cited.  You can judge for yourself whether this is being realistic or too cynical (perhaps both)!

The flux of science publishing is enormous for many reasons.  Not least is the expansion in the number of scientists.  But this is exacerbated by careerist score-counting criteria that have been growing like the proverbial Topsy in recent decades: the drive to get grants, big and bigger, long and longer.  Often in biomedical sciences, at least, grants must include investigator salaries, so there is massive self-interest in enumerable 'productivity'.  The journals proliferate to fill this market, and of course to fill the coffers of the publishers' self-interest.  Too cynical?

Over the years, in part to deflate Old Boy networks, 'objective' criteria have come to include, besides grants garnered, a faculty member's number of papers, the ranking of the journals they're in, citation counts, and other 'impact factor' measures.  This grew in some ways also to feed the growing marketeering by vendors, even those who provide score-counting tools, and by university bureaucracies.  More generally, it reflects the way middle-class life, the life most of us now lead, has become--attempts to earn status, praise, wealth, and so on by something measurable and therefore ostensibly objective.  Too cynical?

Indeed, it is now common for graduate students--or even undergrads--to attend careerism seminars.  Instruction in how to get published, how to get funded, how to work the System.  This may be good in a sense, or at least realistic, even if it was not so when, long ago, I was a graduate student.  It does, however, put strategizing rather than science up front, as a first-year learning priority.  One wonders how much time is lost that, in those bad old days, was spent thinking and learning about the science itself.  We were, for example, to spend our 2-year Master's program learning our field, only then to get into a lab and do original work, which was what a PhD was about.  It is fair to ask whether this is just a change in our means of being and doing, without effect on the science itself, or whether careerism is displacing or even replacing really creative science.  When is objection to change nothing more than nostalgic cynicism?

Is science more seriously 'productive' than it used to be?
Science journals have always been characterized largely by the minutiae they publish, because (besides old boy-ism) real, meaty, important results are hard to come by.  Most observation in the past, and experiment these days, yields little more than curios.  You can see this by browsing decades-old volumes even of the major science journals.  The reports may be factually correct, but of minimal import.  Even though science has become a big industry rather than the idle rich's curiosity, most science publishing now, as in the past, might more or less still be vanity publishing.  Yet, as science has become more of a profession, there are important advances, so it is not clear whether science is more splash-than-substance now than it was in the past.

So, even if science has become an institutionalized, established, middle-class industry, and most of us will go down and out, basically unknown in the history of our fields, that has probably always been the case.  Any other view probably is mainly retrospective selective bias: we read biographies of our forebears, making them seem few and far between, and all substantial heroes; but what we are reading is about those forebears who really did make a difference.  The odd beetle collector is lost to history (except maybe to historians, who themselves may be making their livings on arcane minutiae).  So if that's just reality, there is no need to sneer cynically at it.

More time and energy are taken up playing today's game than was the case, or was necessary, in the past--at least I think that is pretty clear, if impossible to prove.  Even in the chaff-cloud, lasting knowledge does seem to accumulate faster per year than it used to.  That seems real, but it reveals another reality.  We can only deal with so much.  With countless papers published weekly, indeed many of them reviews (so we don't have to bother reading the primary papers), overload is quick and can be overwhelming.

That may be cynical, but it's also a reality.  My Must-Read folder on my computer is simply over-stuffed, with perhaps a hundred or more papers that I 'Saved' every year.  When I went to try to clean my directory this morning, I was overwhelmed: what papers before, say, 2015 are still trustworthy, as reports or even as reviews of then-recent work?  Can one even take reviews seriously, or cite them or past primary papers, without revealing one's out-of-dateness?  New work can obviously render prior reviews obsolete.  Yet reviews make the flood of prior work at least partially manageable.  But would it be safer just to Google the subject if it might affect one's work today?  It is, at least, not just cynicism to ask.

Maybe, to be safe, given this situation, there are two solutions:
1.  Just Google the subject and get the most recent papers and reviews; 
2.  There should be software that detects and automatically shreds papers in a Science Download directory that haven't had any measurable impact in, say, 5 or (to be generous) 10 years.  We already have sites like Reddit, whose contents may not have a doomsday eraser.  But in science, to have mercy on our minds and our hard discs, what we need is Shred-it!  (A toy sketch of such a script follows below.)
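
Since the second 'solution' is at least half a how-to, here is a toy sketch of what Shred-it! might look like--purely illustrative: the folder name and the cutoff are invented, and since no script can actually measure a paper's scientific impact, this one falls back on the file's last-modified time as a crude stand-in.

-----------------
# Toy 'Shred-it!' sketch (Python).  Assumptions: papers live as PDFs in a
# hypothetical Science_Downloads folder; 'impact' is approximated (badly) by
# how recently the file was touched.  Dry-run by default, out of mercy.
import time
from pathlib import Path

DOWNLOAD_DIR = Path.home() / "Science_Downloads"   # hypothetical Must-Read folder
MAX_AGE_YEARS = 5                                  # or 10, to be generous

def shred_stale_papers(directory, max_age_years, dry_run=True):
    """List (or delete, if dry_run=False) PDFs untouched for max_age_years."""
    cutoff = time.time() - max_age_years * 365.25 * 24 * 3600
    for pdf in directory.glob("*.pdf"):
        if pdf.stat().st_mtime < cutoff:
            print(("would shred: " if dry_run else "shredding: ") + pdf.name)
            if not dry_run:
                pdf.unlink()

if __name__ == "__main__":
    if DOWNLOAD_DIR.exists():
        shred_stale_papers(DOWNLOAD_DIR, MAX_AGE_YEARS, dry_run=True)
-----------------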

Tuesday, October 16, 2018

Where has all the thinking gone....long time passing?

Where did we get the idea that our entire nature--not just our embryological development, but everything else--was pre-programmed by our genome?  After all, the very essence of Homo sapiens, compared to all other species, is that we use culture--language, tools, etc.--to do our business rather than just our physical biology.  In a serious sense, we evolved to be free of our bodies: our genes made us freer from our genes than most if not all other species!  And we evolved to live long enough to learn--language, technology, etc.--in order to live our thus-long lives.

Yet isn't an assumption of pre-programming the only assumption by which anyone could legitimately promise 'precision' genomic medicine?  Of course, Mendel's work, adopted by human geneticists over a century ago, allowed great progress in understanding how genes lead at least to the simpler of our traits, those with discrete (yes/no) manifestations--traits that do include many diseases that really, perhaps surprisingly, do behave in Mendelian fashion, and for which concepts like dominance and recessiveness have been applied and, sometimes, at least approximately hold up to closer scrutiny.

Even 100 years ago, agricultural and other geneticists who could do experiments, largely confirmed the extension of Mendel to continuously varying traits, like blood pressure or height.  They reasoned that many genes (whatever they were, which was unknown at the time) contributed individually small effects.  If each gene had two states in the usual Aa/AA/aa classroom example sense, but there were countless such genes, their joint action could approximate continuously varying traits whose measure was, say, the number of A alleles in an individual.  This view was also consistent with the observed correlation of trait measure with kinship-degree among relatives.  This history has been thoroughly documented.  But there are some bits, important bits, missing, especially when it comes to the fervor for Big Data 'omics analysis of human diseases and other traits.  In essence, we are still, a century later, conceptual prisoners of Mendel.
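
To see how that century-old argument works numerically, here is a minimal toy simulation--my own invented numbers, nothing empirical: many two-allele loci, each contributing a tiny additive effect, yield a trait measure (the count of A alleles) whose distribution looks continuous and roughly bell-shaped.

-----------------
# Toy simulation (Python) of the classical additive model sketched above.
# Locus number, allele frequency, and sample size are arbitrary choices.
import random
from collections import Counter

random.seed(42)

N_LOCI = 200        # many genes, each with alleles A and a
FREQ_A = 0.5        # frequency of A at every locus (a simplification)
N_PEOPLE = 10_000

def trait_value():
    """Trait measure = number of A alleles carried across 2 copies of each locus."""
    return sum(1 for _ in range(2 * N_LOCI) if random.random() < FREQ_A)

values = [trait_value() for _ in range(N_PEOPLE)]

# Crude text histogram: many small contributions give an effectively continuous,
# approximately normal distribution (the central limit theorem at work).
counts = Counter(v // 5 * 5 for v in values)   # bins of width 5
for bin_start in sorted(counts):
    print(f"{bin_start:4d}-{bin_start + 4:<4d} {'#' * (counts[bin_start] // 50)}")
-----------------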

'Omics over the top: key questions generally ignored
Let us take GWAS (genomewide association studies) at face value.  GWAS find countless 'hits', sites of whatever sort across the genome whose variation affects variation in WhateverTrait you choose to map (everything simply must be 'genomic' or some other 'omic, no?).  WhateverTrait varies because every subject in your study has a different combination of contributing alleles.  Somewhat resembling classical Mendelian recessiveness, contributing alleles are found in cases as well as controls (or across the measured range of quantitative traits like stature or blood pressure), where the measured trait reflects how many A's one has: WhateverTrait is essentially the sum of A's in 'cases', which may be interpreted as a risk--some sort of 'probability' rather than certainty--of having been affected or of having the measured trait value.

We usually treat risk as a 'probability,' a single value, p, that applies to everyone with the same genotype.  Here, of course, no two subjects have exactly the same genotype, so some sort of aggregate risk score, adding up each person's 'hits', is assigned a p.  This, however, tacitly assumes that each site contributes some fixed risk or 'probability' of affection.  But this treats these values as if they were essential to the site, each thus acting as a parameter of risk.  That is, sites are treated as carrying a kind of fixed value or, one might say, 'force', relative to the trait measure in question.
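
To make that tacit assumption concrete, here is a toy numerical sketch--invented numbers and a deliberately crude method, not anyone's published pipeline: per-site effect estimates are induced from one 'past' sample, then treated as fixed parameters and summed into a score for new genotypes.  How well that score carries forward depends entirely on what else (environment, sampling) stays the same.

-----------------
# Toy sketch (Python/numpy) of a fixed-per-site risk score.  All parameters
# are invented; the per-site estimates come from simple marginal regression.
import numpy as np

rng = np.random.default_rng(0)
n_sites, n_people = 500, 2000
true_beta = rng.normal(0, 0.1, n_sites)            # many tiny 'true' additive effects
freqs = rng.uniform(0.1, 0.9, n_sites)             # allele frequencies at each site

def study_sample(env_sd):
    """One sample: genotypes as allele counts (0/1/2); trait = genetics + environment."""
    geno = rng.binomial(2, freqs, size=(n_people, n_sites))
    trait = geno @ true_beta + rng.normal(0, env_sd, n_people)
    return geno, trait

# Induction: estimate one fixed effect per site from a 'past' sample.
g_past, t_past = study_sample(env_sd=2.0)
gc = g_past - g_past.mean(axis=0)
beta_hat = (gc * (t_past - t_past.mean())[:, None]).mean(axis=0) / gc.var(axis=0)

def risk_score(geno):
    """Deduction: treat the induced per-site estimates as fixed 'parameters of risk'."""
    return geno @ beta_hat

# Apply the same fixed scores to new samples; prediction erodes as soon as the
# (unmeasured, unpredictable) environmental contribution changes.
for label, env_sd in [("new sample, same environment", 2.0),
                      ("new sample, changed environment", 4.0)]:
    g_new, t_new = study_sample(env_sd)
    r = np.corrcoef(risk_score(g_new), t_new)[0, 1]
    print(f"{label}: score-trait correlation = {r:.2f}")
-----------------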

One obvious and serious issue is that these are necessarily estimated from past data, that is, by induction from samples.  Not only is there sampling variation that usually is only crudely estimated by some standard statistical variation-related measure, but we know that the picture will be at least somewhat different in any other sample we might have chosen, not to mention other populations; and those who are actually candid about what they are doing know very well that the same people living in a different place or time would have different risks for the same trait.

No study is perfect, so we use some conveniently assumed, well-behaved regression/correction adjustments to account for the statistical 'noise' due to factors like age, sex, and unmeasured environmental effects.  Much worse than these issues, there are clearly factors of imprecision, and the obvious major one--taboo even to think about, much less to mention--is that relevant future factors (mutations, environments, lifestyles) are unknowable, even in principle.  So what we really do, are forced to do, is extend what the past was like to the assumed future.  And besides this, we don't count somatic changes (mutations arising in body tissues during life, which were not inherited), because they'd mess up our assertions of 'precision', and we can't measure them well in any case (so just shut one's eyes and pretend the ghost isn't in the house!).

All of these together mean that we are estimating risks from imperfect existing samples and past life-experience, but treating them as underlying parameters so that we can extend them to future samples.  What that does is equate induction with deduction, assuming the past is rigorously parametric and will be the same in the future;  but this is simply scientifically and epistemologically wrong, no matter how inconvenient it is to acknowledge this.  Mutations, genotypes, and environments of the future are simply unpredictable, even in principle.

None of this is a secret, or a new discovery, in any way.  What it is, is inconvenient truth.  These things, by themselves and without even badgering investigators about environmental factors that (we know very well) typically predominate, should have been enough to show that the NIH's precision promises cannot be accurate ('precise'), or even accurate to a knowable degree.  Yet this 'precision' sloganeering is being, sheepishly, aped all over the country by all sorts of groups who don't think for themselves and/or who go along lest they get left off the funding gravy train.  This is the 'omics fad.  If you think I am being too cynical, just look at what's being said, done, published, and claimed.

These are, to me, deep flaws in the way the GWAS and other 'omics industries, very well-heeled, are operating these days, to pick the public's pocket (pharma may, slowly, be awakening-- Lancet editorial, "UK life science research: time to burst the biomedical bubble," Lancet 392:187, 2018).  But scientists need jobs and salaries, and if we put people in a position where they have to sing in this way for their supper, what else can you expect of them?

Unfortunately, there are much more serious problems with the science, and they have to do with the point-cause thinking on which all of this is based.

Even a point-cause must act through some process
By far most of the traits, disease or otherwise, that are being GWAS'ed and 'omicked these days, at substantial public expense, are treated as if the mapped 'causes' are point causes.  If there are n causes, and a person has an unlucky set m out of many possible sets, one adds 'em up and predicts that person will have the target trait.  And there is much that is ignored, assumed, or wishfully hidden in this 'will'.  It is not clear how many authors treat it, tacitly, as a probability vs a certainty, because no two people in a sample have the same genotype and all we know is that they are 'affected' or 'unaffected'.

The genomics industry promises, essentially, that from conception onward, your DNA sequence will predict your diseases, even if only in the form of some 'risk'; the latter is usually a probability and, despite the guise of 'precision', it can, of course, be adjusted as we learn more.  For example, it must be adjusted for age, and usually for other variables.  Thus, we need ever larger, more numerous, and longer-lasting samples.  This alone should steer people away from being preyed on for profit by DNA testing companies.  But that snipe aside, what does this risk or 'probability' actually mean?

Among other things, those candid enough to admit it know that environmental and lifestyle factors have a role, interacting with the genotype if not, usually, overwhelming it.  This means, for example, that the genotype only confers some, often modest, risk probability, with the actual risk much more affected by lifestyle factors, most of which are not measured, or not measured with accuracy, or not even yet identified.  And usually there is some aspect that relates to age, or some assumption about what 'lifetime' risk means.  Whose lifetime?

Aspects of such a 'probability'
There are interesting issues, longstanding issues, about these probabilities, even if we assume they have some kind of meaning.  Why do so many important diseases, like cancers, only arise at some advanced age?  How can a genomic 'risk' be so delayed, and so different among people?  Why do mice, whose genotypes are very similar to ours (which is why we do experiments on them to learn about human disease), live only to about 3 while we live to our 70s and beyond?

Richard Peto raised some of these questions many decades ago.  But they were never really addressed, even in an era when NIH et al were spending much money on 'aging' research, including studies of lifespan.  There were generic theories that suggested, from an evolutionary point of view, why some diseases were deferred to later ages (so-called 'antagonistic pleiotropy'), but nobody tried seriously to explain why that was from a molecular/genetic point of view.  Why do mice live only about 3 years, anyway?  And so on.

These are old questions and very deep ones but they have not been answered and, generally, are conveniently forgotten--because, one might argue, they are inconvenient.

If a GWAS score increases the risk of a disease that has a long-delayed onset pattern, often striking late in life, and highly variable among individuals or over time, what sort of 'cause' is that genotype?  What is it that takes decades for the genes to affect the person?  There are a number of plausible answers, but they get very little attention, at least in part because that stands in the way of the vested interests of entrenched, too-big-to-kill, Big Data faddish 'research' that demands instant promises to the public it is trephining for support.  If the major reason is lifestyle factors, then the very delayed onset should be taken as persuasive evidence that the genotype is, in fact, by itself not a very powerful predictor.

Why would the additive effects of some combination of GWAS hits lead to disease risk?  That is, in our complex nature why would each gene's effects be independent of each other contributor's?  In fact, mapping studies usually show evidence that other things, such as interactions, are important--but they are at present almost impossibly complex to understand.

Does each combination of genome-wide variants have a separate age-onset pattern, and if not, why not?  And if so, how does the age effect work (especially if not due to person-years of exposure to the truly determining factors of lifestyle)?  If such factors are at play, how can we really know, since we never see the same genotype twice? How can we assume that the time-relationship with each suspect genetic variant will be similar among samples or in the future?  Is the disease due to post-natal somatic mutation, in which case why make predictions based on the purported constitutive genotypes of GWAS samples?

Obviously, if long-delayed onset patterns are due not to genetic factors but to lifestyle exposures interacting with genotypes, then perhaps lifestyle exposures should be the health-related target, not exotic genomic interventions.  Of course, the value of genome-based prediction clearly depends on environmental/lifestyle exposures, and the future of these exposures is obviously unknowable (as we clearly do know from seeing how unpredictable past exposures have affected today's disease patterns).

The point here is that our reliance on genotypes is a very convenient way of keeping busy, bringing in the salaries, but not facing up to the much more challenging issues that the easy one (run lots of data through DNA sequencers) can't address.  I did not invent these points, and it is hard to believe that at least the more capable and less me-too scientists don't clearly know them, if quietly.  Indeed, I know this from direct experience.  Yes, scientists are fallible, vain, and we're only human.  But of all human endeavors, science should be based on honesty because we have to rely on trust of each other's work.

The scientific problems are profound and not easily solved, and not soluble in a hurry.  But much of the problem comes from the funding and careerist system that shackles us.  This is the deeper explanation in many ways.  The  paint on the House of Science is the science itself, but it is the House that supports that paint that is the real problem.

A civically responsible science community, and its governmental supporters, should be freed from the iron chains of relentless Big Data for their survival, and start thinking, seriously, about the questions that their very efforts over the past 20 years, on trait after trait, in population after population, and yes, with Big Data, have clearly revealed.

Saturday, October 6, 2018

And yet it moves....our GWAScopes and Galileo's lesson on reality

In 1633, Galileo Galilei was forced by the Inquisition to recant his ideas about the movement of the Earth, or else face the most awful penalty.  As I understand the story, he did recant....but after leaving the Cathedral, he stomped his foot on the ground and declared "And yet it moves!"  For various reasons, usually reflecting their own selfish vested interests, the powers that be in human society frequently stifle unwelcome truths, truths that would threaten their privileged well-being.  It was nothing new in Galileo's time--and it's still prevalent today.


Galileo: see Wikipedia "And yet it moves"
All human endeavors are in some ways captives of current modes of thinking--world-views, beliefs, power and economic structures, levels of knowledge, and explanatory frameworks.  Religions and social systems often, or perhaps typically, constrain thinking.  They provide comforting answers and explanations, and people feel threatened by those who do not adhere, who are not like us in their views.  The rejection of heresy applies far beyond formal religion.  Dissenters or non-believers are part of 'them' rather than 'us', a potential threat, and it is thus common if not natural to distrust, exclude, or even persecute them.

At the same time, the world is as the world really is, especially when it comes to physical Nature.  And that is the subject of science and scientific knowledge.  We are always limited by current knowledge, of course, and history has shown how deeply that can depend on technology, as Galileo's experience with the telescope exemplifies.

When you look through a telescope . . . . 
In Galileo's time, it was generally thought--or perhaps 'believed' is a better word--that the cosmos was God's creation as known by biblical authority.  It was created in the proverbial Genesis way, and the earth--with us humans on it--was the special center of that creation.  The crystal spheres bearing the stars and planets circled around and ennobled us with their divine light.  In the west, at least, this was not just the view, it was what had (with few exceptions) seemed right since the ancients.

But knowledge is often, if not perhaps always, limited by our senses, and they in turn are limited by our sensory technology.  Here, the classical example is the invention of the telescope and, eventually, what that cranky thinker Galileo saw through it.  Before his time, we had only our naked eyes to see the sun move, and the stars seemed quite plausibly to be crystal spheres bearing twinkles of light, rotating around us.

If you don't know the story, Wikipedia or many other sources can be consulted. But it was dramatic!  Galileo's experience taught science a revolutionary lesson about reality vs myth and, very directly, about the importance of technology in our understanding of the world we live in.

The lesson from Galileo was that when you look through a telescope you are supposed to change your mind about what is out there in Nature.  The telescope lets you see what's really there--even if it's not what you wanted to see, or thought you'd see, or would be most convenient for you to see.


Galileo's telescope (imagined).  source: news.nationalgeographic.com
From Mendel's eyes to ours
Ever since antiquity, plant and animal breeders empirically knew about inheritance, that is, about the physical similarities between parents and offspring.  Choose parents with the most desirable traits, and their offspring will have those traits, at least, so to speak, on average.  But how does that work?

Mendel heard lectures in Vienna that gave him some notion of the particulate nature of matter.  When, in trying to improve agricultural yields, he noticed discrete differences, he decided to test their nature in pea plants, which he knew about and which were manageable subjects for experiments to understand the Molecular Laws of Life (my phrase, not his).

Analogies are never perfect, but we might say that Mendel's picking discrete, manageable traits was like pre-Newtonians looking at stars but not at what controlled their motion.  Mendel got an idea of how parents and offspring could resemble each other in distinct traits.  Just as the telescope was the instrument that allowed Galileo to see the cosmos better, and to do more observing than guessing, geneticists got their Galilean equivalent in genomewide mapping (GWAS), which allowed us to do less guessing about inheritance and to see it better.  We got our GWAScope!

But what have we done with our new toy?   We have been mesmerized by gene-gazing.  Like Galileo's contemporaries who, finally accepting that what he saw really was there and not just an artifact of the new instrument, gazed through their telescopes and listed off this and that finding, we are on a grand scale just enumerating, enumerating, and enumerating.  We even boast about it.  We build our careers on it.

That me-too effort is not surprising, nor unprecedented.  But it has also become what Kuhn called 'normal science'.  It is butting our heads against a wall.  It is doing more and more of the same, without realizing that what we see is what's there, but we're not explaining it.  From early in the 20th century we had quantitative genetics theory--the theory that agricultural breeders have used in formal ways for that century, making traditional breeding, which had been around since the discovery of agriculture, more formalized and empirically rigorous.  But we didn't have the direct genetic 'proof' that the theory was correct.  Now we do, and we have it in spades.

We are spinning wheels and spending wealth on simple gene-gazing.  It's time, it's high time, for some new insight to take us beyond what our GWAScopes can see, digesting and understanding what our gene-gazing has clearly shown.

Unfortunately, at present we have an 'omics Establishment that is as entrenched, for reasons we've often discussed here on MT, as the Church was for explanations of Truth in Galileo's time.  It is now time for us to go beyond gene-gazing.  GWAScopes have given us the insight--but who will have the insight to lead the way?

Wednesday, October 3, 2018

In order to be recognized, you have to be read: an impish truth?

Edgar Allan Poe was an American short story writer, a master of macabre horror--the 3 G's, one might say: Grim, Gruesome, and Ghastly.  Eeeeek!! If you don't know Poe, a BBC World Service podcast in the series The Forum (Sept 15, 2018) discusses his life and work.  If you haven't yet, you should read him (but not too late at night or in too dark a room!).  The Tell-tale Heart, The Murders in the Rue Morgue, The Pit and the Pendulum, and The Cask of Amontillado should be enough to scare the wits out of you! Eeeeek!!

Edgar Allan Poe (1809-49)
Ah, scare tactics--what a ploy for attention!  At a time when not many people were supporting themselves with writing alone, Poe apparently wrote that this going over the top was justified or even necessary if you wanted to make a living as a writer.  If you have to sell stories, somebody has to know about them, be intrigued by what they promise, go out and buy them.

Is science also a fantasy horror?
Poe was referring to his use of extreme shock value in literature, stories of the unreal.  But a colleague in genetics once boasted that "anything worth saying is worth exaggerating, and worth repeating", and drum-beating essentially the same idea over and over is a common publication strategy in science.  This attitude seems antithetical to the ideals of science, which should, at least, be incompatible with showmanship for many reasons.

Explaining science and advocating one's view in responsible ways is part of education, and of course the public whose taxes support science has a right to know what scientists do with the money.  New ideas may need to be pressed against a recalcitrant public, or even scientific, community.  Nonetheless, pandering science to the public as a ploy to get attention or money from them is unworthy.  At the very least, it tempts exaggeration or other misrepresentations of what's actually known.  We regularly see the evidence of this in the outright fraud that is discovered, and also in yes-no-yes-again results (does coffee help or hurt you?).

This, I think, reflects a gradual, subtle but, for someone paying attention, substantial dumbing-down of science reporting, even by the mainstream news media--even the covers and news 'n views headlines of the major science journals approach checkout-counter magazines in this, in my view.  Is this only crass but superficial pandering for readership and viewership--for subscription sales--or could it reflect a serious degeneration in the quality of education itself, on which our society so heavily relies?   Eeeeek!!

In fact, showman scientists aren't new.  In a way, Hippocrates (whoever he was, if any single individual) once wrote a defensive article (On the Sacred Disease) in explicit competition for 'control' of the business of treating epilepsy, an effort to maintain that territory for medicine against competition from religion.  Centuries later, Galen was apparently well-known for public demonstrations of vivisection and so on, to garner attention and presumably wealth.

Robert Boyle gave traveling demonstrations of his famous air-pump, doing cruel things to animals to show that he created a vacuum.  Gall hustled his phrenology theory about skull shape and mental traits.  In the age of sail, people returning from expeditions to the far unknown gave lurid reports (thrills for paying audiences) and brought back exotica (dead and stuffed).  The captain of the Beagle, the ship on which Darwin sailed, brought live, unstuffed Fuegians back to England for display, among other such examples.

Yes, showman science isn't new.  And perhaps because of the various facets of the profit motive (now perhaps especially attending biomedical research) we see what seem to be increasingly common reports of corruption even among prominent senior (not just desperate junior) academic scientists.  This presumably results from the irresistible lure of lucre or the pressure for attention and prominence.  Getting funding and attention means having a career, when promotion, salaries, tenure, and prestige depend on how much rather than on what.  Ah, well, human fallibility!

The daily press feeds on, perpetuates (and profits from) simplistic claims of discovery along with breathless announcements that are often basically and culpably exaggerated promises.  Universities, hungry for grants, overhead, and attention, are fully in the game.  Showboat science isn't new, but I think it has palpably ballooned in recent decades.  Among other things, scientists intentionally, with self-interest, routinely sow a sense of urgency.  Eeeeek!!

So should there be pressure on scientists to quiet down and stop relentless lobbying in every conceivable way?  My personal (probably reactionary!) view is a definite 'yes!':  we should discourage, or even somehow penalize, showmanship of this sort.  The public has a right to know what they're paying for, but we should fund science without forcing it to be such a competitive and entrepreneurial system that it must be manipulated by 'going public', by advertising.  If we want science to be done--and we should--then we should support it properly.

In a more balanced world, if you're hired as a science professor, the university owes you a salary, a lab, and resources to do what they hired you to do.  A professor's job should not depend on being a sales agent for oneself and the university, as it very often is, sometimes rather explicitly today.  Eeeeek!!

The imp of the perverse--in science today
One of Poe's stories was The Imp of the Perverse.  The narrator remarks upon our apparent perverse drive to do just the opposite of what we think--or know--that we should do.

The Imp of the Perverse.  Drawing by Arthur Rackham (source: Wiki entry on the story)
I won't give any spoilers, since you can enjoy it for yourself.  (Eeeeek!!)  But I think it has relevance to today's attitudes in science.  Science should be--our self-mythology is that it is--a dispassionate search for the truth about Nature.  Self-interest, biased perspectives, and other subjective aspects of our nature are to be avoided as much as possible.  But the imp of our perverse is that it has become (quoth the raven) ever-more important that science be personally self-serving.  It is hard to prevent ourselves, our imp, from blurting out that truth (though it is often acknowledged quietly, in private).

On the good side, careers in science have become more accessible to those not from the societal elite.  The down side is that therefore we have to sing for our supper.  Darwin and most others of science lore were basically of independent means.  They didn't do science as a career, but as a calling.

Of course, as science has become more bureaucratic, bourgeois, and routine, Nature yields where mythology--lore, dogma, and religion--had held forth in the past.  So, it is not clear whose interest that imp is serving.  That's more than a bit unnerving!  Eeeeek!!

Science 'ethics': can they be mainly fictional, too?
Each human society does things in some way, and things do get done.  Indeed, having been trained as an anthropologist, perhaps I shouldn't be disturbed or surprised by the crass aspects of science--nor that this predictably includes increasingly frequent actual fraud egged on by the imp of the pressure of self-interest.  Eeeeek!!

Our mythology of 'science' is the dispassionate attempt to understand Nature.  But maybe that's really what it is: a myth.  It is our way of pursuing knowledge, which science, of course, does.  And in the process, as predecessors such as those I named above show, gaming science is not new.  So isn't this just how human societies are, imperfect because we're imperfect beings?  Is there reason to try, at least, to resist the accelerating self-promotion, and to put more resources not just to careers but to the substance of real problems that we ought to try to solve?

Or should we just admire how our scientists have learned to work the system--letting costly projects become entrenched, training excess research personnel, scaring the public about disease, or making glowing false promises to get them to put money in the plate every tax year?  In the process, perhaps real solutions to problems are delayed, and we produce many more scientists than there are jobs, because one criterion for a successful lab is its size.

Were he alive and a witness to this situation, Poe might have fun dramatizing how science, though wonderful for some, has become for many a horrible nightmare.  Eeeeek!!

Thursday, September 13, 2018

From Darwin's own thoughts. Part IV.

Here is the fourth and final installment of annotated quotes from Charles Darwin's autobiography (my comments in blue):

"As soon as I had become, in the year 1837 or 1838, convinced that species were mutable productions, I could not avoid the belief that man must come under the same law. Accordingly I collected notes on the subject for my own satisfaction, and not for a long time with any intention of publishing."

Darwin realizes that humans must have evolved, too.  There was no reason, from what he could observe, to except us.  But he knew the dangers of saying so!  We are still under some societal pressure to disavow evolutionary theory, not unlike in his day.  Denial is all around us....

"My strength will then probably be exhausted, and I shall be ready to exclaim "Nunc dimittis.""

That means something like 'Enough--time to go!'

"My mind seems to have become a kind of machine for grinding general laws out of large collections of facts, but why this should have caused the atrophy of that part of the brain alone, on which the higher tastes depend, I cannot conceive. A man with a mind more highly organised or better constituted than mine, would not, I suppose, have thus suffered; and if I had to live my life again, I would have made a rule to read some poetry and listen to some music at least once every week; for perhaps the parts of my brain now atrophied would thus have been kept active through use.  The loss of these tastes is a loss of happiness, and may possibly be injurious to the intellect, and more probably to the moral character, by enfeebling the emotional part of our nature."

Darwin bemoans his intellectual narrowness.  He elsewhere (as we've seen) refers to books or poetry he no longer reads or once liked.  How many of us are pressured to be technophilic workaholics, letting some of the deepest pleasures of life pass us by?  Take heed!

"....the 'Origin of Species  is one long argument from the beginning to the end, and it has convinced not a few able men."

"On the favourable side of the balance, I think that I am superior to the common run of men in noticing things which easily escape attention, and in observing them carefully."

A lesson might be not to fall into just accepting fads or what is selling these days, but to pay attention, for yourself, to the nature of what you are studying.

"This has naturally led me to distrust greatly deductive reasoning in the mixed sciences. On the other hand, I am not very sceptical,— a frame of mind which I believe to be injurious to the progress of science."

He means observe the world as it is, don't just accept theory or speculative ideas uncritically.  In not being skeptical he may mean that he believes there are truths out there, generalities or theories, that can be accepted if there is supporting evidence.

"So that here a belief— if indeed a statement with no definite idea attached to it can be called a belief— had spread over almost the whole of England without any vestige of evidence."

One can identify fads (like 'Big Data', 'precision genomic medicine', or other 'omics) that feed the System.  But maybe closer, clearer, more patient thinking and observing could serve our impatient, results-counting generation well instead.

"Lastly, I have had ample leisure from not having to earn my own bread."

Ah, to be wealthy!

.  .  .  .  .  .  .  .  .  .

My afterthoughts:
So, why did I extract and post these bits by Darwin?  I think he gives us a lot to think about.  This is not just about particular facts, or even the idea of evolution which was, in fact, 'in the air' despite Darwin's denial.

We might not have had exactly the same theory today as Darwin proposed it.  We have vastly more data, an understanding of inheritance, and even much better ideas about species distributions (for example, we know about continental drift).  We have a more subtle understanding of the roles of natural selection and chance (genetic drift).  We have much more information on the complex nature of genetic control of our traits.  A century and a half of research, framed by the notion of evolution, has shaped our investigations themselves.  But the central idea of evolution stands without doubt among scientists.  Nothing diminishes Darwin's patience, observations, experiments, persistence, dedication to using detail to build a general picture, and, by no means least of all, his honor.

Careerist pressures are today often, if not largely, antithetical to the personal traits that worked so well for Darwin.  So, aside from seeing the thoughts of one of the most insightful of all scientists, there are lessons for our own time in Darwin's reprise of his life.

Wednesday, September 12, 2018

From Darwin's own thoughts. Part III.

This is the third installment of my annotated selection of pithy quotes from Darwin's autobiography (my comments in blue):

Darwin and Wallace both felt inspiration, or a vital explanatory link, in Thomas Malthus' book:
"In October 1838, that is, fifteen months after I had begun my systematic enquiry, I happened to read for amusement 'Malthus on Population,' and being well prepared to appreciate the struggle for existence which everywhere goes on from long-continued observation of the habits of animals and plants, it at once struck me that under these circumstances favourable variations would tend to be preserved, and unfavourable ones to be destroyed. The result of this would be the formation of new species."

Darwin's reflections on his theory of evolution and his most important work:
"The solution, as I believe, is that the modified offspring of all dominant and increasing forms tend to become adapted to many and highly diversified places in the economy of nature."

"This shows how necessary it is that any new view should be explained at considerable length in order to arouse public attention."

"Though considerably added to and corrected in the later editions, it has remained substantially the same book....It is no doubt the chief work of my life."

"It has sometimes been said that the success of the 'Origin' proved "that the subject was in the air," or "that men's minds were prepared for it." I do not think that this is strictly true, for I occasionally sounded not a few naturalists, and never happened to come across a single one who seemed to doubt about the permanence of species."

Darwin elaborated the idea most fully and saw most deeply, but there were, in fact, ideas floating around, as Wallace (and others, too) proves.  The same is undoubtedly true of ideas we think about today, but we can find our like-thinking contemporaries--or predecessors--by web-searching (if we want to).

"Another element in the success of the book was its moderate size; and this I owe to the appearance of Mr. Wallace's essay; had I published on the scale in which I began to write in 1856, the book would have been four or five times as large as the 'Origin,' and very few would have had the patience to read it.

I cared very little whether men attributed most originality to me or Wallace; and his essay no doubt aided in the reception of the theory."

Darwin, ever the generous man.  But his societal advantages guaranteed his preeminence, and he was more thorough.  Wallace's ideas were more group- than individual-oriented, and may have some merit although Darwinites take pleasure in scorning them.  And nobody can refer to most of Darwin's books as other than quite long!

"Hardly any point gave me so much satisfaction when I was at work on the 'Origin,' as the explanation of the wide difference in many classes between the embryo and the adult animal, and of the close resemblance of the embryos within the same class. No notice of this point was taken, as far as I remember, in the early reviews of the 'Origin,' and I recollect expressing my surprise on this head in a letter to Asa Gray.

Darwin relied on embryological data and ideas from his predecessors, particularly von Baer in Germany.

"Towards the end of the work I give my well-abused hypothesis of Pangenesis. An unverified hypothesis is of little or no value; but if anyone should hereafter be led to make observations by which some such hypothesis could be established, I shall have done good service, as an astonishing number of isolated facts can be thus connected together and rendered intelligible."

Darwin is right about his idea of Pangenesis (the idea that each part of the body emits particles, which he called 'gemmules', that congregate in the gonads and contribute to the gametes) being of 'little or no value', and indeed it was very much like the Lamarckian view he sneered at.  But what is the role for free speculation in science?  Darwin's speculation was at least clearly labeled as such, since there were no data supporting 'gemmules' or 'pangenesis'.  Yet, as long as it is clearly marked as guessing, can intelligent guessing stimulate creative thought?  Is it better, or worse, than just saying "we have no idea how this works...."?

Tuesday, September 11, 2018

From Darwin's own thoughts. Part II.

This is part two, a continuation of my annotated selection of quotes from Charles Darwin's autobiography (my comments in blue):

"In July I opened my first note-book for facts in relation to the Origin of Species, about which I had long reflected, and never ceased working for the next twenty years."

Again, evidence of such patience!  And the result shows why one need not, perhaps should not, rush to publish.

(By Charles Lyell, the distinguished senior geologist and close Darwin friend): ""What a good thing it would be if every scientific man was to die when sixty years old, as afterwards he would be sure to oppose all new doctrines." But he hoped that now he might be allowed to live."

Keeping new and fresh while getting old?  Rarely!  Make your mark while you can....

"Our fixing ourselves here has answered admirably in one way, which we did not anticipate, namely, by being very convenient for frequent visits from our children."

Keeping priorities about what means most in life.

"I have therefore nothing to record during the rest of my life, except the publication of my several books."

On his very extensive many-years' work on barnacles:  
"Nevertheless,  I doubt whether the work was worth the consumption of so much time."

The question and how he mused about it:
"It was evident that such facts as these, as well as many others, could only be explained on the supposition that species gradually become modified; and the subject haunted me. But it was equally evident that neither the action of the surrounding conditions, nor the will of the organisms (especially in the case of plants) could account for the innumerable cases in which organisms of every kind are beautifully adapted to their habits of life— for instance, a woodpecker or a tree-frog to climb trees, or a seed for dispersal by hooks or plumes. I had always been much struck by such adaptations, and until these could be explained it seemed to me almost useless to endeavour to prove by indirect evidence that species have been modified.

I worked on true Baconian principles, and without any theory collected facts on a wholesale scale, more especially with respect to domesticated productions."

Darwin the observer, not the rusher to conclusions!

"I soon perceived that selection was the keystone of man's success in making useful races of animals and plants. But how selection could be applied to organisms living in a state of nature remained for some time a mystery to me."

Yes, agriculture has done it--changed species' traits--but how does it happen in Nature?

Monday, September 10, 2018

From Darwin's own thoughts. Part I.

I have just been re-reading Charles Darwin's autobiography.  He wrote it, with his son's encouragement, shortly before he died in 1882, and it was first published in 1887.  I think he wrote it to tell his children and descendants about his famous life.  Yet as famous as he had become, he is as modest as one would expect from that exemplar of the best of humanity.

I encourage anyone in the life sciences, who doesn't presume to think s/he already knows everything, to read it, for reasons I'll suggest below.  It exists in various versions, as his son Francis edited it a bit, redacting some personal family-related comments (these were later restored, but are unimportant here).  You can find it here.

I thought that some of the things he said would be worth posting on a site like this.  Darwin was right about many things, and even he was wrong about others (as, indeed, he himself freely says).  But it is his thinking, his perspective, standards, reasons, and outlook that are important.  So what follows are some quotes that I chose (easy to find by searching the ebook), with my reflections separated in italics and in blue.  (Because there are many pithy quotes, I've split this into four successive posts.)

"To my deep mortification my father once said to me, "You care for nothing but shooting, dogs, and rat-catching, and you will be a disgrace to yourself and all your family."

Darwin was an idler as a privileged young gentleman, but, to our great benefit, circumstances grabbed his attention and serious side. And he explains it thus:

"Looking back as well as I can at my character during my school life, the only qualities which at this period promised well for the future, were, that I had strong and diversified tastes, much zeal for whatever interested me, and a keen pleasure in understanding any complex subject or thing.


I mention this because later in life I wholly lost, to my great regret, all pleasure from poetry of any kind, including Shakespeare."

Darwin more than once admitted, or even bemoaned, his narrow focus and neglect of some of the finer things in life.  Yes, he was successful, but this could be a lesson for us all: keep a balance!

"I almost made up my mind to begin collecting all the insects which I could find dead, for on consulting my sister I concluded that it was not right to kill insects for the sake of making a collection."

Even Darwin saw the evil in killing other living things just to gawk at them.  We do it routinely, even including mammals (mice, etc.), but to salve our conscience (for those who have one) we get IACUC approval first, to keep their suffering under at least some constraint and prevent our suffering from lack of a project to do.

"This was the best part of my education at school, for it showed me practically the meaning of experimental science."

He observed rather than simply conjectured, and his patience and eye for detail and for identifying the critical variables were at the root of his success.

"...but to my mind there are no advantages and many disadvantages in lectures compared with reading."

Oops, professors!  Some of us do need to hear a message live and have it explained.  Darwin, though, had the drive, and patience, to study a subject in great detail.  How many of us have that?

"At this time I admired greatly the 'Zoonomia;' but on reading it a second time after an interval of ten or fifteen years, I was much disappointed; the proportion of speculation being so large to the facts given.

....in after years I have deeply regretted that I did not proceed far enough at least to understand something of the great leading principles of mathematics, for men thus endowed seem to have an extra sense."

He reasoned in his own way, and didn't really suffer for his lack of mathematical ability.  Maybe he was not misled by math's oversimplification and rigidity?  Maybe we rely far too much on the latter, as a safer and quicker course to 'results', than on patient, deeper thinking?

"During my last year at Cambridge, I read with care and profound interest Humboldt's 'Personal Narrative'."

"...science consists in grouping facts so that general laws or conclusions may be drawn from them."

If he was anything, it was a patient, careful sponge for detail.  And reading stimulated his original thinking.

"I heard that I had run a very narrow risk of being rejected, on account of the shape of my nose! He was an ardent disciple of Lavater, and was convinced that he could judge of a man's character by the outline of his features; and he doubted whether any one with my nose could possess sufficient energy and determination for the voyage."

Here he's talking about having almost been rejected by Fitzroy, the captain of the Beagle, for the voyage that became the basis of his life's work.  Beware of hoaxes even in science!  We see unwarranted speculative conclusions being asserted almost every week in the news media, and even in journals (though there, couched in dense professorialized terms!).

"The investigation of the geology of all the places visited was far more important, as reasoning here comes into play."

"Everything about which I thought or read was made to bear directly on what I had seen or was likely to see; and this habit of mind was continued during the five years of the voyage."

Again, his integrative, detail-sponging patience and reasoning.  No rush to conclusions (or to print).  Indeed, he waited some twenty years before publishing his ideas on evolution, and did so then only when prompted by Alfred Russel Wallace's discovery of the same ideas.

"The sight of a naked savage in his native land is an event which an never be forgotten."

(Here he's writing of being in Tierra del Fuego.  But he did not think of such people as inferior, as his experience later makes clear.)

"Nor must I pass over the discovery of the singular relations of the animals and plants inhabiting the several islands of the Galapagos archipelago, and of all of them to the inhabitants of South America."

We know how important that set of observations was!  The islands are still under close observation.

"But I was also ambitious to take a fair place among scientific men,— whether more ambitious or less so than most of my fellow-workers, I can form no opinion."

"...I am sure that I have never turned one inch out of my course to gain fame."

Ambition, yes--but egotism and show-boating, never: no rushing to the news media, no spin doctors!

Thursday, June 14, 2018

A new biomedical insight?

Here is a thoughtful and timely quote:
". . . . as no single disease can be fully understood in a living person; for every living person has his individual peculiarities and always has his own peculiar, new, complex complaints unknown to medicine—not a disease of the lungs, of the kidneys, of the skin, of the heart, and so on, as described in medical books, but a disease that consists of one out of the innumerable combinations of ailments of those organs. This simple reflection can never occur to doctors . . . . because it is the work of their life to undertake the cure of disease, because it is for that that they are paid, and on that they have wasted the best years of their life.  And what is more, that reflection could not occur to the doctors because they saw that they unquestionably were of use . . .  not because they made the patient swallow drugs, mostly injurious (the injury done by them was hardly perceptible because they were given in such small doses). They were of use, were needed, were indispensable in fact (for the same reason that there have always been, and always will be, reputed healers, witches, homÅ“opaths and allopaths), because they satisfied the moral cravings of the patient . . . . They satisfied that eternal human need of hope for relief, that need for sympathetic action that is felt in the presence of suffering, that need that is shown in its simplest form in the little child, who must have the place rubbed when it has hurt itself. The child . . . . feels better for the kissing and rubbing. The child cannot believe that these stronger, cleverer creatures have not the power to relieve its pain. . . ."
The language seems a bit arcane, and this is a translation, but its cogency as a justification for today's Big Data feeding frenzy is clear.  People who are ill, or facing death, will naturally grasp at whatever straws may be offered them.  In one way or another, this has been written about even back to Hippocrates.

Of course, palliation or cure of what disorders can be eased or cured should be the first order and obligation of medicine.  Where nothing like that is clearly known, trials of possible treatments are surely in order, provided the patient understands at least the basic nature of the research--for example, that some participants are being given placebos while others receive the treatment under investigation.  Science doesn't know everything, and we often must learn the hard way, by trial and error.

Given that, perhaps the most important job of responsible science is to temper its claims, and to offer doses of the reality that life is a temporary arrangement, and that we need to get the most out of the bit of it we are privileged to have.  So research investment should be focused on tractable, definable problems, not grandiose open-ended schemes.  But promises of the latter are nothing new to society (in medicine or other realms of life).

The problem with false promises, by preachers of any type, is that they mislead the gullible, and in many cases this is known by those making the promises--or could and should be known.  The role of false promise in religion is perhaps debatable, but its role in science, while understandable given human ego and the struggle for attention, careers, and funding, is toxic.  People suffering from poverty, hardship, or disease seek and deserve solace.  But science needs to be protected from the temptations of huckstering, so that it can do its very important business as objectively as is humanly possible.

By the way, the quote is from about 150 years ago, from War and Peace, Tolstoy's 1869 masterpiece about the nature of causation in human affairs.