Tuesday, September 30, 2014

Notes for a late-summer's day.....

Well, it's a slow day, one of the last warm ones of the year, and I'm about to go for a bike ride, but before I go there are a few little things I'll quickly comment on.

One is the banner headline in the 12 September Science about a huge aquatic dinosaur, proclaiming "Giant dinosaur was a terror of Cretaceous waterways!"  The Hollywood image of a massive terror is illustrated in the story; see below.  Wow!  One need not doubt that this was a nasty beast and a terror to its prey.  But this is purportedly a science journal, not a Hollywood ad vehicle.  Most species have predators that prey on them, and for those species, the predators are surely terrors.  In this case, only those relatively few species big enough to be seen by, and yet not agile enough to escape from, the giant dinosaur would be the ones experiencing terror.  And, in fact, this huge predator of the waterways probably left the vast majority of waterway species alone and was not a terror for them at all!  The same applies to the Cretaceous as to any period in the history of life.


Artist's imagination, from the Science article on their website

We can wonder at how such a huge monstrosity of a species could evolve, much less swim, given the energetic, thermal, and mechanical challenges of being that big, not to mention of finding enough to eat. But we don't need the melodramatic drawing to make the point (after all, what was found were bones, not the flesh).  We may learn from the history this shows, but could do without the histrionics.

Next on the list is the dramatic cover of Nature of 18 September.  The cover story reports results that use genomic sequence data to argue that modern Europeans carry genes from three different ancestral populations.  The massive author list is another characteristic of modern science, whereby if you so much as walk by the office where someone is working on the paper and ask what they're doing, you qualify as an author and can report to your Chair and Dean that you have a cover article in Nature.


Nature cover, 18 September 2014, from their website
The story reports that genetic data on modern European variation, together with some ancient DNA sources, suggest a melding of people migrating (or gene-flowing) in from three basic areas: a northern or Siberian source, a more expected western European one, and a Middle Eastern one.  Taking place millennia ago, and/or over millennia, this history means that Europeans do not constitute a standing discrete 'race', but are a mix, the result of population history.

There is absolutely nothing wrong with this paper (as far as we can tell).  But note that the cover illustrations (this is, after all, a science journal) are again artist's reconstructions of our supposed distant ancestors, not the data themselves.  This is taking reifying liberties with the science, something Nature does on a regular basis, both with illustrations and with catchy-cute pun-laden story ads on their covers.

One can, and we think should, ask why this is in any way a Nature cover story, unless it is just more Hollywood and advertising.  Interesting as it is, it is no breakthrough of modern science (the issues and evidence have been building for decades--yes, decades).  This story should be in a specialist journal (like the Am. J. Physical Anthropology), where before our marketing age it would have been.  There, it would be seen by other anthropologists with expertise in human migration history, vetted for other interpretations if any, and would have become part of our knowledge of human origins, and material for textbooks.  But it just doesn't warrant being splashed as if it were some sort of innovative discovery.  Again, the nature of science has changed, and one can ask why we have moved in this pop-sci direction, and whether that's good.

Are we just too, too stodgy?  Probably.  But isn't the translation of science to videographic presentation a pandering to a bourgeois culture that is bored with details?  Will it grab people and draw them into science?  Even if it does the latter, which it may well, doesn't it give the impression that the daily work of science is exciting, and the technical demands of doing good science minimal?  Or is it mainly a way to employ graphic artists and sell magazines?  Of course, this is how our culture works in the Advertising Age, so the answer is probably moot.

Anyway, back out to the bike path and garden, on a day too nice to think about anything very technical, enjoying fall before it passes, and (today at least) getting out before our town is overrun with drunken football fans.

Monday, September 29, 2014

Predicting future environments: it's impossible in principle

We often caution here on MT about predicting the future.  There's an idea afoot that once we know our genomes we'll be able to predict who will get what chronic disease, even when the disease is the result of genes interacting with environmental factors.  Or even multiple genes and multiple environments.  But, as we've often said, making such predictions with known accuracy is impossible in principle.  And here's why.

A BBC television series in the 1970's, "A Taste of Britain," hosted by Derek Cooper, set out to document traditional foods and food practices in Britain before they disappeared.  The long-running BBC Radio 4 show, "The Food Programme," revisits this series by traveling to the places documented 40 years ago to find out what became of the foods, the people, and the traditions the original program had captured on film.  "A Taste of Britain Revisited" is a fascinating glimpse into traditional foods and how they've fared in the intervening years.

Cooper urgently tracked down cockle pickers, sewin fishermen, truffle hunters, cheese makers, colliers' wives, makers of Yorkshire Pudding, makers of Welsh cakes and laverbread, and more, certain that the traditions he was documenting would soon be lost.

Cockle pickers, Wales: Daily Mail, 2008

Fishing for sewin (sea trout) in traditional coracles, Wales; Wikipedia
And some were.  Once miners were moved into council housing in Yorkshire, away from coal-fired ovens, Yorkshire Pudding was never the same.  Native English crayfish will soon be extinct because a hardier North American variety has taken over, bringing with it a virus that kills the natives.  Cooper filmed the last traditional hearth in Wales, which was replaced by something more modern just weeks later.

And some have changed. The traditional Dorset Blue Vinny cheese that Cooper was told he was eating was in fact a second-rate Stilton, the real thing not having been made since World War II when the making of soft and semi-soft cheeses was banned, and only hard, transportable cheeses like cheddar could be made, to be sent to the cities.  But the recipe for Blue Vinny was found in the 1980's, and the cheese revived.  Cockles were in short supply in the 1970's and had to be imported.  The numbers increased for a while after that, fell and increased again, but now aren't as plentiful as they were, and cockle gatherers don't go to the beach by horse and cart any longer.

Horse and Cart on Beach, William Ayerst Ingram; Wikipedia

And some have remained the same, but the economics have changed.  Welsh laver, or seaweed, is now being sold in Japan, when just 40 years ago getting it to London was a major accomplishment.  Businesses have been bought up by foreign companies.  A Spanish company now buys British cockles, which are canned and sold in Spain.  "Cans is something we never thought we'd put cockles into, but if that's the way they want them, that's the way they can have them."

So, Cooper wasn't entirely right that he was documenting the final days of many traditions and foods, though he was certainly documenting change.  I don't know what he thought was going to replace traditional foods, but perhaps the biggest change over the last 40 years was the rise of processed foods.  Whether or not he foresaw that I don't know, but even if he or anyone else had, the health effects couldn't have been predicted.  Indeed, even with all the evidence before us, we can't really say what the health effects of any of the specific changes have been.  Yes, people are more obese, have more heart disease, stroke, hypertension, cancers and so on, but in a very real sense, a major 'cause' of these diseases is control of infectious diseases.  

But, that's not the real point of this post.  The point is that it looks as though the common, late onset chronic diseases we are dying of now are the result of complex interactions between genes and environment.  Epidemiologists have been trying to identify environmental risk factors for decades, with only modest success -- meaning that it's not clear that we know which aspects of our environments are linked with risk.  But even if we did, these two series of programs on traditional foods in Britain make it clear that while we may fit today's disease cases to yesterday's exposures, it's impossible to predict what people will be eating not far into the future.  So even if we do identify risk alleles and risky environments, we don't know that carriers of those alleles will still be exposed to risky environments not many years from now.  We can't predict their disease with more than retrospective accuracy, when we know from experience that risks of the same traits change rapidly and substantially.

Here's an example.  Type 2 diabetes has risen to epidemic levels in Native American populations, and in Mexican Americans who are admixed with Native Americans, since World War II.  The nature of the disease and the pattern of the epidemic suggest that a fairly simple genetic background may be responsible.  But, even if so, that's only in the provocative environment of the last 60 or so years.  That's because there was little to no type 2 diabetes in Native Americans, or Mexican Americans, before then, yet whichever alleles are responding now, leading to glucose intolerance and so forth, have basically not changed in frequency.  It's the environment -- diet and exercise, presumably -- that has changed.  This epidemic could simply not have been predicted 60 years ago, even if we'd been genotyping people then, because whatever risk alleles are now responsible were not causing disease before the environment changed.

So, the promise that genotyping every infant at birth will allow us to predict the late onset, complex diseases they will eventually have is unlikely to be met.  Instead, for many or most complex traits, it's an illusion that's being sold as fact.

Wednesday, September 24, 2014

What are scientific 'revolutions'? Part II: Quantitative paradigm adjustments

As we described yesterday, when scientists are doing what Thomas Kuhn referred to as 'normal' science, we are working within a given theoretical framework--or 'paradigm' as he called it--and using new technologies, data, and approaches to refine our understanding of Nature in terms of that framework. In biology now, normal science is couched within the evolutionary framework, the acceptance of descent with modification from a common ancestor.  We might tweak our views about the importance of natural selection vs drift, say, but that doesn't change the paradigm.

But there are energetic, and sometimes fierce, discussions about just how we should go about doing our work.  These discussions often involve the basic statistical methods on which our inferences rest.  We've talked about the statistical aspects of life science in numerous past posts.  Today, I want to write about an aspect that relates to notions of 'revolution' in science, or what Kuhn called paradigm shifts.  What follows is my own view, not necessarily that of anybody else (including the late Kuhn).

xkcd

For many if not most aspects of modern science, we express basic truths mathematically in terms of some parameters.  These include values such as Newton's gravitational constant and any number of basically fixed values for atomic properties and interactions.  Such parameters of Nature are not known with perfect precision, but they are assumed to have some universally fixed value, which is estimated by various methods.  The better the method or data, the closer the estimate is held to be, relative to the true value.  Good science is assumed to approach such values asymptotically, even if we can never reach that value without any error or misestimation.

This is not the same as showing that the value is 'true', or that the underlying theory that asserts there is such a value is true.  Most statistical tests evaluate data relative to some assumed truth or property of truth, or some optimizing criterion given our assumptions about what's going on, but many scientists think that viewing results this way is a conceptual mistake.  They argue that our knowledge leads only to some degree of confidence, a subjective feeling, about an interpretation of Nature. Using approaches generally referred to as 'Bayesian', it's argued that all we can really do is refine our choice of properties of nature that we have most confidence in.  They rarely use the terms 'belief' or 'faith' in the preferred explanation, because 'confidence' carries a stronger sense of an acceptance that can be systematically changed.  The difference between Bayesian approaches and purely subjective hunches about Nature is that Bayesian approaches have a rigorous and in that sense highly objective format.

This comes from a famous rearrangement of the basic laws of probability, credited to Thomas Bayes, and it goes like this:

     p(Hypothesis|Evidence) = p(Evidence|Hypothesis) p(Hypothesis) / p(Evidence)

This says that the probability of some Hypothesis we may be interested in, given the Evidence, is equal to the probability of that Evidence if the Hypothesis were true, times the prior probability we have in mind for the Hypothesis, all divided by the overall probability of the Evidence.  That is, there must be a lot of ways that the Evidence might arise (or we'd already know our Hypothesis was true!), so the denominator sums the probability of the data if H is true, weighted by your prior probability that it is, plus the probability of the data if an alternative to H is true, weighted by the probability that H is not true.  It's somewhat elusive, but here's an oversimplified example:

Suppose we believe that a coin is fair.  But there's a chance that it isn't.  In advance of doing any coin-flipping, we might express our lack of knowledge by saying the chance that the coin is fair is, say, 50%, since we have no way to actually know if it is or isn't.  But now we flip the coin some number of times.  If it's fair, the probability of it coming up Heads is 50%, or p(Heads) = 0.5 per flip.  But suppose we observe 60% Heads.  A fair coin could yield such results, and we can calculate the probability of that happening.  But an unfair coin could also generate such a result.

For simplicity, let's say we observe HHHTT.  For a fair coin, with p(Heads) = 1/2, the probability of this result is (1/2)(1/2)(1/2)(1/2)(1/2) = 0.031, but if the coin is unfair in a way that yields 60% Heads, the probability of this result is (0.6)(0.6)(0.6)(0.4)(0.4) = 0.035.  Using the formula above, the probability that the coin is fair actually drops from 50% to about 47%: we're less confident about the coin's fairness.  If we kept flipping and getting such results, that value would continue dropping, as we became less confident that it's fair and increasingly confident that its true probability of Heads was 0.6 instead of 0.5.  We might also have started with a different prior probability that the coin is fair--say, zero, or 1/8, or 0.122467--that is, any value from zero (no chance it's fair) to 1.0 (completely sure it's fair).
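
To make the arithmetic concrete, here is a minimal sketch in Python (our illustration, not anything from the post) that applies Bayes' theorem to the HHHTT example and then keeps updating with further batches of the same 60%-Heads data; the function names and the batch sizes are purely illustrative choices.

    # Two hypotheses about the coin: fair (p = 0.5) vs biased toward 60% Heads.
    def likelihood(p_heads, heads, tails):
        """Probability of this particular sequence if p_heads were the true bias."""
        return (p_heads ** heads) * ((1 - p_heads) ** tails)

    def posterior_fair(prior_fair, heads, tails, biased_p=0.6):
        """Bayes' theorem: p(fair | data) = p(data | fair) p(fair) / p(data)."""
        like_fair = likelihood(0.5, heads, tails)
        like_biased = likelihood(biased_p, heads, tails)
        evidence = like_fair * prior_fair + like_biased * (1 - prior_fair)
        return like_fair * prior_fair / evidence

    # One update with the HHHTT data (3 Heads, 2 Tails), starting from a 50% prior:
    print(round(posterior_fair(0.5, 3, 2), 3))   # ~0.475: confidence in fairness drops

    # Keep flipping: each new 3-Heads/2-Tails batch uses the last posterior as the new prior.
    p = 0.5
    for _ in range(10):
        p = posterior_fair(p, 3, 2)
    print(round(p, 3))   # ~0.27, and still falling as the 60%-Heads pattern persists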

The basic idea is that we have some prior reason, or probability (p(H)) that the Hypothesis is true and we gather some new Evidence to evaluate that probability, and we adjust it in light of the new Evidence.  The adjusted value is called the posterior (to the new data) probability of the Hypothesis, and Bayes' theorem provides a way to make that adjustment.  Since we assume that something must be true, Bayes' formula provides a systematic way to change what we believe about competing explanations.  That is, our prior probability is less than 1.0 (certainty of our Hypothesis) which implies that there are other hypotheses that might be true instead.  The use of Bayes' theorem adjusts our confidence in our specified Hypothesis, but doesn't say or show that it is true.  Advocates of a Bayesian approach argue that this is the reality we must accept, and that Bayesian approaches tell us how to get a best estimate based on current knowledge.  It's always possible that we're not approaching truth in any absolute sense.  

A key aspect of the Bayesian view of knowledge is that the explanation is about the probability of the data arising if our preferred explanation is true, accepting that it might or might not be.  It assigns quantitative criteria to alternative explanations whose relative probability can be expressed--that is, each of the possible hypotheses has a probability (a value between zero and 1), and their sum exhausts all possibilities (just as Heads and Tails exhaust the possible flip outcomes, or a coin must either be fair or not-fair).
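
The same bookkeeping extends beyond two alternatives.  As a purely illustrative sketch (again ours, with an arbitrary grid and a uniform prior), one can treat every candidate Heads-probability as its own hypothesis and let Bayes' theorem spread the posterior across all of them:

    # Posterior over a grid of hypotheses about the coin's Heads-probability.
    heads, tails = 3, 2                          # the HHHTT data again

    grid = [i / 10 for i in range(11)]           # candidate Heads-probabilities 0.0 ... 1.0
    prior = [1 / len(grid)] * len(grid)          # uniform prior: no candidate favored

    likelihood = [p**heads * (1 - p)**tails for p in grid]
    evidence = sum(l * pr for l, pr in zip(likelihood, prior))
    posterior = [l * pr / evidence for l, pr in zip(likelihood, prior)]

    for p, post in zip(grid, posterior):
        print(f"p(Heads) = {p:.1f}   posterior = {post:.3f}")
    # The posteriors still sum to 1; candidates near 0.6 gain weight, the extremes lose it.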

OK, OK so what does this have to do with scientific 'revolutions'?
The basic idea of Bayesian analysis is that it provides a technically rigorous way to express subjective confidence in a scientific context.  It provides a means to use increasing amounts of data to adjust the level of confidence we assign to competing hypotheses, and identify the Hypothesis that we prefer.

This is a good way to express confidence rather than a yes-no illusion of ultimate truth, and it has found widespread use.  However, its use does depend on whether the various aspects of experiments and hypotheses can adequately be expressed in probabilistic terms that accurately reflect how the real world is--and, for example, important causal components may be missing, or the range of possibilities may not be expressible in terms of probability distributions.

I am by no means an expert, but a leading proponent of Bayesian approaches, the late ET Jaynes, said this in his classic text on the subject (Probability Theory: The Logic of Science, Cambridge University Press, 2003):
Before Bayesian methods can be used, a problem must be developed beyond the 'exploratory phase' to the point where it has enough structure to determine all the needed apparatus (a model, sample space, hypothesis space, prior probabilities, sampling distribution).
This captures the relevant point for me here, in the context of the idea of scientific revolutions or paradigm shifts.  I acknowledge that in my personal view, and this is about philosophy of inference, such terms should be used only for what is perhaps their original reference, the major and stunning changes like the Darwinian revolution, and not the more pedestrian applications of everyday scientific life that are nonetheless casually referred to as revolutions.

These issues are (hotly) debated, but I feel we should make a distinction between scientific refinement and scientific revolutions.  To me, Bayesian analysis is a systematic way to refine a numerical estimate of the relative probability of an idea about Nature compared to other ideas that could be correct.  The probability assigned to the best of these alternatives should rise asymptotically with increased amounts of data (as schematically shown in the figure below), unless something's wrong with the conceptualization of the problem.  I think this is conceptually very different from having a given scientific 'paradigm' replace another with which it is incommensurable.



Where it's useful, Bayesian analysis is about altered ideas among what are clearly commensurable hypotheses--ones based on different values of the same parameters.  Usually, the alternative hypotheses are not very different, in fact: for example, a coin has some bias in its probability of Heads, ranging from never-Heads (zero) through fair (50%) to always-Heads (1.0); but we assume such things as that the flips are all done the same way and that the results generated by flipping are probabilistic by nature.

In my view, Bayesian analysis is a good way to work through issues within a given theoretical framework, or paradigm, and it has many strong and persuasive advocates.  But it is not a way to achieve a scientific revolution, nor does it reflect one.  Sometimes the idea is used rather casually, as if formal Bayesian analysis could adjudicate between truly incomparable ideas; there, to me, we simply must rely on our subjective evaluations.  One can't, of course, predict when or even whether a truly revolutionary change--a paradigm shift, if you will--will occur, or even if such is needed.

Ptolemaic epicycles added accuracy to the predictions of planetary motion, at the price of being cumbersome.  One could have applied Bayesian analysis to the problem at the time, had the method been available.  The Copernican revolution changed the basic structure of the underlying notion of what was going on.  One might perhaps construct a Bayesian analysis that would evaluate the differences by somehow expressing planetary positions in probabilistic terms in both systems and allow one to pick a preference, but I think this would be rather forced--and, most importantly, a post hoc way to evaluate things (that is, only after we have both models to compare).  In fact, in this case one wouldn't really say one view was true and the other not--they are different ways of describing the same motions of bodies moving around in space relative to each other, and the decision of how to model that is essentially one of mathematical convenience.

I think the situation is much clearer in biology.  Creationist ideas about when and where species were created or how they related to each other in terms of what was called the Great Chain of Being, could have been adjusted by Bayesian approaches as, for example, the dates of fossils being discovered could refine estimates of when God created the species involved.  But Bayesian analysis is inappropriate for deciding whether creationism or evolution is the best hypothesis for accounting for life's diversity in the first place.  The choice in both approaches would be a subjective one, but without artificial contortions the two hypotheses are not probabilistic alternatives in a very meaningful sense. That's what incommensurability, which applies in this case I think, implies.  You can't very meaningfully assign a 'probability' to whether creationism or evolution is true, even if the evidence is overwhelmingly in favor of the latter.

Current approaches
These posts express my view of the subject of scientific theory, after decades of working in science during periods of huge changes in knowledge and technology.  I don't think that scientific revolutions are changes in prior probabilities, even if they may reflect them; they are something more, and different, than that.  From this viewpoint, advocates for Bayesian analysis in genomics are refining, but not challenging, the basic explanatory framework.  One often hears talk of paradigm shifts and use of similar 'revolution' rhetoric, but basically what is happening is just scaling up our current "normal science", because we know how to do that, not necessarily because it's a satisfactory paradigm about life.  And there are many reasons why that is what people normally do, more or less as Kuhn described.  I don't think our basic understanding of the logic of evolution or genetics has changed since I was a graduate student decades ago, even if our definition of a gene, or modes of gene frequency change, or our understanding of mechanisms have been augmented in major ways.

It is of course possible that our current theory of, say, genomic causes of disease is truly true, and what we need to do is refine its precision.  This is, after all, what the great Big Data advocacy asserts: we are on the right track, and if you just give us more and more DNA sequence data we'll get there, at least asymptotically.  Some advocate Bayesian approaches to this task, while others use a variety of different statistical criteria for making inferences.

Is this attitude right for what we know of genomics and evolution?  Or is there reason to think that current "normal science" is pushing up against a limit, and that only a true conceptual revolution, one essentially incommensurate with, or not expressible in the terms of, our current models, will move us forward?  In past posts, we've suggested numerous reasons why we think current modes of thought are inadequate.

It's all too easy to speak of scientific revolutions (or to claim, with excitement, that one is in the midst of creating one if only s/he can have bigger grants, which is in fact how this is usually expressed).  It's much harder to find the path to a real conceptual revolution.

Tuesday, September 23, 2014

What are scientific 'revolutions'? Part I: Qualitative paradigm shifts

How do we know what we think we know?  And how do we know how close we are to the 'truth'? These are fundamental questions in life, and especially in science where we expect to be in pursuit of the truth that we assume exists.  We build our work upon an accepted body of trusted knowledge, one that we first spend many years learning, and then even more years contributing to.  But there are always facts that don't quite fit the existing paradigm -- or don't fit at all -- and these can be wrong, or they can make a revolution.

In 1962, Thomas Kuhn published The Structure of Scientific Revolutions.  He built on his earlier work in the history and philosophy of science, The Copernican Revolution (1957), which analyzed the way the sun-centered Copernican view of planetary motion replaced the long-standing Ptolemaic earth-centered view, as an example of how scientific understanding of the world can change.

In a nutshell, Kuhn says that scientists at any given time usually work within a model or theory, or paradigm as he referred to it, that explains their findings.  This paradigm guides what we do every day as we work away at what Kuhn called "normal science".  There are always unexplained or even apparently contradictory facts that don't easily fit into our working theory, but we do our very best in normal science to fit, or shoe-horn, these anomalies into our current paradigm.  Occasionally, when the lack of fit becomes too great, a 'revolution,' essentially a new theory, is proposed, usually based on a new finding or a new way of synthesizing the data that does a better job of accounting for the anomalies in question (even if it may account for some other known facts less well).

The new theory dramatically, and at one fell swoop, accounts for hosts of facts that hadn't fit into the previous working paradigm, including the apparent anomalies.  A key point we'll discuss below is that the new view is not just a quantitative improvement in, say, measurement accuracy or something like that.  It's not technology.  Instead, a defining characteristic is that the new view is "incommensurate" with the view it replaced: you cannot express the new view in terms of its predecessor.  It is quickly adopted by the profession in what Kuhn termed a "paradigm shift", which becomes the tool of a new phase of 'normal science'.  This was what he called a scientific 'revolution'.

Motion of Sun, Earth, and Mars according to heliocentrism (left) and to geocentrism (right), before the Copernican-Galilean-Newtonian revolution. Note the retrograde motion of Mars on the right. Yellow dot, Sun; blue, Earth; red, Mars.
(In order to get a smooth animation, it is assumed that the period of revolution of Mars is exactly 2 years, instead of the actual value, 1.88 years). The orbits are assumed to be circular, in the heliocentric case. Source: Wikipedia, Copernican Revolution

The view that the earth was part of the solar system fundamentally changed the way planetary motion was accounted for.  In the older Ptolemaic system, movements that were supposed to be perfect circles in the perfect spheres of the heavens did not fit astronomical observations.  So occasional little circles of movement (called epicycles) were invoked to explain observations and make predictions more accurate and consistent.  But if the sun were viewed as the system's center, then one could account for the motions with ellipses and no epicycles.  Refinements were to come along with Kepler and Newton, and Tycho Brahe showed that geocentric mathematics could also work, with a "geo-heliocentric" system in which the Sun and Moon orbit the Earth but the other planets go around the Sun (see Wikipedia: Tycho Brahe).

There have been other examples that reflect the basic Kuhnian idea: Darwin's evolutionary theory replaced one of special creation of the earth's species; quantum theory and relativity added truly revolutionary ideas about space, time and even causal determinism; plate tectonics (continental drift) replaced a diversity of ad hoc accounts for geological forms and changes; and so on.  The basic notions of normal science, working paradigms, and essentially incommensurable replacement of one theory by another may be criticized in detail, but Kuhn's way of explaining the dynamics of science has much to recommend it.

The phrase "paradigm shift" has become canonized in modern science parlance.  It glamorizes the genius (Copernicus, Einstein, Darwin) who was responsible for the change of view, often neglecting others who had roughly the same idea or whose work triggered the iconic figure's work.  And for that reason, and because scientists are mainly middle class drudges who need to feel important, we throw the phrase around rather loosely (often referring to our own work!).  We speak of scientific revolutions now rather casually as if they are occurring, whenever some new finding or technology comes along.  But is that justified?

Generalizations about classical 'paradigm shifts' and revolutions in science
We were led to write about this because of comments on our recent post on the faith component of science, having to do with how we in science view what we think we know.  This and the following post tomorrow are reflections on the issue, and not intended as an argument with the commenter.

A key relevant question is how we decide that what we assert today is better than what we said yesterday.  If it is different, but not better, then where can we find a sense that we know more, or are closer to the truth?  What if there is no single truth that we hope science is asymptotically approaching--with each new discovery getting closer to a perfect understanding?

At least one aspect of the answer lies in the idea of incommensurability between 'paradigms' as opposed to accuracy within a given paradigm.  Here, I'll focus on genetics and evolution, fields I know at least something substantial about.

Prior to Darwin, in Western culture the prevailing view of life was that species had been individually created by God for His own reasons.  Species might change under husbandry and so on, but they were basically static (though they might become extinct, again for some reason in God's plan), and they didn't morph one into another.  After Darwin, species were viewed as the result of a historical physical process, evolution taking place over time due to physical constraints (natural selection).  In a Darwinian view one cannot measure the nature or arrival of species in terms of events of special creation.  Humans cannot be viewed as specially created at the Beginning with the rest of life created for our use.  Evolution is not just a quantitative description of special creation.  The two views are incommensurable.

In the new 'paradigm', everything changed.  Species and their traits are viewed in terms of their usage history and context-specific factors that affected which forms could succeed better than others at the time, not in any external Creator's eye.  Evolution was truly a revolutionary change in the understanding of global diversity in life.  It has had at least as much impact as any other revolutionary conceptual change in any science.  But is it more 'true', or has it given us the truth about life?

Of course, even if the process of speciation is an historical one that takes place gradually by Darwinian means, each species must arise at some specific time.  Is this so different?  Yes!  It's different first because the definition of 'species' is a human-imposed cultural one, and because the many processes that could lead populations to be mating-incompatible (the usual definition of 'species') may arise by single events (mutations in chromosome regions required for mating, for example), but these were historical, random changes in DNA, not guided from without with any purpose.  And generally, diversity accumulates along with mating incompatibility, gradually.

And what about natural selection?  It is the accepted theoretical origin of complex traits in living species.  It is a gradual process, even if each life or death or conception may be a discrete event in time and place.  And, after Darwin, we have had to add chance (genetic drift) into the picture of how genomic structures, and what they cause, have changed over time.  But such additions modify, but do not at all overthrow, the idea of evolution.  They introduce no paradigm shift.

Nor does the discovery that chromosomes contain more than just protein-coding DNA sequence--they have regulatory sequences, sequences involved in DNA's own packaging, and so on.  The idea of gene regulation, or of genes being made of discrete, interrupted sequence regions (coding exons, introns, etc) added new theoretical elements to biology, but they are entirely commensurable with prior views that were non-specific as to just what genes 'are'.  The discovery of the base-pairing nature of DNA and its use of a code for protein sequence and other functions added to our understanding of life, and produced a new theory of genetic causation.  But that theory didn't replace some earlier specific theory about what genes were.  None of this in any serious way was a paradigm shift, even though these discoveries were of momentous importance to our understanding of life.

And then there's the origin of 'life'.  Mustn't that, too, have had a moment of creation?  Biochemists will have to assert that the possibility has always existed since the beginning of the cosmos, but that only when the right ingredients (molecules, pH, temperature, etc.) existed at the same time and place did life start.  It may have had countless molecular origins, but here on earth at least only one such led to life as we know it today.  That is, in a sense, a theory of a moment of occurrence--though not of 'creation'.  So in our modern view it's part of the historical process that is life.

So, biology has had its scientific revolution, and one that shook the earth in very Kuhnian terms. But whether we are closer to the 'truth' about what life is, is itself a rather vague or even unanswerable question.  As technology advances, we could be getting a better and better understanding, and a more complete explanation of the essential nature of life.  Or, forces at work within organisms might be discovered which will lead to fundamentally different kinds of understanding of life.  How can we ever know unless or until that happens?

One way to rephrase this question is to ask whether we can know how 'close' we are to understanding the truth.  We can compare origin theories from many different cultures, including our own Biblical one, but we can't really concoct a quantitative measure of how true they are even relative to each other.  In a sense, all have zero truth except evolution, but that's not very useful, because we have no way to know what new idea may come along to challenge the one that we now believe to be true.  Of course, some people, even some who are otherwise scientists, accept religious explanations and will simply not acknowledge what I've been saying, because they hold an incommensurable truth that cannot be compared in this way to evolution, other than by forced contortions such as that the Bible should be taken metaphorically and the like.  Others have a mystical view of universal unity and reincarnation, etc., which, like Biblical explanations, cannot really be compared because it doesn't attempt to explain the same things.

But there is another very different way to view scientific progress, typically referred to by the term 'Bayesian', which is often implicitly equated with 'revolutions' or 'paradigm shifts', as a systematic rather than episodic way for scientific truth to become known.  We'll discuss that tomorrow.

Monday, September 22, 2014

Nature vs nurture, and human behavior: where’s the science?

Despite more than a century of work in genetics, sociology, psychology, anthropology, economics, and other disciplines that focus on human behavior, the turning of the Nature-Nurture cycle continues.  Is human behavior hard-wired in our genomes, or is it a pattern of responses created from birth onward by experience?  How can it be, given the masses of data that state-of-the-art science has produced, and the quantity of ink that has been spilled to answer the question, that neither side yields an inch?  Indeed, how can there still be sides?

We can express the debate roughly as follows:
Does there exist in every human being, underlying the behavior that is shaped by the social influences around him/her, an inborn predisposition that education might modify, but cannot really change? Is the view that denies this, and asserts that we are born as a tabula rasa, totally molded by life experience, asserted by people who have not recognized the obvious fact that we are not born with blank faces—or who have never compared two infants of even just a few days old, and observed that those infants are not born with blank temperaments that can be molded at will? Are there, varying essentially open-endedly among individuals, genomic causes of behavior that cannot be over-ridden by a person’s life experience? Is experience ever the key to understanding a person’s behavior, or can genomic knowledge predict the nature of each person?
The prevailing view in our current age, in love as it is with technology and genomics, the Nature view, invokes Darwinian evolutionary theory as its justification for genetic determinism, and asserts that if we do enough genome sequencing, individual behaviors will be predictable from birth, much as the biomedical community argues that disease risk will be predictable from birth.  In anthropology, this view goes by the rubric ‘Human Biodiversity’, the assertion that it is the bio component that drives our diversity, as opposed to our culture and environment.  This is extended to the argument that Darwinian theory implies that different ‘races’ necessarily differ in their inherent skills, temperaments and the like, just as they do in their skin color and hair form.

Opposed to that is the Nurture view that argues that some physical traits and disease predispositions have genetic components, but that our basic nature and behavior are molded by our social, political, and economic contexts.  This view argues that measures of value and achievement, and behaviors leading to economic or family success or to addiction and antisocial behavior, are social constructs rather than reflections of hard-wired natures.

There is nothing new here, and in reaction to the eugenics movement and the Nazi horrors, which were rationalized in terms of Darwinian inherency, there was a strong turn towards the so-called tabula rasa view, that early experience makes us what we are.  This was how psychology was generally taught in the post-war generations.

Both views can be traced back to precursors, including Freud and many others for the Nurture side and of course Darwin and his relative Francis Galton on the Nature side.  However, we are after all supposedly in the Age of Science, and it is a bit surprising that we have not moved very far at all with respect to the Nature-Nurture question.

Another way to say it
In fact, that lack of progress, despite enormous investment in supposed scientific research to answer the question, explicitly or implicitly, is not hard to document.  And the documentation goes beyond the ivory labs of scientists, to the broader public, where it has been of interest for more than a century.  Here is a way to express the same question, which I have just chanced upon:
Does there exist in every human being, beneath that outward and visible character which is shaped into form by the social influences surrounding us, an inward, invisible disposition, which is part of ourselves, which education may indirectly modify, but can never hope to change? Is the philosophy which denies this and asserts that we are born with dispositions like blank sheets of paper a philosophy which has failed to remark that we are not born with blank faces—a philosophy which has never compared together two infants of a few days old, and has never observed that those infants are not born with blank tempers for mothers and nurses to fill up at will? Are there, infinitely varying with each individual, inbred forces of Good and Evil in all of us, deep down below the reach of mortal encouragement and mortal repression—hidden Good and hidden Evil, both alike at the mercy of the liberating opportunity and the sufficient temptation? Within these earthly limits, is earthly Circumstance ever the key; and can no human vigilance warn us beforehand of the forces imprisoned in ourselves which that key may unlock?
Where does this somewhat stilted form of the issue come from?  It's from Wilkie Collins' book No Name, published in serial form in Charles Dickens' journal All The Year Round, 1862-3, just 3 years after Origin of Species (but with no reference to it, and Collins was not involved in science discussions at the time).  That was about 150 years, or six generations of science, ago!



No Name is a neglected masterpiece, a compelling novel about social conditions in England at the time (related to rules of inheritance, marriage, and illegitimacy).  It was not a commentary on science, but this paragraph is an essentially modern expression of the very debate that still goes on in the world generally, and in science specifically.

Where we are today
We have tons more facts, on both sides, but not a whit more nuance in the generally expressed views on the subject.  Indeed, do we have much more actual science, or is each side just enumerating additional data that often barely deserve being called 'facts'?  One major reason for the persistence of no-answers is the inordinately dichotomous world-views of scientists, predilections based on various aspects of their technical specialties but also on their sociopolitical views.  Another reason is simply the tribal nature of humans generally, and in particular of opinions about issues that affect societal policy, such as where to invest public funds in regard to things like education or welfare, how one views social inequality, the origins of proper (socially admired rather than antisocial) behavior, and the like.  We all have our perspectives and our interests, regardless of whether we work as scientists or in other occupations.  In light of this discussion, I probably should be asking whether my world-view and career in science is the result of my genes, or of the fact that there were lots of jobs (and science was more fun and less about 'productivity' and money than it is now) when I got my degree?

Not everyone sees this question in a completely polarized way; some instead propose Nature via Nurture or other acknowledgements of the role of both genes and environment (an obvious truth, since DNA is in itself basically inert).  But if you look carefully you'll almost always be able to detect their political beliefs, and thus their predilection for one side or the other.  They pay lip service to environment but basically want to do a lot of DNA sequencing (or tests for 'epigenetic' changes in DNA), or they want to do sociological studies and minimize the genetic component (or opportunistically say they're testing a given variant in many environments).  We are all being trained as specialists with particular interests, and science in general has been based on an enumerative reductionist approach that is not good at integrative studies of complex interactions.

The bottom line for me is that we should all recognize the uncertain nature of the science, even perhaps that science doesn’t pose its questions in ways that have scientific answers. But we also should recognize that behavior and attitudes towards it affect how society allocates resources, including status and privilege and power hierarchies.  For that reason, scientists should treat the subject with a bit more caution and circumspection—much as we do things like experiments testing whether genetic engineering of viruses could make them more pandemic, and other areas where what someone does in a lab, for curiosity or any other reason, can spill over to have implications for the lives of the rest of us—who are paying the bills for that science.

However, for the same reason that the research affects society, we can’t expect scientists, who are privileged members of that society, to monitor themselves.  And the persistence of the same important questions, from the beginning of the scientific age to the present, should teach a lesson in humility as well.

Friday, September 19, 2014

Faith in science? Industrialized agriculture and antibiotic resistance

Someone asked me the other day on Twitter whether I thought that the words "science" and "belief" were compatible.  I said yes, though I know that a lot of scientists think (...believe...) that faith has nothing to do with science.  Science is facts, faith is religion, based on sacred texts and the like, which are basically hearsay without empirically acceptable evidence.  But, the history of science indicates that this distinction is far from being so simple -- there was a time when people believed that the moon was made of cheese, diseases were caused by bad air, Newton was right about physics, the continents didn't move.  And these beliefs were based on empirical evidence, observation -- dare I say 'facts'? -- not mere guesswork.

In that light, two recent pieces about the role of agriculture in the rise of antibiotic resistance are interesting.  The New York Times described a new study in the Journal of Occupational and Environmental Medicine ("Persistence of livestock-associated antibiotic-resistant Staphylococcus aureus among industrial hog operation workers in North Carolina over 14 days," Nadimpalli et al.) that reports that workers at industrial hog farms can carry antibiotic-resistant bacteria, Staphylococcus aureus, in their nostrils for up to four days.
Twenty-two workers provided 327 samples. S. aureus carriage end points did not change with time away from work (mean 49 h; range greater than 0-96 h). Ten workers were persistent and six were intermittent carriers of livestock-associated S. aureus. Six workers were persistent and three intermittent carriers of livestock-associated multidrug-resistant S. aureus. One worker persistently carried livestock-associated methicillin-resistant S. aureus. Six workers were non-carriers of livestock-associated S. aureus. Eighty-two per cent of livestock-associated S. aureus demonstrated resistance to tetracycline. A majority of livestock-associated S. aureus isolates (n=169) were CC398 (68%) while 31% were CC9. No CC398 and one CC9 isolate was detected among scn-positive isolates.
As the NYT piece notes, eighty-six percent of this sample of hog farm workers carried the bacteria for at least 24 hours, compared with about one-third of the non-farm-worker population.

This is a problem because the resistant variety of S. aureus, MRSA, has made its way into hospitals and is responsible for thousands of deaths.  Further, many people believe that industrial farming is the cause of much of the antibiotic resistance that is now becoming such a problem, because animals are fed antibiotics to speed their growth, and many of those antibiotics are used to treat human diseases.  Indeed, the majority of the antibiotics used in the industrialized world are given to animals.  When bacteria on the farm become resistant to antibiotics, as this study shows, they don't necessarily stay on the farm.  How they spread has been difficult to document, but might include consumption of contaminated meat, and Nadimpalli et al. report another pathway.

Hog farm; Wikipedia
Responding to the increase in antibiotic resistance that many believe industrial farming to be responsible for, the US Food and Drug Administration this year announced a voluntary ban on the use of antibiotics for growth promotion.  Critics saw this as a weak response to a very large problem, but pharmaceutical companies and some farmers say it will do what it is meant to do: reduce the use of antibiotics for non-medical purposes, and thus reduce the possible evolution of resistant bacteria that are harmful to humans.  Of course, one always has to ask the political question of who wields the power and influence over any sort of decision that may affect a particular industry.

But much of this is controversial. Is agricultural use of antibiotics in fact to blame for the problem, or is it overuse of antibiotics by the medical system?  Indeed, there's less of a problem in, say, Scandinavian countries where for decades physicians have prescribed antibiotics at a much lower rate than they have done in the US. Do resistant bacteria really spread in considerable numbers from farm to city?  This may be less controversial with the publication of the Nadimpalli et al. paper, but critics will say that the sample size was small and anyway, documenting a mechanism doesn't mean this is what has happened.

We all tend to pick and choose facts to support our convictions.  Indeed, if you look at how scientists, in any field, cling to their explanations, 'convictions' is perhaps a muted term for what is being clung to.  How we think about these questions may well reflect what we believe more generally about the food system, how or even whether animals should be farmed for meat, whether we patronize farmers' markets rather than buying industrially produced food, and so forth, rather than what we, or anyone, actually know about the causes of antibiotic resistance.  That is, our personal sociopolitical positions seem clearly to be correlated with, if not strongly influencing, our scientific positions.

Yesterday, an opinion piece by Iowa veterinarian and pig farmer Howard Hill appeared in our local paper, and in papers around the country.  Hill believes that farmers are being unfairly blamed for antibiotic resistance in humans.
...the claim that "70 to 80 percent of all antibiotics sold in the United States each year are used in livestock" is a straw man. More than a third of those drugs aren't used in human medicine, another third are not considered highly important to human medicine, and most of them aren't used for growth promotion. Critics also ignore the fact that there are a lot more cows, pigs and chickens than people. In 2011, for example, 30 million pounds of antibiotics were sold for use in more than 3 billion livestock and poultry, compared with 7 million pounds for 311 million people, meaning each person used nearly five times more antibiotics than were used in each food animal.
Is he making selective use of the data?  Yes, but isn't everyone who talks about this issue?  And does that make our assertions wrong?  Doesn't prior belief influence our understanding of what the data show?

While Rome burns
President Obama yesterday issued an executive order aimed at combating antibiotic resistance.  The order accepts that industrial agriculture may have a role in increasing resistance, but it adds little to the FDA order of several months ago:
The Food and Drug Administration (FDA) in HHS, in coordination with the Department of Agriculture (USDA), shall continue taking steps to eliminate the use of medically important classes of antibiotics for growth promotion purposes in food-producing animals.
Not many teeth here.  Years ago Europe took much the same approach, requiring that the use of antibiotics for growth promotion be reduced, but a lot of reclassification of antibiotic use as 'medical' followed, as many expect to happen in the US following the FDA announcement last December, which we blogged about here, and again with this Executive Order.

Again in Scandinavia, the use of antibiotics for growth promotion has been banned, beginning in Sweden in 1986, but farmers have not suffered.  According to a piece in the BCMJ in 2011:
In 1986, Sweden became the first country to regulate the withdrawal of antibiotics used in food animal production. By 2009, Swedish sales of antibiotics for use in agriculture were reduced from an average of 45 tons of active substance to 15 tons. Sweden was followed by Denmark, the United Kingdom, and the Netherlands. 
Danish swine and poultry production continued to flourish with gradual reductions of antibiotic use beginning in 1992 and continuing to 2008 (latest data). During this time, Danish farmers increased swine production by 47% while reducing antimicrobial use by 51%. As well, poultry production increased slightly while reducing antimicrobial use by 90%. Denmark remains one of the largest pork ex­porters in the world.
So, whether or not growth promoting antibiotic use in animals is a major cause of resistance is not really an issue, and we needn't even continue to have the discussion.  If there is any chance it is, why not ban it entirely?  Experience in Scandinavia suggests there won't be dire economic consequences -- unless you're a pharmaceutical company making antibiotics for animals.

Faith in science
We have often written here about the economic interests that drive the course of Big Science.  Can we have faith in science if science itself runs on a considerable amount of faith?  People are, after all, only human, and people of all faiths, including science, defend their faiths.  Further, it's often impossible to disentangle belief from vested interest.  If you've got a hammer, or a hammer to sell, everything looks like a nail.

Thursday, September 18, 2014

Malaria control now may not foretell the future

There are hailstorms, landslides, droughts, malaria and...the State. These are inescapable evils: such there always have been and there always will be.
                     Carlo Levi, Christ Stopped at Eboli: The Story of a Year, 1945
Malaria was once endemic in southern Europe, the UK, and the Americas.  It was in Greece at least 4000 years ago, and reached the Americas with or shortly after Columbus, eventually becoming an 'inescapable evil' when the slave trade was at its height.  With mosquito control due to DDT, the cleaning up of standing water, and other measures, the disease was essentially eliminated in the US and Canada and some parts of South America by 1950.  But the factors that maintain malaria in a population are complex, and it's possible that with climate change, the global transfer of goods, and increasing immigration from endemic areas, this and other mosquito-borne diseases could return.

A recent episode of the BBC Radio 4 program, Costing the Earth, discussed the increasing spread of a number of mosquito species in the UK and throughout Europe.  In particular, the Asian tiger mosquito (Aedes albopictus), a voracious biter, has been spreading north from the Mediterranean for several decades.  It doesn't carry malaria, but it is a vector for other significant pathogens, including Yellow fever virus, Chikungunya virus, and the nematodes that cause filariasis.

Asian tiger mosquito; Wikipedia
The Costing the Earth episode pointed out that the mosquito is spreading for multiple reasons, perhaps a perfect storm of causation: temperatures are warming; sustained drought in the UK has meant that more and more people are collecting rain in buckets to water their gardens, and these standing pots of water have turned out to be a great reservoir for mosquito breeding; sustained wetland restoration is happening in the country; and tires shipped into the UK are frequently full of stagnant water, and thus mosquitoes, so a number of species are entering the country via that route.  And so on.  But mosquitoes that can carry and transmit malaria still live in areas where malaria is no longer endemic, including the UK and the US.  So, it's not the absence of the vector that explains the absence of disease.

The dynamics of epidemics are well understood in mathematical, ecological, and cultural detail (Aeon magazine has just published an excellent and accessible description of the mathematics of epidemics).  In the case of malaria, generally speaking, to maintain the disease in a population: the population must be above a certain size; there must be a reservoir of infected individuals for mosquitoes to feed on; the mosquito population must be large enough to transmit the parasite to enough susceptible individuals; the fatality rate can't be so high that the parasite is quickly eliminated by the deaths of its hosts; environmental conditions must be favorable to the vector; and so on.  Cultural factors must be favorable as well, so that, e.g., mosquitoes can find people to feed on or to infect at the right time of day or night.  That is, multiple factors are required to maintain the disease in a population, and the infection can often be halted by interfering with any one of them.
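
To make the 'multiple factors' point concrete, here is a minimal sketch in Python of the classic Ross-Macdonald expression for the basic reproduction number, R0, of a mosquito-borne infection.  This is our illustration, not anything from the Aeon piece, and the parameter values are invented purely to show the shape of the argument: the factors multiply, so pushing any single one of them down far enough can drive R0 below the self-sustaining threshold of 1.

import math

def r0_ross_macdonald(m, a, b, c, g, n, r):
    """Ross-Macdonald basic reproduction number for a vector-borne infection.

    m -- mosquitoes per human
    a -- human-biting rate per mosquito per day (squared below, because each
         transmission cycle requires two bites: human-to-mosquito and back again)
    b -- probability that an infectious bite infects a human
    c -- probability that a mosquito biting an infectious human becomes infected
    g -- daily mosquito mortality rate
    n -- extrinsic incubation period in the mosquito, in days
    r -- daily human recovery rate
    """
    return (m * a ** 2 * b * c * math.exp(-g * n)) / (g * r)

# Purely illustrative, invented parameter values.
baseline = dict(m=10, a=0.3, b=0.3, c=0.3, g=0.12, n=10, r=0.01)
print(r0_ross_macdonald(**baseline))                 # ~20: transmission is sustained

# Vector control: halving mosquito density per human halves R0...
print(r0_ross_macdonald(**{**baseline, "m": 5}))     # ~10
# ...as does halving any other factor that enters the product linearly (b, c, or 1/r).
# Raising mosquito mortality g does even better, because g appears both in the
# denominator and in the exponential survival term.
print(r0_ross_macdonald(**{**baseline, "g": 0.24}))  # ~3

None of this models any real place, of course; it's only meant to show why breaking any single link in the chain can be enough to stop sustained transmission.
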

So, eliminating the vector -- the mosquitoes that carry the malarial parasite -- is one approach to eliminating malaria, and, like anything that breaks the chain of infection, vector control can successfully control the disease.  But it's not required.  Reducing the prevalence of the disease to a level at which the ratio of infected hosts to vectors is no longer sufficient to transmit it widely, as was done in the US and Europe in the 1940s, is another way.  So we in non-endemic areas of the world live with, and are bitten by, potential vectors with no fear of malaria infection -- though West Nile virus and Chikungunya are now another story.  Indeed, the map below shows the distribution of vectors or potential vectors around the globe.

            Global Distribution of Dominant or Potentially Important Malaria Vectors

From Kiszewski et al., 2004, American Journal of Tropical Medicine and Hygiene 70(5): 486-498; via the CDC.
Thus, where malaria has been eliminated, continued control depends on there being too few infected individuals to sustain transmission.  Of course, this could change.  Southern Europe may be seeing the reintroduction of malaria, for example -- the disease has been spreading in Greece for the first time in four decades.  In this case it can be blamed on the economic crisis and austerity, because the government has not been able to pay for mosquito spraying; but the spread of malaria can't be blamed on mosquitoes alone -- they've got to have an increased number of infected people to feed on as well.  And it's not just malaria: other heretofore neglected, unexpected tropical diseases are reaching the shores of the US and Europe, Chikungunya being one example.

We live in a dynamic, changing, interconnected world.  Malaria rates seem to have been declining in Africa and Asia, research into better prevention and control is ongoing, and researchers can claim many successes.  But, like the proverbial flutter of the butterfly wing that unpredictably causes climatic chaos far away, if acts as innocuous as saving rainwater in the UK can have widespread and unpredicted effects -- such as increasing the spread of previously absent mosquito species, and thus potentially of disease -- then it's hard to predict with any reliability that malaria will remain controlled in currently non-endemic areas.  This is not to be alarmist, but simply to point out that it's a bit hubristic to believe we can predict anything so complex, particularly when it requires predicting future environments.  We've said the same many times before about predicting complex disease from genes.

Carlo Levi (1902-1975) was an Italian painter, writer, and physician.  Because of his political activism during the fascist era, he was exiled to a small southern town in Lucania, where he spent his time painting, writing, and attending to the medical needs of the inhabitants.  He wrote about this period in Christ Stopped at Eboli, published in 1945.  It's a fascinating story: political, ethnographic, scientific (or quasi so).  Malaria was a fact of life in southern Italy at the time, and Levi mentions it often in the book, including in this passage:
In this region malaria is a scourge of truly alarming proportions; it spares no one and when it is not properly cared for it can last a lifetime.  Productive capacity is lowered, the race is weakened, the savings of the poor are devoured; the result is a poverty so dismal and abject that it amounts to slavery without hope of emancipation.  Malaria arises from the impoverishment of the deforested clayey land, from neglected water, and inefficient tilling of the soil; in its turn it generates in a vicious circle the poverty of the peasants.  Public works on a large scale are necessary to uproot it.  The four main rivers of Lucania ... besides a host of lesser streams, should be dammed up; trees should be planted on the mountainsides; good doctors, hospitals, visiting nurses, medicines, and preventive measures should be made available to all.  Even improvements on a limited scale would have some effect...  But a general apathy prevails and the peasants continue to sicken and die.  
Levi may not have entirely understood the cause of malaria, but he clearly understood the vicious cycle of malaria and poverty, which he witnessed every day, all around him, in exile.  As Dan Parker has written here on MT numerous times, economic development itself may be one of the best preventives we know.  But it may not always be enough.  It doesn't much matter which link in the chain of infection is broken; but if that link is ever repaired, we may have to figure out new ways to break the chain again.

Wednesday, September 17, 2014

Antibiotic resistance: Move the money (and control it)!

The BBC Radio 4 program Discovery ran a two-part series (August 18 and 25) on the real health danger that we face and the research challenge it presents.  No, not Big Data mapping of diabetes or cancer, or of athletic ability or intelligence.  Instead, the programs were about an impending biomedical disaster, one that makes much of what we are currently throwing resources at look trivial: antibiotic resistance.

Widespread antibiotic resistance seems imminent, or at least inevitable, both for the treatment of infections in hospital patients and for the control of transmissible diseases.  This isn't very speculative: some strains of infectious bacteria are already resistant to multiple antibiotics -- and these are often contracted by hospital patients admitted for some non-infectious reason -- and some infections are not responding to antibiotics at all.

If we no longer have effective antibiotics, of course, the simplest infection can again become life-threatening; surgery, chemotherapy, kidney dialysis, even an ear infection will become risky again; and infectious diseases will once more be the killers in the rich world that they used to be.

The antibiotic Novamoxin; Wikimedia Commons

Pharmaceutical firms simply aren't investing in antibiotic development as needed.  Not surprisingly, the reasons are commercial: an antibiotic that becomes widely used may be profitable, but not nearly as profitable as anticancer agents, recreational drugs like Viagra, or anti-balding cream.  And if an antibiotic is widely used, the price may be low but resistance is sure to evolve; if it's saved for the most dire cases, sales will be low, the cost per course too high to bear, and the profit not enough for the company.

The cost of development and testing, and the limited duration of patent exclusivity, present additional obstacles.  So nobody is investing heavily in antibiotic development -- not even governments, which don't have quite the same greedy commercial motive for what they do.

The Ebola epidemic is another biomedical disaster that has caught international medical care unprepared.  Ebola is a virus, and there is basically no proven antiviral agent for it; one with some apparent effectiveness seems to be in the works, and there are other stop-gap strategies, but nothing systematic.  Still, the problems, dangers, and challenges are analogous to those in the fight against pathogenic bacteria.  Indeed, lately there's been discussion of the possibility--or inevitability?--that Ebola will evolve the ability to be transmitted through the air rather than only by physical contact with infected persons.  This is, of course, a repeat of the story of SARS, MERS, and other emerging infectious diseases, and surely not the last.

So the question for potential investigators or those worried about the looming disasters becomes: where is the money to solve these problems going to come from?  The answer isn't wholly simple, but it isn't wholly top secret either.

Move the money!
Developed countries are spending countless coinage on the many chronic, often late-onset diseases that have fed the research establishment for the past generation or so.  These are the playgrounds of genomics and other 'omics approaches, and more and more resources are being claimed by advocates of huge, long-term, exhaustive, enumerative 'Big Data' projects--projects that will be costly, hard to stop, and almost certain to yield low cost-benefit ratios and diminishing returns.

We already know this basic truth from experience.  Worse, in our view, many or even most of these disorders have shown major secular trends in recent decades, which means they are due to environmental and behavioral patterns and exposures, not to inherent genetic or related 'omic ones.  They do not demand costly technical research.  Changing behavior or exposures is a real challenge, but it has been met in various important instances: iodized salt, fluoridated water, the campaign against smoking, urban pollution controls, seat belts and air bags, cycle helmets, and much else.  It doesn't require fancy laboratories.

Unfortunately, if we continue to let moneyed interests drive the health care system, we may not get antibiotic development.  The profit motive, evil enough in itself, apparently isn't sufficient here, and some of the reasons are even understandable.  So we have to do it as a society, for societal good rather than for profit.  If funds are tight and we can't have everything, then we should put the funds we have where the major risks are, and that is not in late-onset, avoidable, or deferrable diseases.

Let's not forget that the reason we have those diseases is that we have enjoyed a century or so of hygiene, antibiotics, and vaccines.  The 'old man's friend', pneumonia, and infections like it were kept at bay (here, at least; the developing world didn't get the kind of attention we pay to ourselves).  But if we dawdle because we're so enamored of high-tech glamour and the sales pitches of the university community (and the for-profit pharmas), and because of our perfectly natural fear of the complex degenerative diseases, we could be making a devil's bargain.

Instead, what we need to do is move the funds from those diseases to the major, globally connected problem of infectious disease (and to the similar problem of combating the evolving pests and infections that affect our food crops and animals).  We need a major shift in our investment.  Indeed, quite unlike the current Big Data approach, combating infectious diseases has a potentially quick, identifiable, major payoff.  Some actual bang for the buck.  We'll also need somehow to circumvent the profit and short-term greed side of things.  Of course, shutting down some labs will cost jobs and have other economic repercussions; but the shift will open up others, so the job-protection argument is a vacuous one.

"What?!" some might say, "Move funds from my nice shiny omics Big Data lab to work on problems in poverty-stricken places where only missionaries want to go?" Well, no, not even that.  If plagues return, it won't matter who you are or where you live, or if you have or might get cancer or diabetes or dementia when you get old, or if you've got engineered designer genes for being a scientist or athlete.

The battle to control infectious diseases is one for the ages--and for all ages of people.  It may well be urgent.  It will require resources to win, if 'winning' is even possible.  But we have the resources, and we know where they are.

Tuesday, September 16, 2014

Akenfield, and lessons for now-age sustainability movements?

In the 1960s I was stationed as an Air Force weather officer in eastern England (Suffolk, or East Anglia).  I had my off-base lodgings in the intellectual town of Aldeburgh, on the shingle beach of the North Sea coast.  Aldeburgh is a North Sea fishing town, but more notably the home of the distinguished composer Benjamin Britten, and it was a long-term home or stopping-place for many notable artists, writers, and scholars in the early 20th century.  But Aldeburgh is something of an exception: East Anglia is basically a wetlands rural agricultural area--scenic if you are just passing through, but a place of farming business if you live there.

Aldeburgh village and beach (Wikipedia)

In 1969 the author Ronald Blythe published Akenfield: Portrait of an English Village, a book of reminiscences by Suffolk folk of various ages and professions.  That was when I was living there, but I didn't learn of the book until recently.  Akenfield is a fictitious name for a village, but the book's stories, told by the locals, are real.  It is an evocative book, capturing the mood--and change--of an English village's way of life, as seen by people of all ages and occupations.  In your mind's eye you can see the farmers, shepherds, smiths and so on plying their trades, and almost hear the birds and the livestock.

Those familiar with Wendell Berry's writing about American farm life, or Aldo Leopold's on Nature and the American landscape--largely about the Midwest a half-century or more ago--will find in Akenfield a similar mix: nostalgia from the old-timers, commitment from forward-looking younger people, deep love of and dedication to the land, and yet a recognition of the harsh realities of the onset of industrial farming and of the young leaving the land for better-paying jobs in urban trades and factories.

Suffolk farm by Edward Seago, 1930s (Wiki images)
Tractors replaced horse-drawn plows, and machines replaced the many farm laborers.  Posh landowners have been replaced by more business-like owners.  Produce and livestock are processed through the landscape on a rapid, no-nonsense (and generally unsentimental) scale, unlike the mixed, small-scale, less commercial farming that had come before.  At the time, the villagers largely located themselves in relation to the two World Wars that had so affected England: their roles in the wars, rationing and hardship, and so on.

That was then....and still is, now
By the 1960s, large-scale business farming had taken root.  Many of the issues discussed by the Akenfielders would sound the same today: animals treated in ways that, for humans or pets, would be considered horridly inhumane; people driven off the land by machinery; generalists or money people replacing skilled craftsmen; and a new, rough treatment of the land compared to the mixed-crop, smaller-scale farming of the past.  Chicken and hog farms had already become jails whose inhabitants might never see the light of day in their short, measured lifespans.

1969 was nearly 50 years ago!  In Akenfield in the '60s there were a few who clung to the older ways, who loved the land and refused to leave it, whose needs were simple and whose commitment was great.  This was not for political reasons, but for local, traditional ones.  I can't say much about how things may have changed in East Anglia since then, except that my last (also nostalgic) trip through there to Aldeburgh was in 2006, and the hog lots one passed were large.  No rustic, slow-paced life!

These musings strike me as relevant to much that is happening today.  Industrial, now genomics-driven, agriculture is dominating--and many will say devastating--not only the nature of agricultural life but also the land itself.  Soil is being lost, monocropping risks major pest devastation, and large farms have become huge impersonal businesses.  And of course livestock practices are every bit as inhumane as they ever were.  The argument now, as then, is that more is being produced to feed more people (and there are now a lot more people in the world to feed).

At the same time, some are trying to raise the alarm about what may happen if this continues.  Under the banner of 'sustainability', people are attempting to organize resistance to the Monsantification of the land, as one might put it.  There are small farmers selling humanely raised, local, often organic products.  There are those trying to use the land in a long-term, sustainable way.

Is it pushing the analogy too far to liken these scattered and often struggling movements to those who held on to traditional life a half-century ago?  Those people passed from the scene (as did the communes, the 'small is beautiful' ethos, and similar movements of the '70s protest era).  Will the current movements flourish, or are they, like the trades of old, destined to pass into history?  And if they do pass, will the industrial model sustain life, or destroy it?