Wednesday, March 31, 2010

Reductionism, part III: Self-reinforcing idealization in science

We've described the history of genetics as a highly empirical research program that has produced an unprecedented century of fundamental new knowledge about genetic causation. But we also pointed out that this has been achieved by clearly intentional tunneling through what was known from the beginning to be a greater truth.

Even today, the same logic used by Mendel, and the same intentional tunneling, is the basis of genetic research. It is a form of self-reinforcing idealization. The method is strengthened by its discoveries, and the tunneling reinforced to keep out 'extraneous' aspects of the truth, even if they may be important to an overall understanding. That's why there are problems using this method to understand the genetic basis of complex traits.

Complex traits are the result of interactions among many functional aspects of DNA, in a highly orchestrated pattern of gene usage in space and time, as an embryo develops from a fertilized cell to an adult. This is intimately related to evolution since different species get that way by developing differences in their structures. The understanding of the latter is a field called EvoDevo.

Geneticists from Mendel to today have focused on clear-cut, transmissible genetic effects, stereotyping what genes are and what they do, to get a handle on various questions about genetic function. For their part, developmental biologists have done something similar. They have divided the continuously changing development of an organism into various stages, treating them as if, like genetic effects, they were distinct, discrete entities.

This is very similar to the stereotyping of genes and genetic effects that has characterized genetic research from Mendel to Morgan to today. Development and EvoDevo use similar tunneling through reality to try to understand the genetic basis for the origin of complex structures. This ignores the obvious and well-recognized fact that stages and structures are not independent. Evolution shows this in many ways, as there are developmental differences among closely related species that are brought about by different genetic mechanisms (or their timing).

We were led to relate the two kinds of tunneling through reality by conversations Ken had with Alan Love on his recent visit to the University of Minnesota. Alan, both alone and in collaboration with Rudy Raff of Indiana University, has written interesting papers on this subject, and on the way that this kind of tunneling (our term for it, not his) has both helped and restricted the way we view development and its evolution.

Science may work today only because of self-reinforcing idealization--the ability to tunnel through complexity by focusing on simple aspects of more complex wholes. Whether a new thought process may eventually be called for, to broaden the tunnels, or whether tunneling is the best approach to understanding Nature, isn't clear. But it is clear that until a better way comes along, we will continue to tunnel.

Tuesday, March 30, 2010

Reductionism, part II -- The Tunnel of Love

Choosing to wear blinkers
Yesterday, we suggested that even the early geneticists were well aware of the multifactorial causation of traits, and asked how the 'gene for' thinking that has driven so much recent research, largely without satisfying results, came to predominate as it does currently. Today we suggest that science restricts its view intentionally, as a pragmatic way of discovering the nature of aspects of Nature. Indeed, the early geneticists did the same. And we point out that the price we pay is the way scientific methods restrict the degree to which we understand things more broadly.

'Gene-for' thinking is pragmatic -- understanding the molecular basis of genes and how they work is easier than understanding, say, polygenic interaction or the effect of the environment on development.  And of course great progress was made in molecular genetics throughout the 20th century, which only reinforced the view that the molecule was the thing. This, coupled with formal population genetics, gave researchers rules (how genes segregate, how DNA codes for proteins, and so forth) for cataloging how genes work, giving the field a theoretical framework within which to plan, execute and interpret experiments.

Discoveries in other fields sometimes reinforced the determinist view as well. Not long after Beadle and Tatum proposed the 'one gene, one enzyme' dictum in 1941, for example, the coming of the computer age underscored the view of genes as the program or blueprint for life, an appealing and seductive idea that has yet to die. Even if a blueprint needs an architect, and a foreman to supervise the building.

In fact, the idea that the early geneticists had a broader view is only partially true. That they did is well-documented, as we wrote yesterday, but it wasn't ever really put into practice. Then as now, experiments were conducted in a way that enabled these guys to find single genes that 'caused' the traits they were interested in; environment was controlled, and fruit fly lines homozygous for a trait known to be due to the effects of a single gene were crossed so that the effect of a given gene could be assessed, just as Mendel had done with his pea plants.

This is how Morgan mapped genes. And, why those genes were given names like 'hairy wing', 'small eye', 'small-wing', 'vermilion', as though they were the single cause of or were 'for' these traits, even though Morgan knew full well that wing characteristics or eye color were due to many genes. Indeed, he wrote in The Theory of the Gene, "... it may appear the one gene alone has produced this effect. In a strictly causal sense this is true, but the effect is produced only in conjunction with all the other genes."

The question thus becomes a philosophical one about causation. Philosopher of science Ken Waters has written a nice paper about this*, discussing the difference between 'potential' and 'actual difference makers' and how experimental method determines which is found, while prior assumptions determine which are sought. Although Morgan knew that it took many genes to change eye color in flies -- potential difference makers, in Waters' terminology -- the gene that actually changed eye color in his experiments, a direct consequence of the way he conducted them, was the 'vermilion' gene. The actual difference maker. The foundation for gene-for thinking was well-established right from the beginning, and reinforced all along the way.

And, of course, the idea that some of the early eugenicists may have understood that environmental influences could be important in development didn't prevent the Nazis from making life and death decisions based on heredity.

'Gene-for' fervor takes off -- and people actually believe it
After the discovery of the gene for cystic fibrosis in the late 1980s, genes for more than 6000 single-gene disorders were quickly identified. These are largely rare, pediatric diseases, but even so there seemed to be little reason to assume that geneticists wouldn't continue finding genes for disease, and then even for behavior and other kinds of 'normal' traits -- even though an important aspect of pediatric disorders is that they arise near birth and hence are relatively less susceptible to environmental effects (not entirely, of course, because even the uterine environment can vary).

This kind of success at finding genes associated with traits was seductive, and the commitment to strong genetic determinism is now found not only among geneticists, but among epidemiologists, psychologists, economists, political scientists, and even further afield. Epidemiology, e.g., had its own history of success finding the causes of infectious diseases, as well as the effects of environmental risk factors like asbestos or smoking. But, as with single-gene disorders, when the effect of a risk factor is large, it's a lot easier to find than when there are many cumulative risk factors, some genetic and some environmental. When epidemiology turned to common chronic conditions like heart disease, asthma or diabetes, which generally don't have a single strong cause, it ran into the same kinds of epistemological and methodological difficulties that geneticists were having with these same complex diseases.

Ironically, out of frustration with the difficulty of finding environmental causes for many chronic diseases, epidemiology turned to genetics, and the field of genetic epidemiology quickly grew -- only to be just as stymied in terms of the fraction of cases explainable by known genes. The 'strictly numerical basis' upon which Morgan had identified so many putative genes was no longer good enough, because the counts don't come out in Mendelian terms unless fudge factors are added to account for causes other than the gene under study. (The main such factor, 'incomplete penetrance', is a determinist idea itself, as it imbues the gene with a mystical ability to be more or less expressed.) In practice this usually means the gene accounts for only a small fraction of cases and doesn't come close to Mendelian ratios among siblings, and so on.
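
For readers who like to see the arithmetic, here's a minimal sketch of that fudge factor at work, using our own toy numbers rather than any particular study: for a hypothetical dominant allele transmitted from one carrier parent, the expected fraction of affected offspring is simply half the penetrance, so as penetrance falls the counts stop looking Mendelian at all.

```python
# Toy illustration (hypothetical numbers): how incomplete penetrance
# dilutes the expected Mendelian segregation ratio.

def affected_offspring_fraction(penetrance):
    """Expected fraction of affected offspring from a carrier x non-carrier
    mating, for a dominant allele. Each child inherits the allele with
    probability 1/2 (Mendel's first law) and, if a carrier, is affected
    only with probability `penetrance`.
    """
    return 0.5 * penetrance

for f in (1.0, 0.8, 0.5, 0.2, 0.05):
    print(f"penetrance {f:.2f} -> expected affected offspring {affected_offspring_fraction(f):.0%}")

# Full penetrance gives the classic 1:1 (50%) ratio; at 20% penetrance only
# a tenth of the children are affected, which is exactly the kind of count
# that no longer looks 'Mendelian' without the fudge factor.
```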

And it becomes institutionalized
Then of course when the Human Genome Project was finished, the sequencing factories had to be kept running, so yet more promises were made about what we were going to be able to do with genes, more billions were spent on more classically reductionist 'count only' genetics -- and yet the same problems remain unsolved. We still can't enumerate the genes that are responsible for height, and for exactly the reasons Morgan spelled out in 1926, as we noted yesterday.

In fact, the scientific methods that we use identify genes that, when mutated in some ways, cause serious stature problems (Marfan syndrome makes you very tall; mutations in many genes can make you very short), but when we look at the normal range as seen in a sample of healthy people, these genes do not generate mapping 'hits' (as in GWAS). And this is probably true of most traits -- it's easier to explain the extremes of their distribution than it is to explain the normal range.
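
As a rough illustration of that asymmetry, here is a toy simulation -- entirely made-up effect sizes and frequencies, not real data. A rare mutation of large effect is easy to spot because its carriers pile up in the tail of the distribution, while each of the many common variants that shape the normal range explains only a sliver of the variance, which is why it generates no mapping 'hit'.

```python
import numpy as np

rng = np.random.default_rng(1)
n = 50_000

# Many common variants of tiny effect build the 'normal range' (made-up values).
n_poly = 200
freqs = rng.uniform(0.1, 0.9, n_poly)
genos = rng.binomial(2, freqs, size=(n, n_poly))            # 0/1/2 copies per variant
small_effects = rng.normal(0, 0.2, n_poly)                   # cm per allele copy, invented
height = 170 + (genos - 2 * freqs) @ small_effects + rng.normal(0, 4, n)

# One rare mutation of large effect (a Marfan-like +15 cm, also invented).
carrier = rng.random(n) < 1e-3
height = height + np.where(carrier, 15.0, 0.0)

best_r2 = max(np.corrcoef(genos[:, j], height)[0, 1] ** 2 for j in range(n_poly))
print(f"variance explained by the single best common variant: {best_r2:.3f}")
print(f"mean height, rare-mutation carriers vs others: "
      f"{height[carrier].mean():.1f} vs {height[~carrier].mean():.1f}")
print(f"fraction of carriers above the 99th percentile: "
      f"{(height[carrier] > np.percentile(height, 99)).mean():.2f}")
```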

It's not that genes are unimportant. It's that we are only taking into account part of the truth. This is driven essentially by methodological considerations -- we've got well-developed formal theory for genes and how they segregate in families, and what that means about how to find them. But it is more of a struggle to account in useful ways for complex causation -- useful, at least, in terms of dreams of miracle drugs or genetically focused personalized predictions.

Tunneling through the truth
An important reason for the combination of great success in genetic discovery and relatively great failure to account for complex traits has to do with methodology rather than the state of Nature. As we said yesterday, the focus on fixed, chromosomally localized causal elements -- 'genes' in the classical sense -- was driven by Mendel's careful choice of experimental material, followed by similar constraints employed by Morgan and the other classical geneticists of the early 20th century.

The very same logic and approaches have been followed to this day. Genes are identified as localized causal elements in DNA, and we study and manipulate them through their variation while removing, as much as possible, all other sources of variation. Traits are narrowly defined, transgenic experiments use inbred animals manipulated one gene (or one nucleotide) at a time, and so on. This is done because it generates a cause-effect situation that is tractable.

That approach, or research program, led to the steady discovery of the nature of DNA, of genes as protein codes, and so on, up to the mapping of entire genomes of a rapidly growing number of species.

Yet at the same time, when it comes to complex traits, we know that we are not discovering the whole truth -- and we know why. It is the same control of variation that led to discovery, that leads to obscuring the whole nature of Nature.

In a sense, what science does is to 'tunnel' through reality. Like any other tunnel, the walls are reinforced to keep things outside the tunnel out, and to make a clear path within. The path is a particular gene we are interested in, and we manipulate that gene, and its variation, treating it as a cause, to see what effects it has. We know it really interacts with the world outside, but we standardize that world as much as we can, to reveal only the effect of variation in the single cause.

This is perhaps a tunnel of love of experimental design, but not so much of the nature of Nature, because by particularizing findings, even on a large scale, we systematically isolate components from each other whose true essence, and origins, are intimately dependent on their interactions.

Maybe tunneling through truth is the only way science can understand the world. From the point of view of garnering facts, and manipulating the world by manipulating those same facts, science is a huge success. But in terms of understanding Nature, maybe we need a different way. If so, as long as the reductionist legacy of the 300-year-old Enlightenment lasts, we will remain the Tunneling species.

Tomorrow we'll discuss how the same kind of thinking has worked in developmental genetics and the EvoDevo world of research.

-------------------------
*The Journal of Philosophy is only available online to members, but the reference is: Waters, "Causes That Make a Difference", The Journal of Philosophy 104: 551-579, 2007.

Monday, March 29, 2010

Genetic reductionism, part I -- the path from broad to narrow?

What geneticists used to know
We've been reading the writings of some of the early geneticists, and have been struck by how many of the original concepts and how much of the jargon are still in use even after a century of major discoveries. Even many of the names TH Morgan gave to fruit fly genes in the 1910s and 20s, without knowing anything about the structure of genes and how they work, are still in use today. Their names, in fact, are a major reflection of the way that science actually works.

The figure to the left, a map of the four fruit fly chromosomes, with gene names, is from Morgan's book, The Theory of the Gene, published in 1926. In that book, he outlined his theory as follows:
The theory states that the characters of the individual are referable to paired elements (genes) in the germinal material that are held together in a definite number of linkage groups; it states that the members of each pair of genes separate when the germ-cells mature in accordance with Mendel's first law, and in consequence each germ-cell comes to contain one set only; it states that the members belonging to different linkage groups assort independently in accordance with Mendel's second law; it states that an orderly interchange--crossing-over--also takes place, at times, between the elements in corresponding linkage groups; and it states that the frequency of crossing-over furnishes evidence of the linear order of the elements in each linkage group and of the relative position of the elements with respect to each other.
Note that his theory of the gene rests almost entirely on the work of Mendel, fifty years before. By good luck (given what was known at the time), Mendel studied traits that were not closely located ('linked') on the same chromosome, so that Morgan's group was working with and expanding on, rather than testing or challenging, Mendel's theory.
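
The last clause of that theory -- using the frequency of crossing-over to infer linear order -- is worth a small illustration. The three-point counts below are invented for the purpose, not Morgan's actual data; the logic is simply that the recombination fraction between two loci is the share of recombinant offspring, and the rough additivity of the two shorter fractions is what argues for a linear arrangement, with one locus in the middle.

```python
# Invented three-point counts (not Morgan's data), purely to show the logic.
# Recombination fraction between two loci = recombinant offspring / total scored.

offspring_counts = {
    ("y", "w"): (15, 1000),    # locus pair: (recombinants, total), hypothetical
    ("w", "m"): (300, 1000),
    ("y", "m"): (310, 1000),
}

rec_frac = {pair: r / n for pair, (r, n) in offspring_counts.items()}
for (a, b), rf in rec_frac.items():
    print(f"{a}-{b}: recombination fraction {rf:.3f}")

# The y-m fraction (0.310) is roughly the sum of y-w (0.015) and w-m (0.300),
# which is what you expect if the loci sit in the linear order y -- w -- m,
# with the small shortfall due to double crossovers going uncounted.
```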

Mendel knew that systematic hybridization experiments would elucidate patterns of transmission of the 'elements' that were responsible for traits. But only some traits; others were too complicated, and didn't follow the same patterns. Mendel tried the same approach in other plants and found that some did not follow his rules; but the doubts this raised were overlooked given his overall success. We now know that the traits he avoided did not 'segregate' in the way the traits he chose did, because they are due to the joint contribution of many different genes, known today as polygenes.

Morgan presents his theory of the gene, and then adds the following, and this is important with respect to how they could know as much as they did without understanding much at all about genes:
These principles, which, taken together, I have ventured to call the theory of the gene, enable us to handle problems of genetics on a strictly numerical basis, and allow us to predict, with a great deal of precision, what will occur in any given situation.
By 'strictly numerical', he meant that one need not understand what the genes were, in chemical terms, or how they worked. The rules of inheritance were made manifest through the relative numbers of different types of offspring of a given set of parents -- which Mendel first showed, and others built upon.
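
As a minimal sketch of what prediction 'on a strictly numerical basis' looks like -- a textbook monohybrid cross, not any particular experiment of Mendel's or Morgan's -- one can simply enumerate the gamete combinations from two heterozygous parents and read off the familiar 3:1 phenotype ratio:

```python
from collections import Counter
from itertools import product

def cross(parent1, parent2):
    """Enumerate offspring genotypes of a one-locus cross, e.g. 'Aa' x 'Aa'."""
    return Counter("".join(sorted(pair)) for pair in product(parent1, parent2))

genotypes = cross("Aa", "Aa")
print("genotype counts:", dict(genotypes))        # {'AA': 1, 'Aa': 2, 'aa': 1}

total = sum(genotypes.values())
dominant = sum(n for g, n in genotypes.items() if "A" in g)
print(f"phenotype ratio, dominant:recessive = {dominant}:{total - dominant}")   # 3:1
```

The same bookkeeping, extended to two unlinked loci, gives the 9:3:3:1 dihybrid ratio -- no chemistry required, which was exactly Morgan's point.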

But Morgan didn't suppose that all traits were due to single genes. He was aware that most were due to many genes, and that it was likely that most genes do more than one thing.
A man may be tall because he has long legs, or because he has a long body, or both. Some of the genes may affect all parts, but other genes may affect one region more than another. The result is that the genetic situation is complex and, as yet, not unraveled. (The Theory of the Gene, p 294).
And, these early geneticists also knew about the fundamental contribution of the environment. Morgan ends the paragraph above with this sentence: "Added to this is the probability that the environment may also to some extent affect the end-product."

A student of Morgan's, A.H. Sturtevant, in his A History of Genetics (1965), says:
With Johannsen [who introduced the words "gene", "genotype" and "phenotype" in the early 1900's] it became evident that inherited variations could be slight and environmentally produced ones could be large, and that only experiments could distinguish them.
And:
In 1902 Bateson pointed out that it should be expected that many genes would influence such a character as stature, since it is so obviously dependent on many diverse and separately varying elements. This point of view was implied by Morgan in 1903 (Evolution and Adaptation, p. 277), and by Pearson in 1904.
This is just one of several early examples he offers. Even the most widely used genetics textbook of the 1920s and '30s, a book that in fact helped school the Nazis (and a chilling read today), described variation in traits that was due to environmental factors. That book was Human Heredity, by Baur, Fischer and Lenz, first published in Germany in 1921, and revised a number of times. Two of their examples of traits with environmental contributions are pictured to the left. The pigs in the top photo are from the same litter; the smaller one was poorly nourished, while the larger one was well-fed. As for the rabbits:
Among rabbits, for instance, there are two somewhat similar races, one of which is pure white with pink eyes, while the other (the "Himalayan" rabbit), though mainly white and with pink eyes, has very dark fur on the ears, paws, tail, and nose. The colour of the fur in this latter race is modifiable by temperature. If an area of its skin be kept cool, which can easily be effected simply by shaving a part, all the new hairs that grow upon the cooled area have a dark tint. Fig. 6 shows how a patch which had been thus shaved has been covered with dark hair. But as soon as the fur has regrown, so that the area of skin is now protected by it from the cold, all the new hairs which subsequently grow in the area are white, as before, so that the dark tint of the shaved area gradually disappears. It would be easy enough, by shaving the whole surface of the body, to provide one of these rabbits temporarily with a dark-tinted fur throughout... (pp. 33-34)
So, how did this broad view of genetics become so much narrower as the field matured, to the point that genetic reductionism now predominates to such a great extent? Indeed, the Human Genome Project and its ongoing sequels epitomize this approach, as its supporters promised that knowing our genes would lead directly to disease prediction (and prevention, and even dramatically increased longevity as a result). Whether this was truly believed, or was cynical spin to get funding -- in fact, it was a mixture of both -- even the intellectual forebears of today's geneticists knew it was wrong. Or was the early view never as broad as it seems?

In our next couple of posts we'll have some ideas on this. You may disagree, or have other ideas, and if so, we'd love to hear them.

-----------------------
This post was stimulated by work on a paper about the developmental genetics of complex traits like the mammalian skull, and by interactions Ken had on his recent trip to the University of Minnesota with two philosophers of science, Ken Waters and Alan Love.

Friday, March 26, 2010

Polymeadows Farm hits the big time!

I've written before about Polymeadows Farm, the dairy goat farm run by my sister and her husband, with invaluable assistance from their partner, Hank.  As regular readers know, they've spent this year building a dairy processing plant on the farm, and developing a market for their products.  Yogurt, smoothies, soft cheeses, milk, chocolate milk -- Hank drives all over New England and into upstate New York taking samples to potential buyers, and then delivering their orders.  They now have markets as far away as Albany, NY, Burlington, VT, and Northampton/Amherst, MA.  And a distributor's truck is about to start picking up at the farm (this is a great development, as, sadly, there is only one Hank) to deliver to another 50 stores around New England.  And they are getting rave reviews -- one store sells 40 different brands of yogurt, and reports that Polymeadows goat yogurt is their best seller!  A distinction accomplished in less than a year.


Hank even found a distributor who was interested in selling to markets in New York City!  She took a few orders down to the city, but it wasn't clear to Hank or Jennifer who was doing the buying.....until Jennifer got an email from a customer saying she'd bought some Polymeadows yogurt at Zabar's, in the city, and loved it.  That would be the Zabar's!  Check out the picture!  And if you're in New York, go by and get a taste of the country, brought to you by Jennifer, Melvin, Hank and a very well-tended herd of dairy goats!

Thursday, March 25, 2010

The nature/nurture non-dichotomy

Are we on the verge of a swing back to environmentalism? Otherwise known as the 'nurture' of the 'nature/nurture' dichotomy. Sunday's New York Times Book Review has a piece on a new book by David Shenk, called "The Genius in All of Us: Why Everything You've Been Told About Genetics, Talent, and IQ Is Wrong". We haven't read the book, but the review by Annie Murphy Paul describes it as a book that assures us that we can all be brilliant, if we only work at it. It's not about innate potential anymore, it's about hard work. Talent isn't a thing, she reports, it's a process.

The problem with this, of course, is that there is no nature/nurture dichotomy. It's not either/or, it's both.

Years ago, before technology enabled us to identify genes and their variation with precision, the dogma was that things are mostly environmental -- and many things are set in very early infancy (Freud and all that). Antagonism to genetics was intense, and medical students were discouraged from going into genetics because there was no future in it. The environment-is-all fervor was just as mistaken, and just as ideologically encased, as the genes-are-all fervor is today. Then technology arrived and the pendulum swung all the way in the other direction, where it's been for a while now.

So, if we do swing back in the nurture direction, we'll have to start writing posts about the errors of environmental determinism.

Wednesday, March 24, 2010

A bullet in the head

He was only 35 years old, when he wrote a few cordial notes to friends and family. He carefully folded them, addressed their envelopes, and then called the police to let them know where to find him. Then he put a bullet through his head.

Our young acquaintance suffered from Marfan syndrome. This is a genetic disorder for which, in most instances, the causal gene is known. It leads to disproportionately tall stature, and a variety of health problems. These often affect the heart and its major vessels. They contribute to a life of pain, and the need for repeated major invasive surgery, such as to patch aneurysms in the aorta, the main blood vessel running through the body. At present, this surgery is necessary throughout life--in our friend's case every two years, forever. Marfan victims suffer an unsurprising range of social and adjustment problems as well. There is no known cure.

This is a clearly genetic problem, with focused tissue effects. It should be a suitable target for genetic research if ever there was one. Of course, there are geneticists working on ways to engineer a patient's way out of Marfan syndrome, and hopefully measures can be developed to replace a defective gene in a mutation carrier, though upwards of 30% of cases are due to mutations that arise anew, and need a different approach (such as the one described here, just the kind of application of genetics that we're talking about).

This is what genetic research should be doing, and on a much more intensive scale, for the many diseases that really are genetic. Instead, we have a huge and swelling investment in large association studies of traits that are mainly due to lifestyle factors, and whose fraction of genetic 'cause' is due to tiny effects of variation in hundreds of genes. Geneticizing things that aren't really genetic draws attention, funds, and talent away from the real problems--from research in genetics for things that are truly genetic, and effective prevention for the major, common traits that aren't.

It is instances like our friend's that motivate our criticism of a system that insists on major investment in genetics, for everything and anything, when for most of us genes aren't responsible for our major health problems. If the same funds were to go where they might do much greater good, the traits that really are genetic could be the ones that are actually prevented.

And then fewer people would feel they have to put a bullet through their head.

Tuesday, March 23, 2010

Last Suppers getting bigger? Um....

Here's a story that hits the spot, at least judging by the play it's getting all over the web today. Sophisticated computerized comparisons of the size of the bread relative to Jesus's head in 52 paintings of the Last Supper produced over the last 1000 years show it to be ever expanding. Supersizing is not new!

But, hold on. Even if this study (published in the reputable International Journal of Obesity) were to represent some truth about food quantities over the last 1000 years, what relevance does it have to our recent obesity epidemic that has only taken off in the last 50 years?

Cute, chubby, and doomed?

Cute, cuddly, and ..... catastrophic? That may be the case regarding babies, if the latest hot-news bulletin is to be believed. As early as in utero, babies may become prone to obesity that can have later health consequences. Of course, under normal conditions chubby babies are likely to be resilient, able to call on an energy reserve. But in our current society, according to this particular study, such early chubbiness may predispose not just to later but to lifelong obesity. The issue is important to those who think that eating habits can be set in school. But school may be too late.

There are lots of issues at play here. One, however, is particularly important when it comes to the GWAS religion: the idea that meaningful risk for complex traits can be predicted from genotypes. Part of the evidence is the widespread observation that most traits (and obesity is one) have substantial heritability--that is, substantial resemblance among relatives. That suggests genetic causation.

But if chubby mothers, or mothers' eating habits, affect their fetuses, then not only may this predispose the little darlings to a hefty lifetime, but it could also induce resemblance among relatives. That will be interpreted as being due to genes, when in fact it could be the inheritance of lifestyle effects. There's a big difference. And if this is important, it further weakens the case for GWAS and the relentless pursuit of genes as the important factors to understand in major public health research.
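
To see how easily that can happen, here's a toy simulation with made-up parameters and, by construction, no genetics in it at all: a trait driven only by an eating-habit score that children partly pick up from their mothers still shows a solid parent-offspring correlation, and the standard quantitative-genetic reading (heritability estimated as roughly twice the parent-offspring correlation) would cheerfully call it genetic.

```python
import numpy as np

rng = np.random.default_rng(0)
n_families = 50_000

# A purely non-genetic, BMI-like trait driven by a household habit that
# offspring partly inherit culturally, not through DNA (all values invented).
mother_habit = rng.normal(0, 1, n_families)
child_habit = 0.6 * mother_habit + rng.normal(0, 0.8, n_families)   # cultural transmission

mother_trait = 25 + 2.0 * mother_habit + rng.normal(0, 1.5, n_families)
child_trait = 25 + 2.0 * child_habit + rng.normal(0, 1.5, n_families)

r = np.corrcoef(mother_trait, child_trait)[0, 1]
print(f"parent-offspring correlation: {r:.2f}")
print(f"naive 'heritability' estimate (2 x correlation): {2 * r:.2f}")
# Genes contribute nothing here, yet the naive estimate is far from zero.
```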

Monday, March 22, 2010

Cervical cancer prediction is uncertain even when a true cause is known

The problem with identifying cause, in this case of a disease, often seems to be that the cause identified is only a partial predictor of the effect. That is, it is only one of an assembly of other contributors, mainly not measured or not known.

We've mentioned a couple of publicized examples of testing for prostate and breast cancers. The positive PSA or mammogram identifies a possible risk factor, but with low probability that serious outcomes will eventually arise. A new example is testing for HPV (human papilloma virus) as a causal risk factor for cervical cancer in women, as described in a recent news story. HPV seems to be a genuine cause, but the probability that someone infected with HPV (which is common) will eventually get cervical cancer is small. So is the test worth the cost, and the uncertainty of what to do with a positive result?

There is no easy answer to such questions. But one thing is clear: the question itself is worth taking seriously, as a research problem that is challenging but very important. Such research involves both the actual causal story--the primary science itself--and the clear policy implications: some way to decide what to do with an incomplete understanding of the risk factors at play.

Friday, March 19, 2010

Genetic engineering

We often criticize the excess geneticization of diseases whose main cause is not genetic variation but lifestyle factors of various kinds. But some diseases seem clearly to be genetic, with little environmental input and one or only a few clearly known causal genes. In such cases, genetics is the right approach, and genetics of two kinds--genetic counseling to detect risk in parents who know a serious allele is in their family and are planning to have children, and treatment of the disease when it has arisen.

A good example, described in last Sunday's New York Times, is epidermolysis bullosa. The disease appears to be due to a defect in a collagen gene that gives structural strength and integrity to the skin. The victims have skin described as being as delicate as butterfly wings, and as a result have very compromised lives.

EB seems to be a perfect target for gene therapy--to replace the gene in the germline of parents, or of fertilized eggs in vitro before reimplantation in the mother, or therapeutically to replace the deficient skin cells with cells competent to produce the right type of collagen. Such efforts are described in the article.

Human beings are very good at engineering, and if science is good at anything, it's technology. EB is one of many problems that seem to be engineering rather than conceptual challenges. Science ought to work, in such cases--even if that doesn't mean it can happen overnight. This kind of challenge is where genetic investment should go, in our view.

For complex diseases that are mainly due to environmental or lifestyle factors, if those factors were ameliorated by changes in behavior (like better diet) and other measures (like removal of toxins), then what would remain for most diseases would be the truly genetic instances -- which would fortunately be rare, and fortunately be engineering challenges.

That doesn't mean it'll be easy. We may be good at engineering, with thousands of years of practice behind us, but if there's one thing that organisms have evolved over countless more thousands, it's the ability to detect and prevent the outside world from attacking their cells. So these will be battles waged mano a mano at the molecular scale.

Many techniques already exist to replace genes, engineer vectors to put genes into cells, or make microorganisms (or culturable cells) produce a gene product. The best approach, perhaps, is to engineer stem cells from the affected person, redifferentiated to be of the needed tissue type, and then somehow introduce them into the affected tissue. There's a lot of progress along these lines, but only time will tell if this is the best approach. Whatever turns out to be the case, at least these are clear-cut problems for which technological solutions seem at least possible.

Thursday, March 18, 2010

X-rated science and that Something Special that extends life

Now here's the latest science news story, if you're intellectually prepared for its deeply technical nature.* Guess what? A study in the US found that healthy older men want more sex! (This study applies to older people; younger ones don't need the help of scientific research when it comes to between-the-sheets time.) And guess what else? They're getting it, too! Now this hot-item (news) is published in the British Medical Journal. This is a respected journal but the fact that it was published there rather than in some staid American journal undoubtedly reflects how steamingly hot all of this is.

If you're feeling out of sorts or out of shape (or if SHE thinks your shape is rather out-of), then I guess we have to be stunned to learn--this is real science after all!--that you're spending more time reading or watching Competitive Poker than you'd like, and none of it stud.

Now, there may be a good-news/bad-news problem here. The rigorous results of this study show that the more you spice up your life, the more life you'll have to spice up. Thus, the BMJ story suggests doctors add the beast with two backs, as Shakespeare put it, to the standard two aspirins and bed rest. The study suggests this will lead to longer life (longer in years, at least) if you go home in the randiest of states and startle your unsuspecting partner:

"Tell me what the doctor said, dear?"

"I'll have to show you. But put down your knitting first, or someone could get hurt."

"Eeeek! Stop that!!"

The point here is not to denigrate the importance of sexual health to general health, or of understanding aspects of either. But it is to question whether this kind of study had much chance of saying anything one did not already basically know or, from a practical point of view, of raising any practically useful insights, rather than just correlation coefficients that might be mildly interesting. NIH money supported this study, which used existing questionnaires and telephone surveys, with their well-known reliability and laser-sharp penetration (sorry!) of the truth. It is hard to accept that much hard knowledge could come of yet more federal largesse. Since grants are supposed to be funded based on their priority, we can probably guess what the priorities of the review panel were. Imagining what they've funded probably gave the reviewers years of extra life, and maybe their spouses, too. All of it science in the public interest!

*(We're not making this up! We have seen the actual story because our library subscribes to the online journal, but it's not publicly available yet.)

Wednesday, March 17, 2010

Mooning science


Last year we honored the 150th anniversary of Darwin's Origin of Species (1859), a landmark achievement that changed the direction of science. Technically, we should have celebrated a year earlier, because Darwin and Alfred Wallace deserved joint credit for understanding the nature of evolution in a joint presentation to the Linnean Society in 1858. But never mind, because this year we have a more clearly singular and comparably transformative anniversary to celebrate.

This time it's a 400th anniversary, and the transformation was even more encompassing: it set the stage for all of modern science, even biology.

Do you recognize this picture? It stunned the world!

This was what Galileo saw through his self-made version of a Dutch invention being hawked in Paris as a child's toy: the first telescope. In Galileo's book Sidereus Nuncius (Starry Messenger), he showed that, contrary to the Truth accepted by everyone from Aristotle's time to Galileo's, the Moon was not a perfect sphere, part of the perfect orbs of the universe that were thought to have been created by God as such. The irregular and imperfect Earth was sinful humans' abode, but the celestial orbs were God's perfection.

Galileo's drawings illustrated clear shadows in the sunlight striking the Moon, proving that the surface was irregular, cratered. He also saw many more stars than had been thought to exist. He realized that the Milky Way was not a nebulous cloud, but was made of countless distinct stars. And he found the moons of Jupiter, which would prove to be invaluable to navigation during the great age of sail.

Above all, the intuitively God-given universe was not as had been thought since intellectual time immemorial! The Moon looked like the Earth in the sense of having been produced by similar processes.

This and later work of course got Galileo in deep trouble with the Church, and almost got him burned at the stake. But despite insincerely recanting, and eventually dying in a kind of house arrest, Galileo and even the Popes knew that times had changed. Galileo's poking around with a telescope ushered in the Enlightenment, the era--that we're still in--of empirical, measurement and observation based, methodologically anchored ways to understand the world. Skeptics could no longer get away with appealing to the ancient Greek thinkers, nor to Scripture. The world needed to be understood properly in its own terms.

Galileo did much later work, in which he essentially first showed the need for repeated observation to 'smooth out' statistical and measurement uncertainty. He dealt with gravity, the principles of motion, momentum, and much else--even the nature of relative motion. His books are enjoyable reading, too (they're in dialog form).

Galileo also began the dependence of science on theory, especially theory that could be tested quantitatively. Darwin was a self-confessed innumerate, but his theory works only because its basis, called population genetics, allows us to relate historical (but unobserved) events to present-day data. No theory (and no scripture) can match its convincing power.

So, this is an anniversary to honor!

Tuesday, March 16, 2010

What? No request for more grant money??

Here's a rare science/medicine story that doesn't end with a plea for more money for further study. Indeed, the answer is clear -- but it has been for decades.

"No quick drug fix for diabetes risk", the BBC story is entitled. (There's a different take on this story in the NYTimes, in which Gina Kolata concentrates on potential harm from the drugs under investigation.) A study published in two parts the New England Journal of Medicine (here and here) of 9300 people defined as 'pre-diabetic' (meaning they are beginning to stop responding to insulin) comparing two medications with a placebo showed no difference in the proportion of either group who went on to develop diabetes and subsequent heart disease. "Researchers said the results showed the only way to ensure future health in people at high risk of diabetes was exercise and a healthy diet." And, rather than asking for more money for an even larger study, they call for an increase in money for preventive care.

Why are they sure that prevention is the best cure? We were interested to note that part of the treatment for people in this study in either the group receiving drugs or the group receiving placebo was 'lifestyle intervention'. That is, they asked all subjects to add a 5% reduction in weight and 150 minutes of exercise per week to their drug regimen. Rigorously done, from what is known from other studies about the significant effects of lifestyle changes on risk of obesity and diabetes, this could have made interpretation of the results of drug intervention difficult. But in fact in this study, subjects were followed up only once a year, and, as the accompanying NEJM editorial suggests, lifestyle changes must have been minimal, since, contrary to expectation, risk of diabetes was not significantly lowered in this study.

Ironically, that inference itself assumes what's being tested -- a real no-no in science!, namely, that lifestyle is the cause. That may be right, but it's not right to assume!

And, indeed, the researchers' conclusion that lifestyle modification was more effective at reducing risk of diabetes and cardiovascular disease than the two tested compounds was based on these other studies, not their own -- something they couldn't have shown, given their study design.

So, in fact what we learn from this large and expensive study are things we already knew. Once or twice, such confirmation is a powerful tool for science. But enough repetition is enough!

It is pretty clear, based on what previous studies have shown, that diabetes and its complications are best prevented by diet and exercise.  But, lifestyle changes are easy to explain but hard to implement. We repeat what a colleague once told us; it might well be cheaper to give everyone at risk of diabetes a personal trainer than to continue doing studies that show, in effect, that everyone should have a personal trainer.

But maybe there's progress -- at least the researchers aren't asking for more money to confirm their results.

**Update:  As we've said before (e.g., here), it's looking as though inflammation may be the underlying process behind more and more chronic diseases.  Today's NYTimes reports that this may be true for type 2 diabetes as well.  Study subjects taking a cheap generic anti-inflammatory related to aspirin were able to lower their blood glucose levels far more than those on placebo.  The idea is that obesity may induce inflammation, which then induces insulin resistance. In this case, however, the researchers conclude that "more research is needed."

Monday, March 15, 2010

Satan's Slaves

A Facebook 'friend' has posted a photo of one of those signs you see outside of churches -- and now a cause célèbre on the web -- saying "A Freethinker is Satan's Slave". This friend is very smart and pretty eccentric, and a bunch of his friends are too, and a lot of them are posting comments, slamming the idea that freethinking is a sin. But we could put what they're doing another way -- falling right into line to agree with each other that freethinking is not a sin. And that they don't believe in sin anyway. The tribe of freethinkers -- whose bible is Atlas Shrugged, and whose religion is atheism.

Ok, that's a bit harsh. This guy and his friends bring you some of the smartest, funniest, quirkiest writing, film, and television you know. But, if you put those details aside, it's possible to view this as one side in the religion culture wars, and as such it is relevant to the kinds of things we try to say here -- and try to stay away from. It's more of the same 'freethinking' ideology that you find all over the web, often in blogs that start out as science, and then get derailed (in our view) by the creation/evolution 'debate' into the fervent defense of atheism. These blogs have lots of followers -- but they are no longer doing science. They're advocating another ideology.

And, seen from an even greater distance, it's further evidence of our tribalism, of our need to conform -- to something. Indeed, Facebook itself is evidence of that -- the great potential of the worldwide web to expose us to different people, thoughts, and ways to define ourselves, we now use by and large to reinforce what we already think. And on Facebook we all define ourselves according to the dictates of the software: job, schools, religion, marital status, friends -- what else do you need to know?

Of course science is tribal, too. In physics you've got your string theorists and your anti-string theorists, and in biology you've got your evolution-by-regulatory-region and your evolution-by-coding-region people, and in ecology you've got people who believe we're bound for disaster if we don't stop contributing to climate change right now, and people who believe we're ok doing it in baby steps.  These are issues that data should support or not, but often the two sides are looking at the same data and interpreting them differently, based on values or preconceived notions or assumptions, so that one's commitment to a point of view ventures beyond the realm of science into ideology.  And an ideology brooks no heresy. The stronger a view is held, especially against other views, the more it becomes hardened and uncompromising.

This is no different when free-thinking no longer allows freedom of thought!

Friday, March 12, 2010

Give them the finger, but not the needle!

The pressure to abandon costly but clearly ineffective biomedical testing is always a battle with vested interests, and shows why science is by no means just a matter adjudicated by the 'facts'. A recent example, which we've remarked on briefly before, is PSA testing to detect prostate cancer in men.

The test itself is mostly painless (a needle-prick for a small blood sample, and a budget-prick to pay the Pharma that makes the test). But it is too non-specific and greatly over-diagnoses prostate cancer. First, because it is not specific to cancer, it raises alarm where there's no need. Second, many if not most prostate cancers go away on their own or grow so slowly that something else gets the guy before the cancer does. Third, a positive PSA reading leads to invasive and reportedly painful follow-up tests that thus have their own costs and risks, including psychological ones as well as risks of impotence or incontinence, and so on.
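
One way to make the over-diagnosis point concrete is a back-of-the-envelope Bayes calculation. The sensitivity, specificity and prevalence below are illustrative assumptions, not published PSA performance figures; the point is only that when the condition you actually care about (aggressive cancer) is uncommon, even a decent-looking test produces mostly false alarms.

```python
def positive_predictive_value(sensitivity, specificity, prevalence):
    """Probability that a positive test really indicates the condition (Bayes' rule)."""
    true_pos = sensitivity * prevalence
    false_pos = (1 - specificity) * (1 - prevalence)
    return true_pos / (true_pos + false_pos)

# Hypothetical performance figures, for illustration only.
sens, spec = 0.80, 0.75
for prevalence in (0.20, 0.05, 0.01):
    ppv = positive_predictive_value(sens, spec, prevalence)
    print(f"prevalence of clinically important cancer {prevalence:.0%}: "
          f"a positive test is right only {ppv:.0%} of the time")
```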

The alternative test is euphemistically known as the DRE ('digital rectal exam'....or finger up the rear). This can directly detect enlarged prostates (almost universal in aging men) and suggest when some other follow-up test might be called for--often the PSA.

But the most prestigious medical journals are now calling for a stop to routine testing in men over age 50, and even the man who discovered PSA as a molecule specifically produced by the prostate gland and circulating in the blood is calling this a human health disaster, in a NYTimes op-ed piece. But, just as vested interests slammed last fall's recommendation by the US Preventive Services Task Force that mammography be done less frequently than current practice, those who profit from PSA testing, and from the follow-up tests and treatment, are standing firm in supporting its continued use.

It's always a battle when evidence-based recommendations disagree with vested interests. Health care is big business, so the bottom line is always on top, and always must be, as a duty to shareholders, the primary consideration when decisions about practice are made. This is as true when withholding treatment is best for the bottom line, as for insurance companies (which, of course, is why they cherry-pick their customers, refusing to cover people with 'pre-existing conditions' because they might cost money), as when testing enriches those who do the testing, such as doctors with stakes in the testing lab. And, when testing is state-of-the-art, a physician has to be pretty confident when he or she chooses not to order it, as that may risk a lawsuit.

Pity the poor physicians! Even those trying to be up to date and do their best, without any conflicts of interest with labs or drug companies. Which pain in the butt should get priority--the finger up the back of their patients or the patient's lawyer on their back? For a client--er, we mean a patient--over 50, they have to ask themselves whether or not to order a PSA test. It's very hard and takes the doctors' rather than the patients' guts to say no, and make the gamble that nothing will turn up later. Or to have a positive test and tell the patient just to let things ride. And what medical literature should they take seriously, when they are too busy to read much of it carefully, much less being reasonably expected to be able to actually judge the quality of the research? Should they pay any attention to the drug detail-men, or what they hear at the Pharma's expense at their 'education' conferences in the Caribbean?

This will all get more problematic when their HMO is run by a wonk with an MBA in cost-effective procedure management, and the MD is in the office trying to care for an actual human being (the patient, seen by the boss as a customer number). Whose 'evidence' will be used? Who will decide? Some are actually trying to do this right, but clearly not everyone is.

So when you get told to have your test.....what will you do: refuse to bend to the system, or just bend-over?

Thursday, March 11, 2010

More on bugs

We've written before about how it seems that genomewide association studies are finding more and more genes associated with the immune system. And that's for a wide spectrum of conditions, from schizophrenia to macular degeneration to inflammatory bowel disease, and now, Alzheimer's Disease, according to a story in the NYTimes. These genes still explain only a small fraction of risk, but it is certainly starting to look like a trend.

The new hypothesis [about Alzheimer's] got its start late one Friday evening in the summer of 2007 in a laboratory at Harvard Medical School. The lead researcher, Rudolph E. Tanzi, a neurology professor who is also director of the genetics and aging unit at Massachusetts General Hospital, said he had been looking at a list of genes that seemed to be associated with Alzheimer’s disease.
To his surprise, many looked just like genes associated with the so-called innate immune system, a set of proteins the body uses to fight infections. The system is particularly important in the brain, because antibodies cannot get through the blood-brain barrier, the membrane that protects the brain. When the brain is infected, it relies on the innate immune system to protect it.
And, when researchers exposed the protein that constitutes plaque in the brains of many Alzheimer's patients to microbes, it was a fairly efficient killer. While a good fraction of people with dementia are found not to have these plaques upon autopsy, and many people without dementia do have them, the possibility that the innate immune system might fight infection of the brain as well as kill brain cells seems to be real. But it's a complex trait, and like any other complex trait, may well turn out to have more than one cause, one of which may be an innate immune system doing its job too efficiently.

The innate immune system attacks common features of microbes. That's different from the 'adaptive' immune system, which generates a huge array of different antibody molecules by random rearrangement of chromosomal segments (see Mermaid's Tale for some facts about these different systems). Variation in the latter is generated during your life, and inherited variation is not thought to be very important (since the system generates millions of new antibody configurations during your life). Variation in the innate immune system is inherited and works directly, the way genes are usually thought to work: the protein coded by the genetic variant (allele) that you inherit does its job at finding and poking holes in bacteria (or doesn't, depending on its configuration).

Infectious disease can have strong effects. Whether or not these are systematic and durable enough to drive natural selection over many generations, it may be that variation in susceptibility can lead to substantially different risks of disease among persons exposed to a given kind of pathogen. If that's the case, then mapping studies -- like GWAS, comparing cases and controls to find parts of the genome that seem to be related to case status -- might be able to detect the stronger risk-effects of genetic variation related to infectious disease. At least, it seems that for a wide variety of diseases a substantial fraction of GWAS 'hits' involve immune or inflammatory genes.
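
For readers who haven't seen how such mapping 'hits' are actually scored, here's a minimal sketch of the basic allelic test behind a GWAS signal, with invented counts: compare allele frequencies in cases versus controls with a 2x2 chi-square. A large-effect risk allele produces a big, easily detected frequency difference; the tiny shifts typical of complex traits barely register.

```python
import math

def allelic_chi_square(case_alt, case_ref, ctrl_alt, ctrl_ref):
    """Pearson chi-square (1 df) on a 2x2 table of allele counts, cases vs controls."""
    table = [[case_alt, case_ref], [ctrl_alt, ctrl_ref]]
    total = sum(sum(row) for row in table)
    chi2 = 0.0
    for i, row in enumerate(table):
        for j, observed in enumerate(row):
            expected = sum(table[i]) * sum(r[j] for r in table) / total
            chi2 += (observed - expected) ** 2 / expected
    p_value = math.erfc(math.sqrt(chi2 / 2))   # p-value for 1 degree of freedom
    return chi2, p_value

# Invented allele counts (2,000 alleles per group), illustration only.
for label, counts in [("big frequency difference (strong effect)", (450, 1550, 300, 1700)),
                      ("tiny frequency difference (weak effect)", (315, 1685, 300, 1700))]:
    chi2, p = allelic_chi_square(*counts)
    print(f"{label}: chi2 = {chi2:.1f}, p = {p:.1e}")
```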

This could be a misleading surmise on our part. The 'immune' system does all sorts of jobs related to molecular recognition, and may involve a larger fraction of the genome than has been thought, so that its involvement in a given disease may not reflect infection, but some other function. Since as we said above these mapping 'hits' only account for a fraction of the case-control contrast, infection can't be the only causal factor.

Secondly, some studies find immune system 'hits' for diseases expected to involve infection, but not for other diseases (like heart disease) that are not. So not everything need be infectious. On the other hand, even something like heart disease risk can be due in part to infection, and instances and mechanisms of this are known.

So there are interesting things to be found. If infection does turn out to be important, and if we keep over-using antibiotics, we may face a much more serious threat from that direction than from all the claimed 'genetic' risk that so much money is being spent to find by GWAS, biobanks, and other means. If that is the case, hopefully the research effort can be pried from the genetic vested interests in time to address the real problem before it's too late.

Wednesday, March 10, 2010

The "aquatic ape" and the fate of eccentric hypotheses in science

Holly Dunsworth is a fine and responsible scholar and a highly knowledgeable paleoanthropologist. That's why we are delighted to have her occasional participation as a guest blogger on The Mermaid's Tale. In a recent post, she took issue with the 'aquatic ape' hypothesis, and staunch defenders of that hypothesis took issue back. The many blog hits that generated were nice, and strong debate, kept reasonable, is an essential part of science.

When someone suggests an hypothesis that is eccentric--off-center--to the mainline views of a science at any given time, it is received as a heresy, in just the way a new religious ideology is. Science has to defend its current beliefs if it is to be a coherent way to address questions in nature. In fact, from the point of view of responsible science, the vast majority of fringe hypotheses deservedly stay in the junkyard of wrong ideas. That's why they should be greeted skeptically.

The credibility, wackiness, or seriousness of new hypotheses has to be worked out over time. History is little guide. Wrong ideas, fervidly or even universally held, have had long shelf-lives in science: the four humors, trepanation, miasma. Some have come from the fringe to become mainline at least for a while, only to wither from lack of cogency: alchemy, phrenology, Freudian psychiatry. Others have started on the fringe and deservedly stayed there, even while retaining at least some followers: astrology, hormesis, homeopathy.

Hypotheses can eventually show that they have at least some truth. Acupuncture is an example. And of course a few have been resisted until expressed properly, or until enough really strong evidence develops, at which time they transform a field: evolution and plate tectonics (continental drift) are great examples of this.

Now, while one can never tell which idea will have legitimate staying power, one characteristic of those that come from left field and make it to home plate is that they are eventually supported by sufficiently strong, serious evidence that even skeptical peer reviewers have to begin recognizing. Typically, predictions based on the new theory are borne out (as in evolution and plate tectonics).

The aquatic ape hypothesis must currently be viewed as a wild-guess Just-So story that got some attention in the popular press. It was apparently first promulgated in 1940, then pushed hard by Elaine Morgan for decades. The question of whether she is a qualified person to be taken seriously for such views may be part of the skeptical lack of credence given to her ideas by the mainline science club. As with other fringe ideas, the blogosphere is alive with advocates and critics, but in these media-saturated days the amount of attention given to such ideas is not a reflection of their credibility (to wit: creationism, Intelligent Design).

Scientists may be clubby, but we're (mostly) not idiots. Indeed, we all hunger to seize on a captivating idea. After this much time, if an idea deserves to be accepted as science rather than popularized speculation, it legitimately can be expected to have developed strong, consensus-building evidence to support it. Put another way, scientists cannot be faulted for their resistance in the absence of such evidence.

If that evidence ever is produced, it will have to be compellingly consistent with the wealth of evidence on hominid evolution that has accumulated since the hypothesis was first aired, and it should clearly and preferentially support the hypothesis's predictions. If such evidence does some day arrive, a lot of us will have to eat crow. It won't be the first or last time this has happened to the often self-satisfied mainline science.

On the other hand, when an assertive hypothesis is long maintained in the absence of clear-cut evidence, it can fairly be judged to be show business until it does better. If Morgan and her followers argued too strongly, and hence can't bring themselves to back down, too bad. They will have deserved to stay in the corner they zealously painted themselves into. That's what happened to much more qualified and distinguished scientists such as Linus Pauling (vitamin C miracles), Francis Crick (life here was seeded from outer space) and Peter Duesberg (HIV doesn't cause AIDS).

The burden of proof remains on them in the meanwhile.

The three of us who write this blog want it not to be a rant, but to be thought-provoking. That's why, despite having our own opinions, we generally try to stay out of the fervid creationism or scientific atheism blogogalaxies. We want to be open to new ideas and we certainly think the status quo or stupid antiscientific ideas should be challenged when they stray from what the best evidence shows. But we try to do that responsibly, and to stay within the evidence. In this case, you may or may not accept Holly's take on the aquatic ape hypothesis, but that's what she was doing, too, and she is an expert in her field.

Ken and Anne

Infant brain and equally naive thinking by scientists?

This week's episode of our favorite radio program, In Our Time on BBC Radio 4 (applauded recently in The New York Times and The London Review of Books, so clearly we're not alone!), is about the infant brain. Three psychologists discuss the history of modern ideas on how the brain works and develops as an infant grows towards fully functional status.

The discussion contrasts a complete tabula rasa (blank slate) view that the infant learns everything from experience, to a totally nativist view that everything is built in. The former was advocated by Jean Piaget, the latter by Noam Chomsky. The discussants went over many intriguing experiments that have been done.

Clearly the way the brain actually develops is in the middle somewhere: the brain has regions that are dedicated to or specialize in some function, such as processing retinal images from the eye, or verbal sounds, or smells. Some parts of the brain regulate things like heartbeat and blood pressure, or secrete hormones. Except when adaptively relocating in recovery from injury, these seem at least generally to be in similar places in different individuals. That is a kind of functional hard-wiring, but it's very generic. It is a regionalization of areas that are set up, so to speak, to learn from experience--to learn sounds, language, the way animate and inanimate objects behave, and so on.

In interesting and important ways, this contrasts sharply with the dream of Darwinian psychologists and similar schools of thought, who want to be able to pry (rather pruriently, we might say) into a person's brain and claim the ability to see what they're really like, rather than how they fancy themselves to be.

The need to find a selective or even essentially deterministic explanation for the evolution of everything and anything mental is strong in our current culture, even if it's manifestly naive. If we are hard-wired for anything, overall, it is not to be hard-wired any more than was necessary. Humans are par excellence the learning and assessing organism, not one pre-programmed for our various tasks and traits -- pre-programmed, instead, to be able to scope out a new situation and figure out how to respond to it. That's what human beings are.

This is also the most parsimonious (simplest and easiest to explain) view of the human mind, if one feels a need for a selective explanation: if you were too rigidly hard-wired, you got caught by surprise and eaten at an early age! You didn't need countless specific selective adventures to weed out vagueness in any and every aspect of your thought. But it's harder to understand neurologically how unprogramming evolved than the 'genes-for' kind of dream. It also isn't as sexy and media-genic a view. Unfortunately, in our society oversimplified determinism and Darwinism seem to be the order of the day.

In addition, the anthropocentric explanations of brain function are undermined by the fact that psychologists have shown that many of our traits are found in all sorts of other species, at least among mammals. Things found in dolphins and hedgehogs do not need a human-specific, much less a recent evolutionary, explanation. Such explanations are redundant, not parsimonious -- not good science.

How the ability to recognize simple phonetic (sound) contrasts and the like, that is present in other species, led to our ability to build language, symbolic behavior, and all that goes with it, is the core question. Very interesting, but very hard to answer.

Tuesday, March 9, 2010

Personalized archaeology

A young archaeologist gave a talk in our department last week about his excavation of rock shelters in the northeastern United States. He tries to deduce from artifacts such things as how much time people spent in each shelter, whether they hunted nearby and what it was they got, whether they returned to the same site year after year, caching hunting implements such as flaked or bola stones there for future use, changes in use of the site over millennia, and so on.  (Image by Larry D. Moore, used under a Creative Commons ShareAlike License.)


Several things were noteworthy, at least to some of the non-archaeologists in the audience. First, he talked about how differences in the construction of projectile points on the east coast versus those found in other sites in the Americas contribute to the idea that North America may not have been peopled entirely by migrants who crossed the Bering Strait, but that some may have come from what is now France. (Called the 'Solutrean hypothesis', after the French archaeological site Solutré, this idea posits hunter-gatherers traveling along an Arctic ice shelf across the Atlantic, either on foot or by canoe. The archaeological and genetic support for this idea is weak to non-existent.)


And second, he contended that his practice of dividing his study sites into smaller sections than usual (his are 30cm x 30cm x 5cm) yields information at the level of the individual. That is, he believes that by restricting the size of each plot he excavates, and knowing the average human reach radius and so on, he's able to deduce how individuals spent their time in the shelter, shaving the points they'd use to hunt the next day, or cooking the day's catch, and so forth.


That could be interesting -- like reconstructing campfire gossip about who was where when, and what they were doing.


But, really, it's hard to figure how it could be much more than that, something that an archaeologist I spoke with after the talk said as well. Archaeology, like any science, is about synthesizing observations into generalizations, to test or develop theory about human behavior. This requires many observations, spanning long time periods and many different sites. (Although, certainly, as in any science, some archaeologists are not so interested in generalizing, and campfire gossip is enough.)


Later that day, I happened to be reading an old critique of the Human Genome Project, and I stumbled across the following paragraph:
[The HGP] is a powerful strategem to answer only certain peculiar questions relevant to its narrow purview. In summary, our critique is based on the following assessment: (i) going to the lowest level of organization does not necessarily yield any insight of interest; (ii) reductionist explanation, even when possible, is not cost-effective in terms of effort expended; (iii) mapping is justified, blind sequencing is not; and (iv) the sheer complexity of a system might make reductionist explanation impossible. (Tauber and Sarkar, The ideology of the human genome project, J R Soc Med. 1993 September; 86(9): 537–540.)
This could just as easily be describing the reductionist approach of our young archaeologist -- or indeed reductionism in general. What does it tell you to know that someone sat exactly here in the rock shelter sharpening a point? Or even that he was eating roast rabbit as he did so. (This is assuming, of course, that all the methodological issues that could prevent drawing such conclusions, such as dogs making away with animal bones, or burrowing rodents disturbing the dating information contained in the layering of the artifacts in the soil, and so on, were taken care of.) It certainly can't answer broader questions such as how the Americas were peopled, or how long ago that was.


In its race to reduce normal traits as well as behavior, disease, or risk of disease down to the level of the gene, modern genetics has turned the usual scientific method on its head in some ways, rather like reconstructing the activities of an individual at the campfire. And the torrents of sequence data that have been pouring out of labs around the world have led to 'hypothesis-free' analysis. Researchers now comb the data looking for interesting patterns, or the 'gene for' a trait, rather than for support for an hypothesis.


In this kind of thinking, we are still prisoners of Mendel, reducing our explanations to single genes, rather than accepting what has been known for at least 100 years: that most traits are polygenic and have a strong environmental contribution. Indeed, even the seminal text on genetics and 'racial hygiene' that helped fuel the eugenic era of Nazism and the Holocaust, Human Heredity by Baur, Fischer and Lenz, first published in 1921 and followed by numerous revisions, explicitly recognizes the role of the environment.
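
For those who want to see what 'polygenic plus a strong environmental contribution' looks like in practice, here is a minimal sketch of our own -- the number of loci, effect sizes, allele frequency, and environmental noise are all invented for illustration, not estimates for any real trait:

```python
import random

random.seed(1)

# Toy model of a polygenic trait: many loci of small additive effect plus an
# environmental contribution. All numbers are arbitrary and illustrative.

N_LOCI = 100                # contributing loci (hypothetical)
EFFECT_PER_ALLELE = 0.1     # each '1' allele nudges the trait up slightly
ALLELE_FREQ = 0.5           # frequency of the '1' allele at every locus
ENV_SD = 2.0                # spread of the environmental contribution

def trait_value():
    # Two alleles per locus; add a small effect for each '1' allele drawn.
    genetic = sum(EFFECT_PER_ALLELE
                  for _ in range(2 * N_LOCI)
                  if random.random() < ALLELE_FREQ)
    environment = random.gauss(0.0, ENV_SD)
    return genetic + environment

population = [trait_value() for _ in range(10_000)]
mean = sum(population) / len(population)
var = sum((x - mean) ** 2 for x in population) / len(population)
print(f"mean trait value: {mean:.2f}, variance: {var:.2f}")
```

With these made-up numbers the environmental term accounts for most of the variance, the distribution comes out smooth and bell-shaped, and the genotype at any single locus tells you almost nothing about the phenotype -- which is the hundred-year-old point.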


But the seduction of genetic determinism and reductionism remains strong and powerful, in spite of the evidence. 

Monday, March 8, 2010

From the land of Minnehaha.....

Last week I traveled to Minneapolis to meet with and give a talk to a large and stimulating group of philosophers and historians of science, including faculty, students, and post-docs in many different primary disciplines. I was embarrassed to learn for the first time that UMN has a long history of being one of the prominent centers for philosophy and history of science, and to meet people whose work I should have known.

This is a group of very thoughtful and knowledgeable people who are interested in diverse subjects, including the degree to which science is driven by theory, method, or other considerations. They meld history with philosophy in interesting ways, and I learned a great amount in just two very busy days.

To ground these disciplines in the primary sciences in which they are interested, departments like Ecology, Evolution, and Behavior have 'embedded' faculty members whose primary appointment is in History. These faculty members have regular interactions with the science faculty members, and there seems to be mutual respect and interest, as the pure-science researchers are led to understand more about the history of their often-unstated ideas, and how they may play into current research.

Most scientists that I've met have little interest in these 'soft' areas, and are obsessed with their next papers or grants. They often denigrate anything other than new methods or means to advance their technical careers. This narrows science and fosters the excess reductionism or dream of simplistic answers to complex questions that we think is a problem in many areas of science these days.

Unlike some anthropologists of science, the people I met in Minnesota are not voyeurs of science. They're not there to see how science is really done in any 'Gotcha!' way, and not to find fault. They're there to learn, and to fit current practice into what they know about the history and philosophy of knowledge in their particular area of interest (I was visiting life-scientists, but I think the same happens there--and perhaps was first to happen there--in regard to physics).

I'm very interested in epistemology, philosophy, and history of science. I'm nothing of an expert in this at the professional level, but I do try to keep up with the most important conceptual issues as they apply to daily practice, theory, and interpretation in science. Whether I do it well or not, I had a great chance to learn a lot from a group of congenial professionals. The Minnesota Center for the Philosophy of Science site would be a good place for readers of this post to see what's going on there.

Ken

Friday, March 5, 2010

Low tech rats

In our continuing series on the good use of inexpensive technology, the BBC reports that rats have been trained to find landmines, and are doing so in Mozambique.  They are also being used to detect tuberculosis in the saliva of sick patients, and are being trained for other kinds of disease detection.  This is being done by a company called APOPO -- Detection Rats Technology. 

The Bishop of Tenure

Most Bishops promise that if you behave yourself you'll be handed everlasting bliss. But Amy Bishop didn't seem to agree. She had what appears to have been a mediocre record, at best, but a far from mediocre ego and temper. The synod that judged her work decided that it did not merit passage to academic haven (tenure--a lifetime chair in their department) so she sent them to heaven--by shooting them dead in their chairs.

This highlights one of the issues in American academic life that has many aspects, ethical, practical, and cultural, and that also affects science itself. Tenure is awarded if you have done important work, and enough of it (and often if you also hustled a lot of money to support it, and the overhead charges of your institution), that your peers within your department, and externally in the profession, judge you to be a Major Player. Even at the University of Alabama in Huntsville, which, no matter its strengths, isn't a nationally top school.

The tenure system leads to all sorts of gaming of the system, as people very naturally try to build a case that they should not (ever) lose their jobs. Some people think nobody should have a lifetime job (unless you own the company?). Others defend the system because it truthfully protects freedom of speech in universities, where intimidation of ideas should not happen (but clearly would without a tenure system). And many like the system because we're beneficiaries. It's definitely a privileged position to be in. Unfortunately, in our security-hungry class it comes to be treated as something like a civil right, bound up with one's ego and sense of self-worth, and so on.

It isn't, however, something to kill for!

The Bishop murders (well, we weren't witnesses, so we are reporting what we've seen in the papers) show many aspects of our polarized, tightly wound-up society. They show the nature of academic bureaucratic structures. And they show a lot of pretense and loss of mission, too.

Universities used to be thought of as 'schools'. The strange notion was that we were here to teach students so they could be equipped to go out in the world and be successful. Instead, much of what happens at universities neglects the main students (the undergraduates) and instead is structured increasingly by the faculty to serve the faculty.

The point of the age-old publish-or-perish ethos used to be not score-counting, but ensuring that faculty members did enough work, of good enough quality, to show that they were keeping up to date in their field, so that they could be capable teachers. It was not a pretentious self-congratulatory game. Of course there were always those top-level faculty who published well and often, but that was the exception, not the ritualized necessity.

It isn't, however, something to kill for!

Besides the stress on the junior faculty member, the system leads to the kind of gaming by which people split their work into many papers rather than few, worry about the 'impact factor' of the journals they publish in, and do whatever kind of work is fundable and publishable rapidly. This is part and parcel of our too-often incremental, 'safe' kind of research. Problems that take a long time to address well are not on the plate. Most of us know this, though not all would acknowledge it (we've posted on this subject before).

Stress over tenure decisions spoils a good bit of university life. We have to have standards and, just like a baseball team, not everyone who aspires to a position can have one (much less a lifetime one). But good training of students (not just 'our' own students who man the grant projects in our own labs), and our best effort at understanding the truth of our respective fields of work, not paper-counts or grant budgets, should be the marks of success.

Now the Alabama murders do raise questions about the tenure system, and they're legitimate questions. But this can mistakenly be interpreted to mean that the system is so badly flawed that Bishop's mayhem was somehow understandable, or even excusable, and that the questioning finger should be pointed at ourselves rather than at her.

It isn't, however, something to kill for!

From the stories we see, Dr Bishop snapped under the pressure. There is no excuse or sympathy for that. She's certainly not the kind of Bishop the Church of Knowledge can tolerate. Her peers were doing their job, and that job involves human judgment. That means to some extent it is frail, fallible, and perhaps even sociopolitical. But that's no secret. The system cannot be allowed to be intimidated by those who snap. Even if it leads us to reflect that perhaps we should return the academic altar of knowledge to some saner semblance of balance.

Thursday, March 4, 2010

The evolution of infection, continued

We posted a few days ago about the ways we're driving the evolution of pathogens that are genetically resistant to antibiotics. That's because we slam them with exposure to lethal agents, rapidly removing all but the few, or very, very few, bacteria (or viral particles) that are resistant, leaving the survivors an open field of competition-free access to resources and hence rapid reproduction.
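
To put rough, purely illustrative numbers on that logic (the mutation rate, kill fraction, and doubling time below are invented for the arithmetic, not measured values): suppose one cell in $10^8$ carries a resistance mutation, and an antibiotic kills every susceptible cell in a population of $10^9$. Then

$$N_{\text{survivors}} \approx 10^{9} \times 10^{-8} = 10,$$

and with the competition gone, those ten resistant cells, doubling every 30 minutes, regrow to the original population size in roughly

$$t \approx 30\ \text{min} \times \log_2\!\left(\frac{10^{9}}{10}\right) \approx 30 \times 26.6\ \text{min} \approx 13\ \text{hours}.$$

The point is simply how quickly strong selection plus an open field turns a vanishingly rare variant into the whole population.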

We have an 'adaptive' immune system that generates clones of white blood cells and the antibody molecules they produce. Because of a particular way in which the chromosome region that codes for these antibodies is scrambled to produce the antibody protein, we generate millions of random antibody structures. This is thought to be a way of avoiding the need to evolve pathogen-specific antibodies. If we make countless different antibody molecules, the odds are that at least one will be able to grab onto some part of any bacterial cell or virus that we may chance upon. We don't have to know what it will be ahead of time. Once we recognize an invader, the lineages of white cells that recognize it are induced to proliferate. They can destroy the pathogen, or recognize and kill cells that have been exposed to it.

Indeed, this molecular random-scrambling defense has found a match in some parasites that carry many different versions of the cell-surface proteins they need for their life-cycle. The idea, as we understand it, is that by the time our immune system has recognized one of the pathogen's surface proteins, its cells switch to a different one. We might kill those presenting the former protein, but those presenting the new one will have a chance to survive. This leads to survival of these bugs, but at the same time our immune system is able at least to contain the infection. Plants use somewhat similar switching strategies to fend off infection.
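
As a toy illustration of that arms race -- our own sketch, with made-up sizes, not anyone's model of a real immune system -- here is what a random antibody repertoire meeting a surface-protein-switching parasite looks like:

```python
import random

random.seed(1)

# Toy model: a random antibody repertoire versus a parasite that switches its
# surface protein. 'Shapes' are just integers; a match means recognition.
# All sizes are invented for illustration, not real immunological numbers.

SHAPE_SPACE = 10_000          # possible epitope 'shapes' (hypothetical)
REPERTOIRE_SIZE = 1_000_000   # random antibody specificities we generate

repertoire = {random.randrange(SHAPE_SPACE) for _ in range(REPERTOIRE_SIZE)}

# The parasite cycles through a few surface-protein variants, then reuses the
# first one, so immunological memory shows up at the end.
variants = [random.randrange(SHAPE_SPACE) for _ in range(4)]
variants.append(variants[0])

expanded = set()              # clones that have already proliferated
for week, variant in enumerate(variants, start=1):
    if variant in expanded:
        print(f"week {week}: variant {variant} cleared at once (memory)")
    elif variant in repertoire:
        expanded.add(variant)
        print(f"week {week}: variant {variant} recognized; clone expands, "
              f"parasite switches surface protein")
    else:
        print(f"week {week}: variant {variant} escapes the repertoire entirely")
```

Because the random repertoire is so much larger than the space of shapes, essentially every variant is recognized eventually; the parasite just buys time by switching, which is the stand-off described above.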

Such cat-and-mouse strategies should work. We should be able to detect and get rid of anything that may invade us that can be recognized and dealt with at the cellular level. So why, then, do we ever get sick? This is an interesting question.

Some pathogens are larger than a single cell, or hide themselves in various ways. The malarial parasite gets inside our red blood cells, where it is largely out of reach of our adaptive immune system. So we have evolved other kinds of defenses, such as red cells that resist invasion by the parasite. Of course, it doesn't always work.

In the end we need to play an extinction game: develop a means of assault on something so fundamental to a pathogen's lifestyle that it can't out-evolve us. This is clearly possible in principle: most species that have ever lived on earth have become extinct. In each case the cause is different. But it should be possible, and some diseases, like polio and smallpox, have been pushed close to the edge of extinction.

On the other hand, the more we present a ready target for rapidly evolving parasites, the faster we'll have to develop strategies to combat them.

Still, it would seem that many kinds of bacteria would never be able to develop characteristics we can't recognize. They may develop resistance to antibiotic attacks on some aspect of their biology, but why can't our immune system always eventually produce an effective antibody? Different diseases will provide different answers, and in most cases the answer is simply not known. But the evolutionary story is that rapidly evolving, simpler pathogens continually challenge our rapidly changing array of immune attacks.

Wednesday, March 3, 2010

Culture, evolution, eugenics, and the nature of people

There was a story in the NYTimes yesterday that reported on work showing that humans have evolved in the context of culture. The story describes culture as an 'evolutionary force' in human evolution. Reference is made to a recent review of this subject in Nature Reviews Genetics.

Humans have evolved in the context of culture -- material, psychological, social, and symbolic. That has of course affected our genes. The ability of adults to digest the milk sugar lactose in populations that have depended on dairying for thousands of years, or malaria resistance in people whose cleared agricultural fields have provided breeding grounds for mosquitoes, are well-known examples. But the phenomenon is much more than that, and indeed is pervasive in humans.

Many aspects of culture protect us from physical weaknesses that selection might otherwise have weeded out of our population. Eyeglasses, medicine, social care for the infirm, weapons, social hunting and harvesting and defense, clothing and fire, and the enabling power of language are examples.

Darwin and others worried that the protections of society would lead to the pollution of our gene pool. This was one of the motivating factors of the eugenic movement, as we discussed in a recent post. That movement aimed to remove the 'unfit' (in the Darwinian sense, but often equated with whatever scientists felt was undesirable in their own local social sense).

This is all true and the paper referred to in the news story is a good and informative one that shows how genetic technology has made it possible to find some specific responsible genes, even if one can debate the strength of evidence for some of its examples. But as so often happens with media reports, there is a problem, and the story is misleading in that it seems to suggest that the molding effects of culture are recent (say, post-agricultural), or enumerable (only a few traits molded that way), or that this is a new discovery.

Instead, there is nothing whatever new about this except the identification of specific genetic examples. Once again, it's misleading media hyperbole. A better understanding of human nature, and an antidote to eugenic-like thinking and to ethnocentrism and the problems it causes, would come from a realization that is not at all new to anthropology: that humans have been, from the beginning of our divergence from common ancestry with chimps and other apes, the cultural species. From upright posture, to opposable thumbs, language, hairlessness, our physical helplessness relative to other species (no claws, fangs, wings, etc.), and so much else, culture has always molded our way of life. As C. L. Brace, a leading anthropologist, put it way back around 1970, culture is humans' ecological niche -- culture is why we are here. Indeed, that was offered as the reason there is only one human species here today, by what population ecology calls the competitive exclusion principle: only one species can occupy any given ecological niche.
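
For the record, the competitive exclusion principle Brace was invoking has a standard textbook formalization in the Lotka-Volterra competition equations -- nothing here is specific to human evolution, it is just the generic model:

$$\frac{dN_1}{dt} = r_1 N_1\left(1 - \frac{N_1 + \alpha_{12} N_2}{K_1}\right), \qquad \frac{dN_2}{dt} = r_2 N_2\left(1 - \frac{N_2 + \alpha_{21} N_1}{K_2}\right).$$

Stable coexistence requires $\alpha_{12} < K_1/K_2$ and $\alpha_{21} < K_2/K_1$; when two species occupy exactly the same niche ($\alpha_{12} = \alpha_{21} = 1$), those conditions cannot both hold, and the species with the larger carrying capacity drives the other out. If culture is the human niche, any second species trying to occupy it gets excluded.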

Yesterday there was also a BBC story about the ancient nature of human culture, showing that at the time we were emerging as our current species, as long as 100,000 years ago, even art was already with us.

This fact was well known to any anthropologist paying attention, perhaps as far back as the 19th century, when the field more or less became a professional one. And it's not exactly new, either. To pick but one example, we have two recent posts on Ibn Khaldun, who recognized our adaptation to environment and culture in the 14th century. Why can't we learn to be more aware of history and less melodramatic about our own time? Maybe it's in our nature to take our own lives and times too seriously -- maybe it gives our lives meaning. Maybe it's just to sell magazines and promote careers.

The protective effects of culture are vital to human existence and evolutionarily adaptive. And they always have been.