Tuesday, January 8, 2013

Piltdown--and modern evolutionary frauds?

We are celebrating the 100th anniversary of the famous Piltdown fraud in biological anthropology.  One might think some other term, like 'commemorating,' might be more apt for a fraud.  But in a sense the lesson learned from this event at least should be celebrated.  Or has the lesson been learnt after all?

Piltdown, England: 1912
In 1912, a lawyer and amateur antiquarian named Charles Dawson reported finding some fossil bones that resembled what people thought the famous 'missing link'--the fossil that would connect humans evolutionarily to the other apes--should look like.  Dawson found this material in a gravel pit near the town of Piltdown in the south of England, when he saw it eroding out of the soil of a construction site (top image, Dawson seated; below, the crew of specialists examining the finds).  He reported this excitedly, because the search had long been on to find the missing link and demonstrate that Darwin's idea--that all life, including humans, descended from a common origin on Earth--was indeed correct.  The story and relevant details are in Wikipedia: Piltdown man.



Piltdown gang (Wikipedia)

Systematic digging at the site quickly followed, to see what else might be there, and much was found that in a sense 'completed' a picture of the past.  Britain's leading paleontologist, Sir Arthur Smith Woodward, was called in to examine the findings.  The French paleontologist (and Jesuit priest) Teilhard de Chardin came to see the site.  The findings were sensationalized because not only did they provide the anticipated evidence, but they showed that England was the site of the oldest human fossil--a matter of national pride, even though Darwin had quite correctly reasoned that ultimately we originated in Africa, where our closest living relatives were to be found.

'Anticipated' is an appropriate word here, because our narcissistic pride had convinced anthropologists that our brain had led the adaptive way out of the mediocrity of other apes.

The findings
The Piltdown findings included bones of other animals (prey, probably) and stone tools of various sorts, all colored similarly to the mineral soil.  But the 'human' bones took center (or, rather, centre) stage: they included a piece of a chimp-like jaw, with some teeth whose cusps had been worn rather flat, as might be found in humans, and some fragments of cranial plates--the parts encasing the brain.


Reconstructions seemed to be able to make the jaw and head parts fit together.  In the above image the darker bits are what was found; the rest is a reconstruction of the likely bearer of those bits.  The stunning fact was that this individual had a chimp-like jaw but a large brain, and used tools.  What this showed, or proved to those who analyzed the material, was that humans had evolved first by adaptively acquiring a large brain and intelligence, and that this was followed by the humanizing of the lower, less important body structures.

Now, this was satisfying, but also very curious, because while it suited our preferred story of our adaptive Darwinian origins, it didn't jibe with any of the other fossils that had so far been found.  So the finds were suspect, at least in the sense that other representatives of the missing link surely would be found sometime.  But they weren't.  Other fossils that seemed to show that in fact our body 'humanized' first--becoming upright and attaining some of our dental characteristics before our brain enlarged--such as a famous skull found in Africa, were dismissed or ignored.

It took over 40 years before careful examination of the Fossil that Changed the World showed that it was, indeed, a total fraud.  The bones were recent, not fossil.  The material had been intentionally, but only superficially, stained to resemble the gravel at the site.  The jaw was a recent ape jaw, and the skull parts were from a not very old human.  Teeth had been filed and altered.

Debate raged for many years about who had perpetrated this hoax.  Smith Woodward, Teilhard de Chardin, and Dawson were the leading suspects.  Eventually, it was shown that many other remarkable finds that Dawson had made were fakes; Dawson wanted to be included in the august British science community, and was doing what he thought it took to get there (he died shortly after the original 'finds,' and no further finds were made after his death).  We now know, with extensive evidence from many parts of the world, that, in contrast to the Piltdown story, we evolved body first and our brains expanded only later.

The lesson
So why had the hoax held sway for so long?  The answer is, at least in part, that the find fit our expectations--the model that anthropologists had in their heads about how we 'must' have evolved.  It was a kind of egocentric view of what makes us human, what distinguishes us and hence what we must, in a sense, have evolved 'for':  thinking advanced thoughts (such as science!).

Only when other evidence began accumulating was real diligence invested in testing whether Piltdown's strange configuration was genuine.  Then the fraud was quickly exposed.  The lesson was that science, like religion or other areas of human endeavor, is a social system with the kinds of layered beliefs, commitments, and agreed-on lore that any social system has.  It is not just about objective fact.  In this case, something that was too good to be true wasn't true, but the fact that the leaders of the field had been duped for half a century showed something serious about the depth to which preconceptions control even science.

Has the lesson been learnt?  Genetics and evolutionary biology today
Have we--evolutionary biologists and anthropologists alike--as a group actually learned the Piltdown lesson, a century on?  Certainly modern technologies make it much more difficult to perpetrate a blatant fraud of this sort.  But from time to time we do see prominent frauds of comparable seriousness being accepted until they are discovered, often by luck (recently, for example, the Korean geneticist's faked human-embryo cloning).  There are others in the literature, and of course these include only the ones that have been outed.  How much remains undiscovered is unknowable, and especially in this era, when so much detailed technology (such as whole genome sequencing of many individuals, or exotic transgenic experiments) is involved, it is very hard to confirm a study.  Likewise, the proliferation of journals worldwide stresses the review process and inundates us with things not checkable in any practical sense.  Unless a finding is very fundamental, we tend to accept it and build our work around the results.  So in evolutionary, genetic, and developmental biology, even today we may be at least a bit too ready to accept, or unwilling or unable to test, claims that are made--if they fit our expectations or, one should say, our preconceptions and assumptions.

In this there's a more subtle, but perhaps much more important, sense in which the Piltdown lesson has not been learned.  So committed are we to Darwinism that we indulge very deeply in a commitment to a particular view of evolution and genetics, in which causation is quite deterministic and, especially, in which everything that can't be nailed down is assumed to have an explanation that involves adaptive natural selection--almost as if selection were directed, force-like, toward an end.  The media--popular as well as scientific--are riddled daily with uncritical, glib, Just-So stories about how and why things evolved for a particular reason, and these are swallowed by scientists and public alike more readily than the whale swallowed Jonah.

Everywhere you turn you can find such tales, and the web only makes them more numerous and more likely to go viral and become a sort of accepted truth.  The drug of popularity and credit that the web provides encourages this.  Skeptics of such tales are usually ignored, but even junk stories that are widely rejected don't deter the same people from inventing more of them.  Since we can look at what we see today and invent multiple possible stories that would be consistent with the data, even a total adaptationist should be skeptical and circumspect about any one of them.  But such skepticism is rare, and disagreement is itself often not sufficiently circumspect, consisting instead of the skeptic's alternative Just-So story.  In a subtle and perhaps less blatant way, one Piltdown is replaced with another.

What is fraud?
A fraud is an intentional or knowing misrepresentation.  Whether such stories as are told today are harmful in a practical sense is debatable.  Are they 'frauds'?  Does the occasional brief caveat, such as saying the [trait] 'might' have evolved for such-and-such, exculpate the author if it is buried amidst overstatement?  Well, such stories reflect what must be judged as culpable credulousness if they're based on assumptions extended far beyond what we know we can actually test or verify--and are thus told with a sense of certainty comparably far beyond what is justified.  They are misleading because they are glibly told, and the proof is how easily they are accepted.  They are stories told, often knowingly, in a way that (as with Dawson) may generate a lot of attention for those who propose them.

Such stories harm the search for truth.  Science needs working frameworks, but they should not be cages.  In medical, societal, political, and other arenas, pat stories--like the genes 'for' this, or adapted 'for' that, stories--affect policy in ways that can't be guaranteed to be good.  A century of eugenics based on essentially the same kinds and quality of reasoning shows that clearly.

The way to celebrate Dawson's fraud is to recognize the role of belief even in science, and to stop committing new instances of our own.

Monday, January 7, 2013

Who gains when art/literature/science is hyped?

What is art worth? And who decides?
An interesting exchange of letters appeared yesterday in the New York Times Sunday Review.  "Sunday Dialogue: What is That Art Worth?"  Why is art selling at such high prices today?  Is it 'good' art, or is it hype?  And, who benefits?

William Cole, whose byline says he is writing a book on art connoisseurship, begins the debate:
Financiers know the value of hype. They understand that if artworks sell at exorbitant prices, those works — and the artists who created them — become newsworthy, regardless of whether they’re actually any good. And the media play right along, almost never questioning the quality of the works or the abilities of the artists.
Historically, time has been very cruel to the reputations of all but the very best artists. But today enormous sums of money are devoted to propping up the star power of a small group of wildly overrated cult figures. Have the masters of the universe created the unpoppable bubble? Or will the child’s voice somehow manage to rise above the buzz and proclaim that the emperor is, as many have suspected all along, buck naked?
The first reply takes Cole to task for misunderstanding why art has become so expensive.  It's no longer single masters sitting alone in their drafty lofts creating their masterpieces.  Instead, art has become a huge industry, each stage, from artist's workshop to gallery, demanding huge amounts of money and thus ratcheting up the cost.  The cost, on this view, is understandable, and justified.

The next reply takes Cole to task for ignoring the 2% theory -- only 2% of artists of any age are actually any good, and today is no different from any other time.  If bad artists are being promoted, it's nothing new.

Several correspondents take issue with Cole's judgments of which artists are bad -- who deserves idolatry and who does not.  Caveat emptor and all that.

Donald Waits, a former public school art-history teacher, writes:
It is obvious to me, as a subscriber to several art blogs and arts in education Web sites, that a new specialty in promoting and influencing the sales of minimally valuable “art” is being developed as a serious career in the business of art. The courses being touted by universities and art schools describe, in clear language, how to advocate for mediocrity in the art market. A whole new language is being taught to critics and art “scholars” for the sole purpose of pushing higher and higher prices for art that ultimately has little value.
Next, Michelle Marder Kamhi, art reviewer, writes:
Mr. Cole is absolutely right. Today’s art market has nothing to do with art and everything to do with hype, and the media are deplorably complicit in the game.
Another correspondent points out that the art market is not an academic enterprise, it's a business, and the idea of who deserves to be "in" at any given time changes, not according to any objective truth, but in response to market forces.

Cole replies that the argument that big projects require big price tags, regardless of the quality of the results, should be used by Congress next time they want to vote themselves a raise.  He also notes:
With strategic donations, rich collectors and dealers gain influence in major art museums. Whose works do you suppose they encourage those museums to purchase and exhibit? Whose works do they donate, sometimes retaining a life interest, for huge tax deductions calculated according to market prices, however ridiculous those prices might be? 
So while only the few can afford to actually hang such pictures on their walls, the rest of us taxpayers get to help pay for them. Who said the art market wasn’t democratic?
It's not just art
By now it's surely clear to you why this exchange interested us.  "Wildly overrated cult figures," "the rest of us taxpayers get to help pay," "a new specialty in promoting and influencing the sales of minimally valuable 'art'," "not an academic exercise but a business," and so on.  Take just a step back and it's clear that what's happening in the art world -- huge prices for art of questionable, or at the very least debatable, value -- is happening in many sectors of society, particularly those in which quality is a subjective judgement and megabucks are there for the taking, including banking, publishing, industrial agriculture, and of course, science.

Substituting, say, GWAS, or Mars life, or the Higgs boson for "art" in this exchange of letters is not too much of a stretch.  And, as with art, there are arguments over who the bad guys are, and whether there even are bad guys, whether it's hype or truth.  But there can be no argument over the amount of money the taxpayer is putting up to make some people well-off and famous -- scientists and providers of the technology both -- on the promise that this is all an investment in the future.

A common argument, especially from those doing well in such a system, is that incentive is what drives success; that success is rare but nurturing that 2% is simply costly, and that's our way of doing business.  From this point of view, there's nothing to complain about, unless perhaps it's how your taxes are used if you don't happen to be a beneficiary, or whether that 2% really is so valuable compared to what else might be achieved at more modest cost.

Then, of course, one can argue anthropologically that this kind of resource hierarchy-building is how our society works, so that any alternative approach to art or science that one might dream up, even if it could be applied against resistance from those currently 'in', would quickly lead to much the same as what we see across such a wide spectrum of societies today.

Friday, January 4, 2013

Weighing in on a Weighty subject

Finally, definitive proof!
So, the latest (hottest, and certainly this time just must be true) report is that obesity (that is, high Body Mass Index, or weight-for-height) isn't so clearly damaging to health as billions of dollars and millions of pages of punditry and scientific hyperbole have suggested over a mere fifty years.  Whoopee!  We can eat again!  What a relief!

Or is it?

The study we posted on yesterday seemed to say that.  But even forgetting our usual (of course always cogent and well-placed) reservations about science news bulletins, perhaps there is something else to note that might cause at least a few milliseconds of thoughtful contemplation.

If this is a causal, material world, then as the argument goes, everything must be understandable and predictable strictly in terms of molecules and energy--because that's all there is!  And since, the argument continues, evolution has molded life around DNA as the primary causal molecule, we simply must be the product of, and hence predictable from, our genes.

The BMI study was not a genetic report, and only concerned the predictive power of the net measure, BMI.  It was about the long-assumed health risks, or not, of obesity.  But there is a bit of slippage here:  BMI is easy to measure (your weight in kilograms divided by the square of your height in meters), and so many different studies can collect comparable data, etc.  It is thus a convenient measure of choice for obesity.
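For the record, here is that arithmetic in a trivial sketch (the numbers are made up for illustration):

```python
# BMI is just weight scaled by height: kilograms per meter-squared.
def bmi(weight_kg: float, height_m: float) -> float:
    return weight_kg / height_m**2

# A made-up example: 80 kg at 1.75 m tall.
print(round(bmi(80.0, 1.75), 1))  # 26.1, which standard charts call 'overweight'
```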

Given the current dogma of our time--that everything simply must, obviously, necessarily be 'genetic'--many studies you've paid for with your taxes have naturally done their best to find the genes 'for' this important health-risk trait.

No, not at all
Thus a major and very large GWAS on the genetics of BMI was published a couple of years ago (Nature Genetics, Nov. 2010).  This study of a mere 250,000 individuals found a small number of modest (statistically 'significant') locations in the genome, including one confirmatory gene (called FTO) that a blind person could find without using his hands.  Other 'known' obesity risk-factor genes weren't in this list, and of course there is the plethora of excuses--er, that is, alternative explanations--for why these genes didn't show up in the hit-list.
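A side note for readers not steeped in GWAS conventions: 'statistically significant' here usually means surviving a correction for the enormous number of variants tested.  Here is a minimal sketch of the usual Bonferroni-style arithmetic; the one-million test count is a rough convention, not a figure from this particular study:

```python
# Genome-wide significance: the usual 0.05 cutoff divided by the (rough)
# number of independent common variants tested across the genome.
alpha = 0.05
n_tests = 1_000_000
threshold = alpha / n_tests
print(threshold)  # 5e-08, the conventional genome-wide significance level
```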

Now that is mysterious enough (unless you've been thinking critically about genetic causation, its evolutionary history, and the nature of such studies), but it's at least a large study that should illuminate the nub of the causal truth.

However, also in Nature Genetics in Nov. 2010 was another obesity GWAS.  This time the measure used was not BMI, but the waist-to-hip ratio (WHR).  This is another convenient, non-invasive, and cheaply measured index of obesity.  The study was the pooling of 61 studies with a total of a mere 114,000 participants.  Now, this study essentially found no overlap in genome region 'hits' with the BMI study!  It also failed to show several genes well known to relate to obesity and related diseases, from many actually focused studies including mouse experimental work.

One can rationalize all one wants about this 'discrepancy' (to use a kind word for it).  But if 'obesity' is a meaningful trait with any sort of unitary causal nature, then measuring it in two ways should generate essentially the same result, after accounting for statistical vagaries.  Just as using a metric (Celsius) thermometer won't tell you anything more about water, ice, and steam than using a Fahrenheit scale.

237 traits linked to genomic loci by 1449 GWAS studies
(Source: www.genome.gov/GWAStudies); 2012
Clear and devastating indictment of the state-of-the-art
This issue seems not at all to have been noticed (openly, at least) by anybody.  Instead of being ignored, it should be seen as a clear and devastating indictment of the GWAS and related 'omics' grand-sample, meta-analysis, quick-and-dirty enterprise that we are investing so heavily and mechanically in.  It should be the miner's canary, telling us clearly that we are not going about this in the right way.

We can't blithely accept the current BMI-and-health finding as related to obesity in an interpretable way.  We are forced to recognize, as we said in our prior post, that BMI is a stand-in for some confounding factor(s) that may or may not have been measured.  That's because if different genes predict one measure of 'obesity' than predict another, the underlying causal variation must be seriously complex or heterogeneous--factors we are not measuring and may not even know about, factors that are not highly correlated with each other or, at least, are not consistently correlated with the different ways we choose to define the trait, or risk factor.

'Obesity' is in some ways an obvious trait in its extremes (from skinny to very over-weight), and body weight is clearly related to health measures of various kinds.  But the Omics Way that is being taken is falling short, and this means that the Epidemiological Way, of parsing a large plateful of variables into this or that correlation coefficient with various statistical significance levels, is also badly wanting.

We don't have the answers.  Indeed, the problem is not just that nobody has the answers, it's that the only reaction is to claim we need more and more, larger and larger, studies of essentially the same sort to get the answers!  But, bigger isn't always better!

Thursday, January 3, 2013

55th CoE

The 55th Carnival of Evolution, 2013's first, is up over at Genome Engineering.  Resolve to read it.


Should you or shouldn't you lose weight?

So, you've just resolved to lose weight this year, when up pops a story that makes you think twice.  Indeed, the headline in the New York Times -- "Study Suggests Lower Mortality Risk for People Deemed to be Overweight" -- might well lead you to open another bag of chips.  But, while it literally represents the study's findings, published in the Jan 2 issue of the Journal of the American Medical Association (JAMA), it is a bit misleading, as it might be construed to mean that risk is higher for people who are "normal" weight, and lower for everyone else.  But that's not what the study found. 

The researchers did a meta-analysis of 100 studies of the risk of mortality associated with BMI, or body mass index, an indicator of how much fat a person is carrying independent of his or her height.  A 'meta-analysis' pools results from many different studies, each of which may be too small in size to be definitive, in order to have larger samples and gain greater statistical power.  In this case, the total included about 3 million people.  Of course, one must assume that the different study samples are similar enough in their risk characteristics for this kind of pooling to be legitimate.
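For the statistically curious, here is a minimal sketch (with made-up hazard ratios and standard errors, not figures from the JAMA paper) of the inverse-variance weighting that underlies this kind of fixed-effect pooling:

```python
import math

# Hypothetical per-study results: each study's hazard ratio (HR) and the
# standard error of its log-HR, made up purely to illustrate the arithmetic.
studies = [
    (0.92, 0.05),   # a large, precise study
    (0.97, 0.08),
    (0.90, 0.12),   # a small study with wide uncertainty
]

# Fixed-effect pooling: weight each log-HR by the inverse of its variance,
# so the more precise studies count for more in the combined estimate.
num = den = 0.0
for hr, se in studies:
    w = 1.0 / se**2
    num += w * math.log(hr)
    den += w

pooled_log_hr = num / den
pooled_se = math.sqrt(1.0 / den)

print(f"pooled HR = {math.exp(pooled_log_hr):.3f}")
print(f"95% CI = ({math.exp(pooled_log_hr - 1.96 * pooled_se):.3f}, "
      f"{math.exp(pooled_log_hr + 1.96 * pooled_se):.3f})")
```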

Be that as it may, the authors in this case found that subjects who are considered overweight but not obese (BMI between 25 and 30), or mildly obese (grade 1 obesity, BMI between 30 and 35), had lower risk of all-cause mortality than those with "normal" BMI (18.5 to 25) or with grades 2 and 3 obesity (BMI over 35).  According to the JAMA press release about the study:
The researchers found that the summary HRs indicated a 6 percent lower risk of death for overweight; a 18 percent higher risk of death for obesity (all grades); a 5 percent lower risk of death for grade 1 obesity; and a 29 percent increased risk of death for grades 2 and 3 obesity.
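For readers unused to hazard ratios (HRs): those percentages are just HRs re-expressed relative to the "normal"-BMI reference group.  A quick sketch, with the HRs back-computed from the quoted percentages:

```python
# Hazard ratios back-computed from the percentages quoted above; each is
# relative to the 'normal'-BMI reference group (HR < 1 means lower risk).
results = [("overweight", 0.94), ("obesity, all grades", 1.18),
           ("grade 1 obesity", 0.95), ("grades 2 and 3 obesity", 1.29)]
for label, hr in results:
    pct = (hr - 1.0) * 100
    direction = "higher" if pct > 0 else "lower"
    print(f"{label}: HR {hr:.2f} -> {abs(pct):.0f}% {direction} risk of death")
```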
Why? Well, the researchers have numerous possible explanations, again as quoted in the press release: 
“Possible explanations have included earlier presentation of heavier patients, greater likelihood of receiving optimal medical treatment, cardioprotective metabolic effects of increased body fat, and benefits of higher metabolic reserves.”
That is, in part, BMI is likely to be a confounder, standing in for ('correlated with') other unmeasured variables that are the actual reasons that mortality is lower among overweight and moderately obese people, such as that they go to the doctor more and receive more treatment than thinner people.

And, the story in the NYT quotes physicians saying that it's not BMI that counts anyway, but other health indicators such as cholesterol levels, blood pressure or blood glucose levels.  If those are normal, a person doesn't need to worry about losing weight for the sake of their health -- but, they are more likely to be elevated in people who are overweight.  Illness can cause weight loss, too, so the population of people with "normal" BMI may well include some who are seriously ill.  And, the location of the fat, whether in the belly or superficial, seems to be important too, and it varies from person to person.

The researchers do suggest that body fat might have protective benefits as well, and that may be so.  But the upshot of this study seems to us to be that the association of body weight with mortality risk is not straightforward.  It's possible that in some people body fat is even protective, and more is better -- up to a point.  And, well, if you know all sorts of other things, including whether the person already has symptoms or problems, then BMI becomes a useful predictive measure.  In other words, despite expensive and extensive decades of research and countless news stories, and countless counselors, fad diets, exercise and reducing programs, and all the magazine and infomercial pressures on body weight, well, we're not so far from square one.  Except, is it reasonable to say, for a lot of researchers who have made a lot of hay out of this for decades?

In the end, the decision as to whether or not you keep your resolution to lose weight should not be decided on the basis of this one study.  We'll have more to say about this tomorrow.

Wednesday, January 2, 2013

The state of malaria…

This past month the World Health Organization (WHO) released a new World Malaria Report.  It is certainly worth mentioning some of the main points (peppered, of course, with a little of my opinion and critical observations).

The last half decade has seen a fairly drastic increase in funding directed towards eliminating or controlling malaria.  Furthermore, according to WHO numbers, that funding has also been effective.  In Sub-Saharan Africa, where malaria causes the greatest morbidity and mortality, the number of insecticide-treated bednets available to households has gone up.  And if the surveys are to be trusted, it also looks as though people who have access to these bednets are actually using them as bednets.  (This is always a question, as bednets can be put to other great uses, such as fishing nets and tablecloths, and I've even seen them staked outside so that large edible crickets can be caught and sold in the marketplace in Northern Thailand.)
A creative use of bednets in Northern Thailand.  To the right there is a pole with a black light and an umbrella tied to the top.  Large crickets are drawn to the light, and the umbrella directs them downwards toward the bednets.  The Karen villagers then collect the insects and sell them for around 1 baht each (about 3 cents U.S.) at the market.

Other positive notes include an increased rate of diagnosis prior to administering antimalarials and the widespread adoption of ACTs (artemisinin combination therapies).  ACTs are used rather than monotherapies in the hope that widespread resistance to artemisinin won't emerge globally.  On a negative note, such resistance already has emerged in Southeast Asia and has been documented in four nations to date.  A real concern is that artemisinin-resistant parasites will find their way to Africa, perhaps following the same routes by which chloroquine-resistant parasites spread globally from Southeast Asia decades ago (presumably migrating through human reservoirs).

A not-so-surprising component of the report concerns where the actual improvements occurred.  Places with comparatively mild endemic malaria were the quickest to improve their situation.  (It is easier to fix a small problem than to fix a really big one.)  However, the greatest improvements in mortality and morbidity appear to have occurred in the places with the worst malaria burden.  This makes some sense: even if money and resources were spread completely evenly across all malaria-ridden nations (and such spatial homogeneity is almost never the case for anything), we would expect exactly this pattern, simply because there are more sick and dying people in the nations where the malaria burden is worst, as the toy example below illustrates.
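A sketch of that arithmetic, with entirely made-up numbers:

```python
# With an equal proportional improvement everywhere, the largest absolute
# drop in deaths occurs where the burden was largest to begin with.
baseline_deaths = {"high-burden country": 100_000, "low-burden country": 2_000}
reduction = 0.20  # the same 20% improvement applied everywhere

for place, deaths in baseline_deaths.items():
    print(f"{place}: {reduction * deaths:,.0f} deaths averted")
```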

But now for some sobering considerations that even the WHO brings up.  Those places where malaria is the worst are also the places with the worst malaria surveillance.  That means that estimates from those places have the widest confidence intervals.  (Which means that the estimates must be taken not only with my critical interpretation but also with more than a few grains of salt.)  Furthermore, it looks like the rates of increase of both funding and the rollout of malaria-related services have reached plateaus.  The overarching implication here is that things have been getting better, and they could continue to get better, but the future is very uncertain.  Things could get worse again.

(I truly hope that things are getting much better and that they will continue to do so.  Certainly in some parts of Southeast Asia where I work things have gotten better.  They've probably gotten worse in other places, and will continue to do so if other socio-political problems don't cease (see my previous blog post on Northern Myanmar).  I tend to be a real pessimist when it comes to malaria eradication.  I'm completely hopeful that it will happen, but I have serious doubts that it will ever happen for most places, or that malaria will stay eradicated in such places as the years go by.  At this point, history supports my pessimism.)

This would probably be a good place to mention the mini academic spat that recently occurred over yearly malaria mortality estimates.  Murray et al. estimated that around 1,238,000 people died from malaria in 2010.  Given historical, long-term trends this wouldn't be that surprising, except that the WHO's annual report from 2011 estimated that things were getting much better and that around 655,000 people had died from malaria in 2010 [1–3].  Several scientists were quite bent out of shape over this discrepancy.  Don't get me wrong, a discrepancy of about half a million deaths isn't trivial.  It just sometimes feels like the 'right' numbers (numbers that serve a purpose) can be more important than numbers that are more accurate scientifically.  I've seen very similar arguments with regard to whether or not we should be talking about artemisinin resistance yet, since the drug is so important and there isn't yet really widespread treatment failure [4, 5].  I'll not tread further into this debate.

But public/global health is part science and part marketing.  It can be a tricky thing to try to convince people to throw money at problems that don't appear to directly affect them.  Numbers like those the WHO is putting out could actually do it (and perhaps have been doing it over the last half decade).  That doesn't mean that I'm saying the WHO's numbers are wrong.  What I am saying is that it is a dramatic story that could pull on the heartstrings of wealthy donors.  I favor science over marketing schemes, but, regardless of the validity of the WHO's findings, I hope the report does inspire people to continue funding global health efforts.  A healthy world will be better for all of us.


References

1. Murray C, Rosenfeld L, Lim S, Andrews K, Foreman K, Haring D, Fullman N, Naghavi M, Lozano R, Lopez A: Global malaria mortality between 1980 and 2010: a systematic analysis. The Lancet 2012, 379:413–431.
2. WHO: Malaria Fact Sheet, 2010.
3. Ye Y, Kyobutungi C, Ogutu B, Villegas L, Diallo D, Tinto H, Oduro A, Sankoh O: Malaria mortality estimates: need for agreeable approach. Tropical Medicine & International Health 2012, 00:10–12.
4. Meshnick S: Perspective: artemisinin-resistant malaria and the wolf. The American Journal of Tropical Medicine and Hygiene 2012, 87:783–784.
5. White NJ: Counter perspective: artemisinin resistance: facts, fears, and fables. The American Journal of Tropical Medicine and Hygiene 2012, 87:785.

Tuesday, January 1, 2013

Wish for the New Year -- not the bad luck of '13'

The new year starts with the bad-luck '13' as part of its identity.  The way this year is starting in the US--with disgustingly divided government, largely about the preservation of deep inequality, and with debt-building promises-to-everyone-about-everything made to get elected--is not exactly rosy.


One thing in the offing seems to be some sort of budget-cutting that relates to science research.  We wonder if there is any hope that the government departments that give out the funds will take a serious look at priorities, rather than allowing vested interests to prevail.  That might take more guts even than Congress has, if you can believe it.

How and where can we cut?
We've got a system of bloated costs, over-kill technology, and incremental, rat-raced industrial research that keeps labs going whether or not there is much promise of real progress for people's lives or even for profound new basic understanding.  Keeping labs in business is an understandable concern, but partly it's a product of the way university administrations have evolved and of the kind of careerism that evolution has built into the system.

Smaller funding, but with more long-term security, and some fair-minded accountability to prevent dead-wood waste, could stimulate more cogent thinking by more people, who would teach more students rather than just writing grants and publications, because more investigators could spend more time at the difficult research tasks.  There would be less pressure to pour out 'results' the way Scott's pours out paper towels, less frenzied proliferation of rapid-publication journals, and so on.  This could engender a system that allows, or even stimulates, innovative basic science more than what we have now.

Meanwhile, for more costly science, and especially for NIH-related research, diversion of funds from the big comprehensive 'omics' bin to focused research on real problems would be a proper response.  One can go agency to agency and see how much nearly useless incremental (but very costly) research is being supported, both intra- and extramurally.

In our area of work, genetics, NIH should focus heavily on disorders that we know are genetic in every sense of the term, when we know the gene involved.  Show that genetic knowledge can actually pay for the research that led to it, by doing something about those disorders.  Otherwise, pull back from mega-genetics and put resources where there is something more relevant to understand.

The nation faces big fish to fry--many major science-related problems--when it comes to research, but our tendency is to put aside big pots of funds for such things, and this just leads all the research livestock to mob the trough, with many of the projects of no real central relevance; and once the funds are committed to the institute, goal, or whatever, a big bureaucracy sees to it that the funds are spent, not to mention that funding is maintained as far into perpetuity as possible.

Instead, we should reward bureaucrats who find ways to spend less and return funds to the central kitty for more properly targeted research.  At the local university level, deans should be rewarded only if they can show that in major ways they have reduced the administrative overload, made their own unit's offices smaller and its activities fewer, and pulled back from their bean-counting, computer-based means of evaluating and advancing faculty--to get away from the overdrive system we have now.

Investigators should be forbidden to have more than some small number of grants (maybe no more than 2?) or amount of funding, so they can spend their time doing actual work rather than just writing grant applications, managing projects, and telling graduate students what to do for their dissertations that will support the investigator's agenda.

Measures of these kinds won't happen, not in the real world of '13', and that means it is hard to predict how or how fairly budget cuts will affect the research establishment.  But we can envision better ways to do things, leading to better ways to live, and to increased probability that we'll see more in the way of substantial new knowledge, and perhaps even some dramatic use of genetic knowledge to relieve people of challenges to daily life that really, truly, are genetic.

Cuts are never fun, not in our society.  But we may face some constraints, and if we think about the responses carefully, maybe 13 won't be so unlucky after all....