
Wednesday, April 12, 2017

Reforming research funding and universities

Any aspect of society needs continual examination to see how it could be improved.  University research, such as that funded by grants from the National Institutes of Health, is one area in need of reform.  It has gradually become an enormous, money-directed, and largely self-serving industry, and its dependence on external grant funding turns science into a factory-like enterprise, undermining what science should be about: advancing knowledge for the benefit of society.

The Trump policy, if there is one, is unclear, as with much of what he says on the spur of the moment. He's threatened to reduce the NIH budget, but he's also said to favor an increase, so it's hard to know whether this represents whims du jour or policy.  But regardless of what comes from on high, it is clear to many of us with experience in the system that health and other science research has become very costly relative to its promise and too largely mechanical rather than inspired.

For these reasons, it is worth considering what reforms could be made--knowing that changing the direction of a dependency behemoth like NIH research funding will have to be slow, because too many people's self-interests will be threatened--if we want research to deliver in a more targeted and cost-efficient way on what researchers promise.  Here's a list of some changes that are long overdue.  In what follows, I include a few FYI asides for readers who are unfamiliar with the issues.

1.  Reduce grant overhead amounts
FYI:  Federal grants come with direct and indirect costs.  Direct costs pay for the research staff, supplies and equipment, travel, data collection, and so on.  Indirect costs ('overhead') are negotiated with each university and are awarded on top of the direct costs--and given to the university administration.  If I get $100,000 on a grant, my university will get $50,000 or more, sometimes even more than $100K.  The universities' claim to this money is that they have to provide the labs, libraries, electricity, water, administrative support, and so on for the project, and that without the project they'd not have these expenses.  Indeed, one indicator of the fat in overhead is that, as an 'incentive' or 'reward', some overhead is returned as extra cash to the investigator who generated it.
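To make the overhead arithmetic concrete, here is a minimal sketch; the 50% rate and the dollar figures are illustrative only, not any university's actual negotiated rate:

```python
def total_award(direct_costs: float, indirect_rate: float) -> float:
    """Total cost to the funder: direct costs plus overhead at the negotiated rate."""
    indirect = direct_costs * indirect_rate
    return direct_costs + indirect

# A $100,000 direct-cost grant at a hypothetical 50% indirect rate
# costs the funder $150,000; the extra $50,000 goes to the university.
print(total_award(100_000, 0.50))
```

At rates above 100%, the administration's share exceeds what the investigator actually gets to spend on the research itself.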

University administrations have notoriously ballooned.  Administrators and their often fancy offices depend on individual grant overhead, which naturally puts intense pressure on faculty members to 'deliver'.  Educational institutions should be lean and efficient.  Universities should pay for their own buildings and libraries and pare back bureaucracy.  Some combination of state support, donations, and block grants could be developed to cover infrastructure, not tied to individual projects or investigators' grants.

2.  No faculty salaries on grants
FYI:  Federal grants, from NIH at least, allow faculty investigators' salaries to be paid from grant funds.  That means that in many health-science universities, the university itself pays only a fraction, often tiny and perhaps sometimes none, of its faculty's salaries.  Faculty without salary-paying grants are paid some fraction of their purported salaries, and often for a limited time only.  And salaries generate overhead, so well-paid faculty suit the institution just fine: higher pay, higher overhead for administrators!  Duh, a no-brainer!

Universities should pay their faculty's salaries from their own resources.  Originally, as I understand it, faculty investigators' salaries were paid on grants so that the university could hire temporary faculty to cover the PI's teaching and administrative obligations while s/he was doing the research.  Otherwise, if they're already paid to do research, what's the need?  Faculty salaries on grants should only be allowed for that purpose, not used simply as a source of cash.  Faculty should not be paid on soft money, because the need to steadily hustle one's own salary is an obvious corrupting force on scientific originality and creativity.

3.  Limit on how much external funding any faculty member or lab could have
There is far too much reward for empire-builders.  Some do, or at least started out doing, really good work, but that's not always the case, and diminishing returns on expanding cost are typical.  One consequence is that new faculty are given reduced teaching and administrative duties so they can (must!) write grant applications.  Research empires are typically too large to be effective, often have absentee PIs off hustling, and are under pressure to keep the factory running.  That understandably generates intense pressure to play it safe (while claiming to be innovative); but good science is not a predictable factory product.

4.  A unified national health database
We need health care reform, and a single national health database would reduce medical costs; it could be anonymized so that research could be done, by any qualified person, without additional grants.  One can question the research value of such huge databases, as is true even of the ad hoc database systems we currently pay for, but they would at least be cost-effective.

5. Temper the growth ethic 
We are over-producing PhDs, and this is largely to satisfy the game of the current faculty by which status is gained by large labs.  There are too many graduate students and post-docs for the long-term job market.  This is taking a heavy personal toll on aspiring scientists.  Meanwhile, there is inertia at the top, where we have been prevented from imposing mandatory retirement ages.  Amicably changing this system will be hard and will require creative thinking; but it won't be as cruel as the system we have now.

6. An end to deceptive publication characteristics  
We routinely see papers listing more authors than there are residents in the NY phone book.  This is pure careerism in our factory-production mode.  As once was the standard, every author should in principle be able to explain his/her paper on short notice--I've heard 15 minutes.  Those who helped on a paper, such as by providing some DNA samples, should be acknowledged, but not listed as authors.  Dividing papers into least-publishable units isn't new, but with the proliferation of journals it's out of hand.  Limiting CV lengths (and not including grants on them) when it comes to promotion and tenure could focus researchers' attention on doing what's really important rather than on chaff-building.  Chairs and Deans would have to recognize this, and move away from safe but gameable bean-counting.

FYI: We've moved towards judging people internally, and sometimes externally in grant applications, on the quantity of their publications rather than the quality, or on supposedly 'objective' (computer-tallied) citation counts.  This is play-it-safe bureaucracy, and it obviously encourages CV padding, which is reinforced by the proliferation of for-profit publishing.  Of course some people are highly successful both in the real scientific sense of making a major discovery and in publishing their work.  But it is naive not to realize that many, often the big players grant-wise, manipulate any counting-based system.  For example, they can cite their own work in ways that inflate the 'citation count' that Deans see.  Papers with very many authors also lead to credit-claiming that is highly exaggerated relative to the actual scientific contribution.  Scientists quickly learn how to manipulate such 'objective' evaluation systems.

7.  No more too-big-and-too-long-to-kill projects
The Manhattan Project and many others taught us that if we propose huge, open-ended projects we can have funding for life.  That's what the 'omics era and other epidemiological projects reflect today.  But projects that are so big they become politically invulnerable rarely continue to deliver the goods.  Of course, the PIs, the founders and subsequent generations, naturally cry that stopping their important project after having invested so much money will be wasteful!  But it's not as wasteful as continuing to invest in diminishing returns.  Project duration should be limited and known to all from the beginning.

8.  A re-recognition that science addressing focal questions is the best science
Really good science is risky because serious new findings can't be ordered up like hamburgers at McD's.  We have to allow scientists to try things.  Most ideas won't go anywhere.  But we don't have to allow open-ended 'projects' to scale up interminably as has been the case in the 'Big Data' era, where despite often-forced claims and PR spin, most of those projects don't go very far, either, though by their size alone they generate a blizzard of results. 

9. Stopping rules need to be in place  
For many multi-year or large-scale projects, an honest assessment part-way through would show that the original question or hypothesis was wrong or won't be answered.  Such a project (and its funds) should have to be ended when it is clear that its promise will not be met.  It should be a credit to an investigator who acknowledges that an idea just isn't working out, and those who don't should be barred for some years from further federal funding.  This is not a radical new idea: there is precedent in drug trials, which are stopped when interim results show they will fail, and we should do the same in research.

It should be routine for universities to provide continuity funding for productive investigators so they don't have to cling to go-nowhere projects.  Faculty investigators should always have an operating budget so that they can do research without an active external grant.  Right now, they have to piggy-back their next idea on funds from their current grant, and without internal continuity funding this naturally leads to safe, 'fundable' projects rather than really innovative ones.  The reality is that truly innovative projects typically are not funded, because it's easy for grant review panels to find fault and move on to safer proposals.

10. Research funding should not be a university welfare program
Universities are important to society and need support.  Universities, like scientists, become entrenched; it's natural.  But society deserves something for its funding generosity, and one of the facts of funding life should be that funds move.  Scientists shouldn't have a lock on funding any more than anybody else, and universities should be structured so they are not addicted to external grant funding.  Will this threaten jobs?  Most people in society have to deal with that, and scientists are generally very skilled people; if one area of research shrinks, others will expand.

11.  Rein in costly science publishing
Science publishing has become what one might call a greedy racket.  There are far too many journals, rushing out half-reviewed papers for pay-as-you-go authors.  Publication fees are typically paid from grant budgets (though one can ask how often young investigators shell out their own money to keep their careers alive).  Profiteering journals are proliferating to serve the CV-padding, hyper-hasty, bean-counting science industry we have established.  Yet the vast majority of papers have essentially no impact.  That money should go to actual research.

12.  Other ways to trim budgets without harming the science 
Budgets could be trimmed in many other ways, too:  no buying journal subscriptions on a grant (universities have subscriptions), less travel to meetings (we have Skype and Hangouts!), shared costly equipment rather than a sequencer in every lab.  Grants should be smaller but of longer duration, so investigators can spend their time on research rather than hustling for new grants.  Junk the use of 'impact' factors and other bean-counting ways of judging faculty.  Such measures had a point once--to reduce discrimination and be more objective--but they have long been strategized and manipulated, substituting quantity for quality.  Better means of evaluation are needed.

These suggestions are perhaps rather radical, but to the extent that they can somehow be implemented, it would have to be done humanely.  After all, people playing the game today are only doing what they were taught they must do.  Real reform is hard because science is now an entrenched part of society.  Nonetheless, a fair-minded (but determined!) phase-out of the abuses that have gradually developed would be good for science, and hence for the society that pays for it.

***NOTES:  As this was being edited, New York state has apparently just made its universities tuition-free for those whose families are not wealthy.  If true, what a step back towards sanity and the public good!  The more states can get off the hooks of grants and strings-attached private donations, the more independent they should be able to be.

Also, the April 12 Wall Street Journal has a story (paywalled, unless you search for it on Twitter) showing the faults of an over-stressed health research system, including some of the points made here.  The article points out problems of non-replicability and other technical mistakes that are characteristic of our heavily over-burdened system.  But it doesn't go after the System as such: the bureaucracy, the wastefulness, the pressure for 'big data' studies rather than focused research, and the need to be hasty and 'productive' in order to survive.

Sunday, June 28, 2009

Can the grant system be overhauled?

Gina Kolata writes in today's New York Times that $105 billion has been spent on cancer research since 1971, when Richard Nixon declared 'war on cancer', but, measuring progress in death rates, we're not a whole lot better off now than we were then. Indeed, cancer is good business, and cancer specialty clinics are opening or expanding all over the country (and advertising how good they are, to promote sales). Cancer treatments are long and costly to our health care system, so this is a serious problem, both economically and for anyone who has cancer.

Ms Kolata correctly attributes much of this to the conservative nature of the grant system. Researchers don't even apply for money for new ideas because they know the system doesn't reward innovation, so what does get funded is research that doesn't even attempt to go beyond incremental progress. And researchers' careers, prestige, even salaries depend on grants.

Kolata's article is specifically about cancer, but the conservative nature of the grant system is true for all fields. It's partly because 'peer review'--the judging of grants by people who do similar work--keeps it that way. People can only evaluate what they already know. And it's partly because the system demands that the researcher demonstrate that the work can be done, which requires pilot data. And as with any large establishment, it learns how to protect and perpetuate its own interests.

It is not easy to say what to do about it. What kind of accountability for grant recipients would be appropriate? The research questions being asked are tough, so 'cures' cannot be promised in advance, and the more basic the research, the less clear the criteria for success can be. The idea of accountability is that if your research is paid for by a health institute, it should make notable contributions to health, not just to journal articles or the researcher's career. A candid observer could quickly eliminate a high fraction of grant applications on the grounds that, even if successful as promised, their contribution would be very minor, as Kolata illustrates. Perhaps there should be a penalty for making promises that aren't kept--at least, that could help make the system more honest.

Limits on the size or length of projects, or on an investigator's total grants, would help spread funds around. But what about the role, sometimes legitimate and sometimes mainly based on a love of technology, of very expensive equipment and approaches? Is there a way to identify less technically flashy, but perhaps more efficacious, work? It's easy to see that this can matter: lifestyle changes could prevent vastly more cancer than, say, identifying genetic susceptibility, yet we spend far more money on cancer genetics research than on environmental change.

Speaking of lifestyles, one cannot order up innovations the way one can order a burger with fries. Might there be 'meta' approaches that would increase the odds that someone, somewhere will make a key finding or have a penetrating idea? Would that more likely come from someone in a big lab working on long-term projects, or someone in a small lab working in relative obscurity?

Or is it OK to perpetuate the system, assuming good will come of it here and there, meanwhile a lot of people are employed to manage big labs, run the experiments, collect data, make machinery and lab equipment, and sweep the floors in large lab buildings?

These reflections apply to much that is happening in the life (and other) sciences today. They drive the system in particular directions, including fads and technologically rather than conceptually based approaches, and in that sense some things are studied while other approaches may not be considered (or funded because they're out of the mainstream). An example relevant to our blog and work is the way that genetic determinism and, more broadly, a genome-centered focus, drives so much of life and health sciences. By no means irrelevant or all bad! But it is a gravitational force that pulls resources away from other areas that might be equally important.

Clearly major findings are generated by this way of doing science, even if false promises go unsanctioned (indeed, those making them usually continue to do that and continue to be funded with major grants). Life sciences certainly do increase our knowledge, in many clearly important ways. Yet disease rates are not dropping in proportionate ways relative to grandiose promises.

Is there a solution to all this? Could the system be dramatically overhauled, with, say, research money being parceled out equally to anyone a university has deemed worthy of employment? Could the peer review system be changed, so that some non-experts are on review panels, ensuring that the system doesn't simply perpetuate the insider network? Or would they not know enough to act independently? Universities encourage and reward grant success not because it allows important work to be done by their brilliant professors but because it brings prestige, score-counting, and 'overhead' money to campus--can university dependence on overhead or faculty's on salary be lessened? Is there a way to encourage and reward innovation?

Saturday, April 4, 2009

Credible research

Marion Nestle, Professor of Nutrition and Food Studies at NYU, was on campus last week to speak, sponsored by the Penn State Rock Ethics Institute. Nestle is the author of a number of popular books about the politics of food, and an outspoken critic of the influence of the food industry on how and what we eat, and thus, on the health of the American population. She's particularly concerned with obesity in children and the role of advertising in promoting the consumption of excess calories, even in children as young as two. She believes that any money researchers take from the food industry is tainted money. Her point is that it's impossible for a scientist to do unbiased research, however well-intentioned, if the money comes from a funder that stands to gain from the findings. Indeed, it has been found that results are significantly more likely to favor the funder when research is paid for by industry.

The same can be, and has been, said about the pharmaceutical industry and drug research, of course, and, though we don't know the particulars, it has to be equally true of chemistry or rehab or finance or fashion design. But, as we hope our posts about lobbying last week make clear, the problem of potentially tainted research doesn't start and stop with the involvement of money from industry. Research done with public money can be just as indebted to vested interests, its credibility just as questionable. It can be somewhat different because researchers tend not to feel indebted to the actual source of the money -- the taxpayer -- but research done on the public dollar can be just as likely to confirm the idea or approach the funding agency supports.

Even when money isn't the motivation, there are many reasons that research might not be free from bias -- the rush to publish, the desire to be promoted or get a pay raise, commitment to given results, prior assumptions, unwillingness to be shown wrong. Many prominent journals won't publish negative results, and of course journals and the media like to tout if not exaggerate positive findings. There is pressure to make positive findings -- and quickly -- to use to get one's next grant (and salary). This is one reason it is commonly said that one applies for funds to do what's already been done. This makes science very conservative and incremental when careers literally depend on the march of funding, no matter what its source.

Besides the pressure to conform and play it safe, there is a subtler problem: such bias doesn't necessarily make the science wrong, but it does make it more difficult to know how or where it's most accurate and worthy. And it can stifle innovative, truly creative thinking. Some of the most important results are likely to be negative results, because they can tell us what isn't true or important, and guide us to what is. But that isn't necessarily what sponsors, especially corporate sponsors, want, and it isn't what journals are likely to publish.

So, while it's essential, as Marion Nestle and others consistently point out, to eliminate the taint of vested interest from research, it's impossible to rid research of all possible sources of bias. And the reality is, at least for now, that only those most secure in their jobs can speak out about these issues (as Nestle said, she has tenure and doesn't need money to do her work, so she can say anything she wants) -- and they do not have the leverage to change the biases built into our bottom-line, market- and career-driven system.