Thursday, October 4, 2018

Processed meat? Really? How to process epidemiological news

So this week's Big Story in health is that processed meat is a risk for breast cancer.  A study has been published that finds it, so it must be true, right?  After all, it's on CNN and in some research report.  Well, read even CNN's headline story and you'll see the caveats, the admissions, softened of course: the excess risk isn't that great, and the past studies have been 'inconsistent'.
Yummy poison!!  (source: from the web)
Of course, with this sort of 'research' the weak associations with some named risk factors can easily be confounded with who knows how many other behavioral or environmental factors, and even if researchers tried to winnow them out, it is obviously a guessing game.  Too many aspects of our lives are unreported, unknown, or correlated.  This is why, week after week it seems, do-this or don't-do-that stories hit the headlines.  If you believe them, well, I guess you should stop eating bacon...until next week, when some story will say that bacon prevents some disease or other.

Why breast cancer, by the way?  Why not intestinal or many other cancers?  Why, if even the current story refers to past results as 'inconsistent', do we assume this one's right and they, or some of them, were wrong?  Could it be that this is because investigators want attention, journalists need news stories, and so on?

Why, by the way, is it always things that are actually pleasurable to eat that end up in these stories?  Why is it never cauliflower, or rhubarb, or squash?  Why coffee and not hibiscus tea?  Could western notions of sin have anything to do with the design of the studies themselves?

But what about, say, protective effects?
Of course, the headlines are always about the nasty diseases to which anything fun, like a juicy bacon sandwich, not to mention alcohol, coffee, cookies, and so on, seems to condemn us.  This makes for 'news', even if the past studies have been 'inconsistent' and therefore (it seems) we can believe this new one.

However, maybe eating bacon sandwiches has beneficial effects that don't make the headlines.  Maybe they protect us from hives, antisocial or even criminal behavior, or toothaches, or maybe they raise our IQ.  Who could look for all those things, when they're busy trying to find bad things that bacon sandwiches cause?  Have investigators of this sort of behavioral exposure asked whether bacon and, say, beer raise job performance, add to longevity, or (heavens!) improve one's sex life?  Are these studies, essentially, about bad outcomes from things we enjoy?  Is that, in fact, a subtle, indirect effect of the Protestant ethic or something like that?  Of the urge to find bad things in these studies because they're paid for by NIH and done by people in medical schools?

The serious question
There are pragmatic, self-interested aspects to these stories, and indeed even to the publication of the papers in proper journals.  If the papers disagree with previous work on the purportedly same subject, they get new headlines, when perhaps they should not be published without explicitly addressing the disagreement in real detail, as the main point of the work--rather than with the subtle implication that now, finally, these new authors have got it right.  Or at least, the authors should not headline their findings.  Or something!

Instead, news sells, and thus we build a legacy of yes/yes/no/maybe/no/yes! studies.  These may generally be ignored by our baconophilic society, or they could make lots of people switch to spinach sandwiches, or many other kinds of effects.  This latter is somewhat akin to the quantum mechanical notion that measurement gives only incomplete information but affects what's being measured.

Epidemiological studies of this sort have been funded, at large expense, for decades now, and if there is anything consistent about them, it's that they are not consistent.  There must be a reason!  Is it really that the previous studies weren't well done?  Is it that if you fish for enough items, you'll catch something--big questionnaire studies looking at too many things?  Is it changing behaviors in ways not being identified by the studies?
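The 'fishing' point can be made concrete with a small simulation (a hypothetical sketch, not any actual study's data; all names and numbers here are illustrative): if a questionnaire records many exposures that in truth have no effect whatsoever on an outcome, then testing each one at the conventional p < 0.05 threshold will still reliably 'catch' spurious associations, at a rate of roughly one in twenty.

```python
import math
import random

def two_sided_p(z):
    """Two-sided p-value for a z statistic, via the normal CDF (math.erf)."""
    return 2.0 * (1.0 - 0.5 * (1.0 + math.erf(abs(z) / math.sqrt(2.0))))

def count_spurious_hits(n_exposures=1000, n_subjects=500, alpha=0.05, seed=1):
    """Simulate a questionnaire study in which NO exposure truly affects
    the outcome, and count how many reach 'significance' anyway."""
    rng = random.Random(seed)
    # Outcome: independent coin flips, unrelated to anything we measure.
    outcome = [rng.random() < 0.5 for _ in range(n_subjects)]
    hits = 0
    for _ in range(n_exposures):
        # Each exposure is also an independent coin flip per subject.
        exposed = [rng.random() < 0.5 for _ in range(n_subjects)]
        n1 = sum(exposed)
        n0 = n_subjects - n1
        if n1 == 0 or n0 == 0:
            continue
        # Outcome rate among exposed vs unexposed subjects.
        p1 = sum(o for o, e in zip(outcome, exposed) if e) / n1
        p0 = sum(o for o, e in zip(outcome, exposed) if not e) / n0
        pooled = (p1 * n1 + p0 * n0) / n_subjects
        se = math.sqrt(pooled * (1 - pooled) * (1 / n1 + 1 / n0))
        if se == 0:
            continue
        # Standard two-proportion z-test.
        if two_sided_p((p1 - p0) / se) < alpha:
            hits += 1
    return hits

n = 1000
hits = count_spurious_hits(n_exposures=n)
print(f"{hits} of {n} truly null exposures look 'significant' at p < 0.05")
```

With 1000 null exposures one expects on the order of 50 false 'discoveries' at the 0.05 level, and a different seed (a different study population, as it were) would flag a different set of them--which is one mundane mechanism behind yo-yo results.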

Or, perchance, is it that these investigators need projects to get funded?  This sort of yo-yo result is very, very common.  There must be some explanation, and that inconsistency itself is likely as fundamental and important as any given study's findings.  Maybe bacon-burgers are only bad for you in some cultural environments, and these change in unmeasured ways, so that the varying results are not 'inconsistent' at all--maybe the problem is the expectation that there's a single relevant truth, so that inconsistency gets read as flaws in study design.  Maybe the problem is in simplistic thinking about risks.

Where do cynical possibilities meet serious epistemological ones, and how do we tell?


marMALAGA said...

The problem is that correlation without causation is taken as enough for making "science".

Ellen said...

"Maybe they protect us from hives"

Be right back, gotta go to the store. 😉

Ken Weiss said...

The problem with _your_ problem is that scientists don't realize that. And, to me, we are applying modes of inference borrowed from physics and chemistry, where things are much more truly replicable than they are in life, but that's too much to go into here.