This is our second post on the question of whether or when knowledge is good, and when or whether some kinds of knowledge should be legitimately suppressed. In the past, we've posted our view that if some people choose not to believe in evolution, but cling to religious explanations of life, that is to be resisted.
Should everyone be forced to recognize what science shows are the realities of life--that, for example, regardless of what we don't yet know about the details of the process, it seems unambiguously clear that life has evolved from common origins on earth? If religious people are taxpayers, and their reality is belief-based rather than empirical, should they have a right to keep their money from being used to disseminate ideas that they feel are sinful? Or, just philosophically, is life really better when based on what we today consider true knowledge, rather than on other kinds of stories that may be less empirical but more comforting?
In medical circles, it has seemed unambiguous that knowledge is always good--a lifesaver, or at least a life-improver. But several examples have shown that this is not so clear. To know or not to know is a more legitimate question than is widely perceived.
This post is occasioned by another story, perhaps the most definitive to date, about whether PSA (prostate specific antigen) screening is worth doing to detect early prostate cancer in men. The US Preventive Services Task Force now recommends that healthy men not have PSA tests at all. The same body recommended in 2009 that women undergo far less mammographic screening for breast cancer, and the story is similar to challenges to the net value or risk of vaccines, which we've blogged about before.
The new story is a summary analysis of the literature by a government panel, which argues that PSA tests are not definitive. Various conditions that are totally benign can lead to high PSA levels, while tumors may or may not be detectable. High levels lead to further testing, some of which is quite invasive and has negative health or lifestyle consequences (impotence, incontinence). And a high fraction of prostate tumors are slow growing, regress on their own, or arise in men old enough that they are likely to die of something else before the cancer would become dangerous.
On the whole, the argument is that not screening leads to fewer negative health consequences than screening does. While some tumors that would have become serious may be detected by screening, the cost of not having that knowledge is less than the cost of having the 'knowledge' the PSA test provides. This is interesting, because the test itself (PSA level in the blood) is basically true knowledge--that is, except for occasional lab mess-ups, the level is being measured correctly.
But the problem is that this is not, by itself, 'knowledge' about prostate disease, much less cancer. The correlation between PSA levels and dangerous cancer is far too indirect. So one response is to keep testing and do much more intensive studies of the course of the disease, perhaps by intentionally not treating some men with high PSAs while treating others. But we already know that treatment leads to substantial negative life-experience. So is that a correct approach?
Instead, one might dump PSA testing (except where someone is known to be at high risk, for example because of a susceptible genotype), and say that we need some other approach to screen for or treat prostate cancer. Or maybe it is actually 'knowledge' that most tumors are slow-growing and will not kill. Maybe not knowing, by forgoing PSA testing, is a better kind of 'knowing': knowing that, on the whole, not testing leads to lower risk.
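To make the indirectness concrete, here is a minimal back-of-the-envelope sketch. The numbers (prevalence of dangerous cancer among screened men, test sensitivity, false-positive rate) are purely hypothetical illustrations, not figures from the task force's report; the point is only that a marker can be measured perfectly accurately and still say little about dangerous disease.

```python
# Hypothetical, illustrative numbers -- not the task force's actual estimates.
# The point: even a correctly measured marker can have low predictive value
# when its link to dangerous disease is indirect.

prevalence = 0.03      # assumed fraction of screened men with a cancer that would ever harm them
sensitivity = 0.80     # assumed chance of a 'high' PSA given such a cancer
false_positive = 0.15  # assumed chance of a 'high' PSA from benign causes (e.g., enlarged prostate)

# Bayes' rule: P(dangerous cancer | high PSA)
p_high = sensitivity * prevalence + false_positive * (1 - prevalence)
ppv = sensitivity * prevalence / p_high

print(f"Probability a 'high' PSA reflects a dangerous cancer: {ppv:.1%}")
# With these made-up numbers, roughly 14% -- most positive results would send
# men toward biopsy and treatment for a disease that would never have harmed them.
```

With made-up but not implausible numbers, only about one in seven 'positive' results reflects a cancer that would ever have mattered; the rest set off biopsies and treatments, with their very real side effects, in men the disease was never going to harm.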
In science, we are acculturated not to ignore facts. Yet, as we frequently blog about, we often do ignore facts when they are inconvenient (as in GWAS and other similar mega-approaches to complex disease), imposing a kind of ignorance upon ourselves, in the name of science! There are many other ways in which science is ostrich-like, whether for this reason or because of politics (as in the covering up of tobacco risks in past decades).
Robert Proctor has coined the term 'agnotology' to refer to ways in which science or society intentionally or otherwise remains un-knowing (agnostic).
It is far from clear that science tells us how the world 'really' is, rather than how it appears through our best current approaches and value systems. Other worldviews satisfy some human needs much as science does. And history shows that much of today's scientific 'truth' will later be shown to be wrong, often fundamentally so.
In the balancing of risks, as with PSA testing, we see other ways in which intentionally choosing not to know is a meaningful reflection of what we do know. That itself is a kind of knowledge. Subtle issues pervade even science's ideology that knowledge is always good.