
Friday, June 27, 2014

Who has all the answers?

We write about a lot of things on Mermaid's Tale.  All of us who contribute to this blog try to be thought-provoking in interesting and hopefully non-standard ways.  We don't have all the answers, and indeed we tend to write about things for which nobody has the answers.

We try to see the various (often more than the proverbial two) sides or facets of important problems in genetics, epidemiology, and evolution--and their philosophical and societal aspects.  What is clear is that many aspects of these areas, not least being the search for, or even the notion of, causation, are elusive, and that in many ways our science in these areas has only the crudest forms of theoretical understanding.  Or, perhaps, we need some theory that differs from that of the physical sciences, in which the same things--electrons, oxygen molecules, gravitational attractions--are either completely replicable or deterministic.

In this context, we do use our modest forum to criticize the widespread practice of claiming to have answers, and to examine why such claims are usually, like reports of Mark Twain's death, greatly exaggerated.  Our society tends to reward those who claim and proclaim their work, and we try to temper that.

We try to be polite in what we write, and hope we don't go over the Snark line too often.  Sometimes a nerve (or an ideology) will be hit, and in our experience the response can be vehement, to say the least, because our society currently doesn't give much respect to decorum.  The uninhibited nature of the web sometimes brings out the worst in people.  Our society, our academic institutions, the selfish aspects of capitalism, an adversarial advertising culture, competitive careerism, and our built-in systems of advocacy too often don't lend themselves to better discussions.  In our lifetime public discussion certainly has become less self-restrained, though perhaps every generation thinks the past was better than the present.

Nonetheless, there is far more that we (people in general, and scientists in particular) do not know than we care to acknowledge, and this leads to competition for attention and resources that we think should be resisted.  Criticism, if properly placed, can at least hope to nudge things in a better direction.  Acceptance is a form of acquiescence.

Unfortunately, often neither we nor other critics have magic answers, and "tell us what to do instead" is the retort of last resort from those who agree with the problems but want, conveniently, to carry on until someone instructs them to do otherwise--until someone provides some new fad by which grants can be sought, and so on.  That's not how science should be done.  If there are problems, and they are generally acknowledged, those who agree with the issues, and who are working in the area, should pause and try themselves to figure out truly better ways.  Isn't that supposed to be the job of science?  Unfortunately, doing that involves thinking, and the demands of writing grant applications for doing more of the same don't always leave time for that.

In addition, we think that searching for strange or perplexing results, even amidst claims of understanding, is a way we can try to stimulate more creative work, to the extent that anyone's listening.

In particular, it is young people who want a career in science, philosophy, or other thinking professions who need to take the baton and run with it.  That's the only way that change will happen.  We think this starts with asking "If the standard explanations aren't really right, what might be right instead?  If what is alleged to be true is not true, could the opposite be true in some way instead?"

In the often notoriously stifling institutional environment of universities, every stimulus to creative thought is, we think, worth the effort.  Even if we ourselves personally never know any answers to anything.  If young people don't try to cut through the spider web of institutional inertia, they will be the spider's next meal--a 50-year, career-long meal.

Tuesday, March 4, 2014

Lucretius, and stories about the Nature of Things

I never took courses in Classics, so my knowledge of the major figures and their works comes only occasionally from my own informal reading, and mainly from secondary summaries or hearsay.

But, after having read of and about it for many years, I decided I should actually read Lucretius' (c. 99-55 BC) long poem entitled De Rerum Natura (On the Nature of Things).  So many times I've heard that Lucretius anticipated modern science in sometimes rather eerily prescient ways, and I felt I should see for myself.  I can't read Latin, so I have read an English prose translation, hoping at least to see the content, even if unable to appreciate the artistry.


The deep truths?  Source: crusader.bac.edu

Lucretius
Lucretius was a Roman, nominally writing to persuade a friend of his, but he was basically expounding the views of the Greek Epicurean school of philosophy, founded by Epicurus, who lived a couple of hundred years earlier and drew on the still earlier atomism of Democritus.  Among other things, the Epicureans argued persuasively that there were no gods meddling in human affairs and that human existence is matter-to-matter, dust-to-dust: enjoy life while you have it, because that's all there is.  You will decay and your material will return to the cosmos from whence it came.  Nothingness cannot generate material, and material is eternal.  Your mind (or spirit, or whatever term you use) is part of your body and disappears with your body, into the cosmos, to be re-used in the future, just as it had come together to form you when you were conceived.

There exists matter, they said, made of atoms ('a-tom': that which cannot be cut), which are the smallest fundamental particles, and 'void' (empty space).  Atoms move randomly through space and collide when they meet, at which time they can stick together and form larger constructs, which today we would call 'molecules'.  Aggregates of these compounds make up the stuff of the world.  Matter can separate and rejoin, but it is neither created out of nothing nor destroyed, nor is space.  To allow for our apparent free will, the Epicureans said that atoms could sometimes 'swerve' from their normally determined paths.

The atomic theory of the Epicureans sounds quite like modern molecular and atomic physics and cosmology (though it is true that modern physics does seem to allow strange things like the creation of matter and energy from cosmic vacuum, and perhaps multiple universes, and so on).   Thus, the ideas Lucretius described can seem presciently scientific, two thousand years before their time.  I have read such characterizations frequently.   But there are some interesting points here, that have to do with how ideas are anchored in reality, and with selective reporting.

For one thing, if you read the rest of Lucretius, you'll find stories of the origins of things in an earth-centered universe, including anthropological tales explaining the origin of humans and their cultural evolution--how we started out crude and beast-like, then discovered weapons, clothing, government, language and song, agriculture, and the domestication of animals.  He also used his theory to explain the nature of lightning, earthquakes, volcanoes, weather, geology, gravity, why the Nile floods, and the nature of magnetism.  He explained the workings of our senses, like vision, touch, and taste, in atomic terms--accounting, for example, for the emanations from the atoms on the surface of our bodies that enable us to see 'ourselves' in mirrors.  He raised developmental arguments to show that chimeric beasts, like Centaurs, cannot be real.  He delved into racial variation and why different populations are subject to different diseases.  And he went into the clinical nature and epidemiology of plagues.

A main aim of Lucretius was to purge people of superstition.  He fervently wanted to dismantle anything other than a pure materialism, even in explaining the origin of moral aspects of society.    In this sense, too, he is favorably cited for his 'prescient' materialistic atomic theory of everything.

In the major sections of De Rerum, however, the apparent prescience wears thinner and thinner, and any idea that he foreshadowed modern science dissolves.  Basically, the Epicureans were applying their notion of common-sense reasoning based on very general observations.  They strung out one categorical assertion after another about what 'must' be the case.  In today's parlance, they were providing hand-waving 'explanations' ('accounts' would be a better term) that seemed consistent but did not require any sort of rigorous means of establishing truth.

Along comes the Enlightenment
Aristotle, Plato, and others of the Greek philosophers reinforced the idea that reasoning itself was enough to generate understanding of the world.  We are basically built, they said, to see and understand truth.  Such a view of knowledge lasted until about 400 years ago, the period called the Enlightenment (in Europe), the time of Francis Bacon, Descartes, Galileo, Newton, and many others.  Those authors asserted that, to the contrary, to understand Nature one had to make systematic observations, and develop proper, formal, systematic reasoning to fit hypotheses to those observations, to develop general theory or laws of the world.  Out of this was born the 'scientific method' and the idea that truth was to be understood by empiricism and actual testing of ideas, not just story-telling--and no mysticism.

Reading Lucretius makes one realize, first, that even if a story like the Epicureans' atomic theory has aspects we'd regard today as truth, it was to them basically a sort of guessing.  Secondly, just because a story is plausible does not give it a necessary connection to truth, no matter how consistent the story may seem.  We now have actual 'scientific' theories to account for--indeed, to explain--phenomena such as earthquakes, weather, volcanoes, the nature of metals and water, the diversity of life, a great deal of biology, and even culture history.  If you think of how we know these things, even with major gaps in that knowledge, you can see how very powerful and correct (or at least much more accurate) a systematic approach to knowledge can be, when the subject is amenable to such study.

It is a great credit to centuries of insightful, diligent scientists, our forebears, whose legacy has brought us to this point.  It is a wonderful gift from them to our own time.

Advances in technology and methods may be making some Enlightenment concepts obsolete, and we continually find new ways of knowing that go ever farther beyond our own personal biological senses.  For those of us in science, reading the likes of Lucretius is an occasion to compare then and now, to see why just being intelligent and able to construct consistent explanations is not enough, and that for many areas we do now have ways to gain knowledge that has a firmer footing in reality--not just plausibility.


But....
That's all to the good, but if you do a more measured reading of Lucretius, you can see that in many ways we haven't come all that far.  We do a lot of cherry-picking of things in Lucretius that sound similar to today's ideas and thus seem particularly insightful.  But it is not clear that they were more than a mix of subjective insight and, mainly, good guesses--after all, there were competing theories of the nature of Nature even at the time.  And other areas of Epicurean thought, well, are just not mentioned by those remarking on their apparent modernity.  Selective citation gives an impression of deep insight.  Most of De Rerum Natura was simply story-telling.

In many areas of science, perhaps even some aspects of fundamental physics and cosmology, but particularly in the social and even aspects of the evolutionary sciences, we still make careers based on plausibility story-telling.  Our use of mathematics or statistical methods--random surveys, questionnaires, arguing by analogy, and so on--and massive data collection gives the same sort of patina of professional wisdom that one can see in the rhetoric of Lucretius.

We tell our stories with confidence, assertions of what 'must' be so, or what is 'obvious'.  Often, those interested in behavior and psychology are committed to purging religious mysticism by showing that behavior may seem immaterial but that this is an illusion, offering purely material evolutionary and genetic explanations instead.  No 'free will'!  The world is only a physical reality.  The role of natural selection and competition in explaining even morality as a material phenomenon is part of this, because Darwin provided a global (may one say 'Epicurean'?) material framework for it.  Evolutionary stories are routinely reported to the public in that way as well.  Even if some caveats or doubts are included here and there, they are often buried by the headlines--and the same can be found in Lucretius, over two thousand years ago.

Explanations of physical and behavioral variation and its evolutionary causes, along with many 'adaptive' stories making forcefully asserted plausibility arguments about what evolved 'for' what, still abound.  They are not just told on television--we can't really blame Disney and Discover for appealing to their audiences, because they are businesses--but the same stories are in the science journals and science reportage as well.  We see tales every day reporting miraculous discoveries about genetic causation, for example.  It is sobering to see that, in areas where we don't have a really effective methodology or theoretical basis, we are in roughly the same sandals as our ancient predecessors.

Cherries; Wikipedia
Intoxicated by the many genuinely dramatic discoveries of modern, systematic science, we do our own cherry-picking, and tend to suspend critical judgment where findings are less secure, dressing our explanations in sophisticated technological hand-waving.

When we don't have actual good explanations, we make up good-sounding stories, just as our forebears did, and they're often widely accepted today--just as they were then.

Tuesday, September 3, 2013

Let's use evidence (not intuition, semantics, politics, or dogma) to navigate 'belief,' 'knowledge,' and 'science.'



Recently, Adam Blankenbicker asked me to contribute my thoughts for a post he was preparing on "believing in science." Here it is.

This is an important discussion for many reasons. And I have lots of opinions. Sometimes they're so strong I get to read them on NPR! Sometimes they're so strong that my teeth squeak when I hear a teacher quoted as saying, "I’m here to show you the evidence. If you want to believe the evidence when we’re done, that’s up to you." This kinda kills me just a tiny bit and I have to remind myself that quotes like these are plucked out of a much richer context that's omitted entirely.

But I actually hesitated on whether to respond to Adam's invitation to comment for his post, because I'm writing a book right now that turns out to be hugely relevant to this knowledge/science/belief issue and (this is the kicker) it's relevant only because of the immense evidence-based, scientific journey that led me up to this issue.

In other words, I hesitated to respond to Adam's email because I am uncomfortable with describing my present state-of-mind on this issue without first leading a person through the steps it took to get me there. The evidence! And those steps are about 60,000 words high and counting...

Regardless, I couldn't resist writing back to him. I knew mine probably wouldn't be the answer that he or at least most of his readers would warm to, but I just had to attempt to get across my discovery (yes, that's what it feels like!) that belief and knowledge aren't so distinct (or maybe aren't distinct period). I saw my response as sort of like a little test to see how it would float out there...

And it kinda sunk.

Let me show you...

Here's just the meat of the email with what I was asked to respond to:
If you have a few minutes, could you provide me some of your thoughts?
Why shouldn't I say "I believe in science"? What should I say instead to express the idea that I accept science? As a process or just a "thing".
Here's my klunky response (given some new punctuation for clarity here):
I'll answer your question as if you were told by someone else (not me) that "you shouldn't say 'I believe in science'" and that after they told you that, you came to me for help in understanding why they said that.

...Maybe because science isn't an entity, it's a perspective. It's also a process that's part of that perspective to arrive at knowledge that fits into that perspective. My This I Believe essay is about the difference between "believing in" something and "believing" something. And all I'd have to say about that is in that essay already.

Not all science-minded folks liked my essay because many think that "to believe" is different than "to know" because "knowledge" to many is based on facts and "belief" is not, so the verbs knowing and believing are therefore different. I don't agree. Even if some things can be distinguished as belief vs. knowledge, the possession of those things is believing/knowing for both the wrong (or completely evidence free) beliefs and the beliefs based on facts. Both can be just as real for the person who holds them so what's the difference? And when you think about all the "knowledge" that's passe and that's been overturned during the history of science, and when you do some serious reading about history and cross-cultural beliefs and knowledge, it's easier and easier to accept that these distinctions we make as scientists are cultural just like any other tribe's when they're describing their own system versus another.
Here's how my response was presented in the post:
I reached out to Holly and she told me that there were a number of “science-minded” individuals who did not agree with her essay. They “think that ‘to believe’ is different than ‘to know’ because ‘knowledge’ to many is based on facts and ‘belief’ is not, so the verbs knowing and believing are therefore different.” Where I agree with this perspective, Holly disagrees. But she goes on to say that just having the belief or knowledge is fine, not matter what word is used.
The delicate issues I tried to briefly convey are not included in my quote. Those parts that he says he disagrees with (the parts I italicized in my email up there) are not included and are poorly paraphrased.

Basically, my quote is plucked out of a much richer context that's omitted entirely! 

Am I crazy for posting this? Probably a little. But it's an issue I care very very deeply about...and more now than ever with this journey that I've taken while writing my book. I may be too sensitive, but I've had my mind blown by evidence this summer and it's led me to see knowing/believing and knowledge/science/belief in new ways. As my mind is all exploded right now, I'm not exactly composed about these things--not that I was ever very composed about much in the first place.

Wednesday, March 6, 2013

Ignoring our ignorance

Seeing more
Two pieces, each interesting in its own right, herein collide.  Here's a video from the NYT about an enhanced technique that allows the viewer to detect motion that is invisible to the naked eye -- or camera.



This new technology potentially has clinical usefulness, according to the developers, and if so, that's a good thing.  But it's also interesting as a stark reminder of how much we can't and don't see around us.  Granted, this technology primarily makes visible things we can detect with more familiar (and cheaper) techniques (like stethoscopes), but still, new ways of seeing -- your heartbeat right there on your face -- are not only eye-openers, they're also brain-openers.

Seeing less
Here's another reminder that we don't always see what we are looking at.  A paper by Sui Huang in the February issue of BioEssays, "When peers are not peers and don't know it: The Dunning-Kruger effect and self-fulfilling prophecy in peer-review," discusses the problem of peer review by reviewers who don't recognize that they don't understand what they are being asked to evaluate.

The Dunning-Kruger effect is when people don't recognize how much they don't know about a subject, and rate their own abilities or knowledge higher than merited.  This has implications in academia, as this paper points out: when peers aren't in fact intellectual peers, they can determine the fate of a paper, or grant application, without even recognizing that they've missed the crucial points.

The Dunning-Kruger effect is happening in science more and more, according to Huang, for a number of reasons.  One: perhaps a reviewer doesn't recognize the specialized use of a word, and assumes its colloquial or older meaning instead -- 'chaos' and 'epigenetics' are two examples Huang cites.
Failure to consider the “other” meanings of a term prevents the recognition of one's own ignorance of concepts used in other fields. Second, because of the parceling of science into small kingdoms, authors often are the sole authority in their province with no equal. Finally, the increasingly interdisciplinary nature of research creates an asymmetry of knowledge: the reviewer as a single person faces the daunting combined knowledge of an entire team of coauthors. Thus, statistically, we can safely accept our first claim and assume that on average, reviewers nowadays are with high probability less knowledgeable about the subject matter of a manuscript than its authors.
Of course, there are many other possible reasons that papers or grants don't get adequately reviewed.  Editors don't know everything either, and don't know everything that potential reviewers don't know, so there's potentially a lot of ignorance determining the shape of other people's careers.  And they are overloaded trying to find knowledgeable and willing reviewers, and are in no position to really judge the reviewer's formal qualifications, patience, meticulousness in reviewing, or, of course, understanding of a particular paper.  Online publishing (or inclusion of essentially limitless Supplemental information) overwhelms reviewers.  Grant reviewing takes another toll.  And paper authors aren't always clear about what they did or how they did it--and there's a lot of intentional obfuscation, too.  But this is really a topic for another time.

Huang speculates as to why reviewers might not recognize the limits of their knowledge.  Pride (though this would perhaps indicate the limits recognized but not acknowledged to others), self-deception, simple ignorance -- if you don't know that 'chaos' has a specialized meaning, there's nothing to recognize.

The Dunning-Kruger effect is always true and always has been
But of course, ignorance of their own ignorance isn't particular to reviewers.  Aren't we all subject to the Dunning-Kruger effect all the time, bumping up against the limits of our knowledge but drawing conclusions anyway?  The Earth was once flat, the sun once revolved around the Earth, the continents were stationary, evolution wasn't true.  No one ever worries that they don't know as much as people will in 50, 200, or 2000 years, and therefore decides they have no business making observations and drawing conclusions.

It's humbling, and sobering.  Or, it would be, if we weren't all busy ignoring our ignorance.

Sunday, September 27, 2009

Adam’s rib and the sanctity of Knowledge

Anne and Ken are still away, but they’ll be back very soon. When they are, I hope they’ll write about this new Nature article.

In the meantime…

********

Hi. My name is Holly Dunsworth. I’m a professor. And I thought that men had fewer ribs than women until halfway through college.

HI HOLLY!


Fantasizing about group therapy sessions may be blowing things out of proportion. But I wonder if there are any serious ramifications to learning the story of Adam’s rib from the Book of Genesis.

What’s the big deal about thinking men have fewer ribs than women?

I guess I can think of a few medical situations that could go horribly wrong if the doctor miscounted the ribs - mainly if she was using a particular rib to draw an incision or to locate an organ. But doctors learn basic anatomy (e.g. that men and women have the same number of ribs) before they are allowed near patients. I’m guessing that a disastrous rib incident is more likely to be found in the pages of The Onion than in The Times.

However, from a larger philosophical perspective, I think it is a big deal.

First of all, if you think the little story of Adam’s rib has gone away, you’re going to be surprised when you click here and here.

The question is definitely out there and most answers are truthful, but many perpetuate the myth, some quite elaborately.

The story of Adam’s rib symbolizes a puzzling phenomenon in America. We systematically teach myth-information, and then, later, we may or may not replace it with long-known facts. I wish I could say this was only something we do to children, but it’s also one of pop culture’s favorite ways to “educate” adults.

For example, The History Channel - which is widely assumed to be educational - bookends shows about Lincoln’s assassination with ones on Nostradamus and the Loch Ness monster. Even when these shows include the skeptical side of the story, they still validate the pseudo-scientific point-of-view, as if each holds equal footing in some sort of “debate”.

We love to spread myths that have been overturned by facts. What ridiculous behavior for any species, let alone for upright, intelligent apes who should know better! For thousands of years we have been systematically replacing fiction with fact, unknown with known, fake with real. This is a triumph of humanity. It’s a feat that the perpetuators of Adam’s rib don’t appreciate.

I’m not saying that myths and folk knowledge are unimportant. They’re culturally relevant and crucial to our history. They are sources of beautiful songs, poetry, and literature. We are a story-telling species. But when those myths can be made into scientific hypotheses, which can be tested, and when they’re tested, and when they fail to be supported by the evidence… that knowledge should be replaced by the new knowledge that better explains the world around us.

There is no room for debate on this. Whenever they are discovered, real facts should replace false ideas.

What’s more, we shouldn’t expect each human to relive that entire process of discovery and falsification throughout her lifetime. Each new person deserves to start life standing on the shoulders of her predecessors so that she can leap off and fly above and beyond them.

We’ve known for a while now that developing embryos and fetuses don’t go through all the developmental stages of their ancestors before they’re born. So since somatic ontogeny doesn’t recapitulate phylogeny, why must we insist that intellectual ontogeny does?

Sure, you can’t learn calculus without first mastering arithmetic and then algebra, but no one’s asking kindergartners to design their own numbering system before they get started. Scientists and medical professionals may need to explore human anatomy first-hand, as if it were unknown territory; everyone else, including my elementary school self, deserves to learn second-hand the truth about their bodies, about something as simple as rib number.

To me, the story of Adam’s rib is illustrative of the rising anti-intellectual epidemic which causes everyone, not just the afflicted, to suffer. That we continue to teach myth-information to children today, after we have accumulated so much Knowledge, is largely because of the disregard and negligence bred by anti-intellectualism.

Although it may be tempting to blame anti-intellectualism on religion, we should not boil it down to that, especially since so much Knowledge throughout history has been, and continues to be, accumulated by religious folks and institutions. Recently Gregory Rodriguez, citing Alexis de Tocqueville, argued in the L.A. Times that our shared American sense of equality is at fault.

Knowledge is not a democracy; it is a meritocracy. Good ideas hold. Bad and outdated ideas are kicked out. And all ideas, bad and good, beget exponentially more of them, so as the information available at our fingertips balloons, we have to be even more careful with Knowledge.

Yesterday a student asked me something that I didn’t know the answer to, so I said, “I think it’s probably [a, b, and c], but you’d have to check [x, y, and z].”

To that she replied, “You could have just said [a, b, and c] like it was the answer and I’d have believed you.”

I told her that was nice but that I lost my right to bulls_ _ _ when I became a professor, that now I can only speak in facts and hypotheses and I have to be perfectly honest when I don’t know something.

One thing I know for sure, however, is that men and women have the same number of ribs.


- Holly Dunsworth, guest blogger

Tuesday, April 14, 2009

The Remarkable Fact That We Actually Know Anything!

Scientists always say they are working at the frontier of knowledge (we usually increase the self-praise by calling it the 'cutting edge'). But that's really a trivial expression of vested interest, because things that are already known are not being explored by science; in a sense, it's the definition of science to be studying what we don't already know.

On the other hand, what we do know is rather remarkable when you think of the complexity and elusiveness of Nature. DNA and molecular interactions can't be seen the way ordinary objects and interactions can. We are dealing with very large numbers of very small particles interacting in very many ways. In fact, everything genetic, genomic, and cellular turns out to be related to everything else (to oversimplify a bit).

Yet, almost no matter what you may ask about, Googling will reveal a substantial, usually rather huge, literature on the subject. Nonetheless, the subject isn't closed, the problem not 'solved', and the complexities are manifest.

One can ask, for example, about the genetic involvement or cause of a disease, even a rare disease of variable or multiple symptoms, and find that something is known about its molecular cause.

It may be a neurotransmitter problem, or an energy metabolism one, or a developmental anomaly, etc. Variants at some gene(s) are usually known that 'cause', or at least are involved with, the trait. Yet if you dig deeper, the stories are not very tight. Prediction from gene to trait is, with some usually-rare exceptions, not that strong, and often treatment, and almost always prevention, remain elusive. Is this because we just haven't gotten around to figuring these things out, or because we don't know how to know them? Do we need a new way to think about complexity?

It's remarkable in many ways that we know anything about these problems, even if it's equally sobering how difficult it is to truly understand them. It's a combined testimony to the power of research methods and to the army of investigators employing them, but also to the way in which these methods reveal uncertainties as well as facts. Often the tag line in papers or the news is about what we know, while what we don't know is kept quieter. Some believe that with time and resources, our methods and technology will finish the job. Others would say that if we believe that, we are not really accepting and dealing with complexity head-on as we should. And, as we wrote yesterday, it's always sobering to realize that the assumptions we base our knowledge on may themselves be faulty--but we never know at the time which ones they are or what's wrong with them.

There's probably no one way to view this. But, still, it's amazing how much we do know about things that are so little and complex.