Showing posts with label Francis Bacon. Show all posts

Wednesday, November 16, 2011

Who is this magical "third person" doing the science?

Writing in the third person and passive voice is traditionally the preferred style for publishing in many scientific journals.

But you’ll often come across a research article that’s written in the active, first person. I hadn’t seen one in a long time until yesterday and at the drop of the first “I,” alarms went off. After I calmed down I wondered, What’s so terrible about writing I and my?

I’ve decided that the answer is nothing.  It’s fine.

In fact, after you become accustomed to reading studies written in this more modern, active tone, articles written in the conventional way start to sound borderline ridiculous. Strict adherence to the passive voice, the strange use of "we" by a sole author...these practices can read as pretty absurd.

People complain about how awkward scientific writing is, so that's one count against it. But beyond inconvenience, there's another problem with the conventions of scientific writing: their dehumanization of science.

This third person passive "rule" was supposedly sparked by Francis Bacon to inject objectivity into scientific writing. But does it really do that? I'm inclined to think it's just as likely to undermine objectivity by hiding weak or shoddy research in something larger, something less accountable.

By sticking to the first person active perspective, you’re reminding yourself along with everyone who reads your study that a human with limits and biases performed it. This is fair, open, forthcoming, and can be very honest and humbling (depending on the author)… all these are beloved virtues of science and scientists. 

But transform an active, first person article into one with only a passive voice and the science reads like it's above and beyond mortal human business... as if an external force guided the author to do the research or that she's merely the recorder of a supernatural science project that was magically conjured in her laboratory.  
"The experiment performed itself!"
By dropping the third person passive voice, scientists can avoid giving the impression that their work transcends earthly constraints and that it is greater than what a human (or a mere first person) is capable of.

Scientists are science-doers, not science-whisperers and they should be able to report their work as objectively as possible, as close to reality as possible, not according to this or that grammatical preference.
***

I'm not sure very many MT readers (including myself, tomorrow) are going to agree with this post, but this is something I'm thinking about today.   

Tuesday, December 28, 2010

Science for sale: where we are and how we got here

Well, our flight to France was cancelled -- twice, and then finally rescheduled for so far into our trip that it didn't make sense to go.  So, you're stuck with us now for the duration -- but that's ok (sort of); science marches on, and we'll keep marching with it.  

For a discussion of the origins of our era of entrepreneurial science, in the context of the industrial revolution, listen to the first of a 2-part series on the wonderful BBC Radio program In Our Time. The second part will air the last week of December. In Our Time, broadcast every Thursday except in summer, is a pleasure and an educational wonder of our intellectually threadbare media time--listening regularly is like getting a college degree without having to pay any tuition! The discussion is usually calm and congenial, but the first installment on the Industrial Revolution in Britain got pretty steamed up--and not just about the role of the steam engine, or its inventors, but about contesting views of the nature, and proper course, of society, views we still see today in society at large, and in science as well. The discussion is well worth listening to.

The industrial revolution, which occurred mainly in Britain, grew out of the Enlightenment period, out of the overthrow of medieval and Classical concepts of a static universe that could be understood by thought and deductive reasoning alone. Led by the giants Galileo, Newton, Descartes, and many others famous and otherwise, this period ushered in an era of empiricism, an era in which we still live. The Enlightenment embodied a largely Continental view that the new kinds of knowledge--which have since morphed into institutionalized 'science'--could relieve society's problems of suffering and inequity through a better, more systematic understanding of the real, rather than the ideal, world.

Francis Bacon is credited with introducing the scientific method's reliance on induction--repeated, controlled observation. In an In Our Time installment in 2009, Bacon's reasoning was discussed: he felt that science could be put into the service of the nation, to exploit the colonies and gain international political and economic dominance. We're living that legacy still, as many scientists argue--or simply believe--that knowledge can only be called 'science' if it can lead us to manipulate the world.

Part of the debate is one that threads through the 19th century and persists today: did major advances come from the stream of culture, or from the genius of A Few Good Men? Associated with that is the contrast between the view that cultural (including scientific) advances belong to, and are enabled by, society, and the view that individual self-interest is the source of technological advance and should receive its rewards. The industrial revolution led to great progress in many different areas of life, but also to great misery in many different lives. In turn, that spawned the contest between capitalistic and socialistic thinking. In its stream were Marx's view of history as a struggle between material interests, Darwin's of history as a struggle between individuals, and many more.

In the US, figures like Bell and Edison led the way in commercializing and publicly hyping science, and in setting up research laboratories aimed at the industrial commercialization of ideas. In our age, the comparable questions include whether genetic research should be publicly funded, and if so, whether resulting royalties as well as new products should go back to the public. Should genes be patentable? Who owns human embryos? If technicians, students, or other workers make the biotech inventor's work possible, why are they paid so little relative to him or her? Should research funds be put into areas that will yield commercial products at some vague future time, or should the funds--which come from taxes--be used to improve nutrition and vaccination here and now? Should NSF and NIH be pressured to make quick commercialization a criterion for the science they fund?

To what extent should science be for sale?  How much is owed to scientific discoverers?  Indeed, how much credit should discoverers actually be given individually, rather than being viewed as corks floating on the ideas of their time?  Should science be supported on the basis of its commercial potential?

The product of specific inventors, or the specific products of the times?
The industrial revolution involved many inventors, who improved technologies including looms, shipping, iron, steam, rail, and other aspects of the mechanization and industrialization of life. Step by step, innovators invented, tinkered with, and learned how to apply all sorts of new or improved techniques, machinery, and manufacturing technologies. The explosive growth of machinery-based industry that resulted transformed rural populations into urban proletarians, who depended for their survival on the products of industry rather than on their own plots of land. Government made Britain's industrial advance possible through tax policy, the Royal Navy, the captive market of the Empire, import restrictions, banking laws, and in other ways. These policies nurtured, stimulated, and enabled countless major and minor tinkering inventors (the equivalent of today's biotechnology innovators) to develop their ideas and market them, intellectually and commercially, to make their livings (and to dream of riches). But how much credit actually belongs to the inventors, and how much is owed to the workers who implemented their inventions?

The debate over whether history is a cultural stream or whether it's transformed periodically by Great Men is a serious debate.  For most ideas credited to The Great Genius, others can be found who at the same time or earlier had similar ideas for similarly good reasons.  Darwin had his Wells, Wallace, Mathews, Adams, Grandfather Erasmus Darwin, and others.  Newton his Leibniz.  Einstein his Poincaré.  If you're in science and have an original idea, you can be sure that if you hunt around in the literature, you'll find others expressing the same insight.  It's a humbling experience many of us have had.  Without Watson and Crick, when would the structure of DNA have been discovered--eventually, never, or right away by Linus Pauling or Rosalind Franklin?

The cultural stream vs Great Man theories of history have been interesting questions in anthropology for a long time.  It's about how culture works as a phenomenon, and among other things how science works as a way of knowing the world, and about how moral decisions are made about social equity.  Maybe it's something appropriate to think about at this holiday time of year.

And if you want to know more, and didn't get a good book for Christmas, nestle down by a nice warm fire, with a brandy, and open a little story called War and Peace.   It asks how important Napoleon was to Napoleonic history.

Tuesday, January 12, 2010

Knowledge is Power

At the dawn of the modern scientific era, in 1597, Francis Bacon, a founding empiricist, used the phrase 'knowledge is power'. To Bacon, "knowledge itself is power": knowledge of how the world works would lead whoever had it to extract resources and wield power over the world--science would enable Empire.

This view of science has persisted. It was important in the early founding of the Royal Society and other prominent British scientific societies in the 17th and 18th centuries and beyond. The technology and even basic knowledge that was fostered did, indeed, help Britannia to rule the waves.

Basic science was the playground of the classic idle wealthy of the 1700s and surrounding years, while applied technology was developed by people who were not formal beneficiaries of 'education' as it was conducted in those times. In the US, major scientific investment, such as in large telescopes, was funded by private philanthropy--by wealthy industrialists who could see the value of applied science.

We tend perhaps to romanticize the 18th and 19th centuries, the era of Newton, Darwin, and many others who advanced science in all areas--geological, physical, chemical, and biological--without doing so for personal or financial gain. But at the same time, there was much activity in applied science and technology, and even in 1660, when the Royal Society was founded with government support, gain was one of the objectives.

An informative series about the history of the Royal Society and of other scientific activity in Britain aired the week of Jan 4 on BBC Radio 4, on the program In Our Time--the four parts are now available online. Much of the discussion shows that the interleaving of government funding, geopolitics, and avarice was as important in driving science when the Royal Society was founded as it is now.

There can be no doubt about the importance of systematic investigation of the ways of Nature in transforming society during the industrial revolution. The result was due to a mix of basic and applied science. The good was accompanied by the bad: daily life was made easier and healthier, but episodes of industrialized warfare made it more horrible. On the whole, it has allowed vastly more people to live, and live longer, than ever before. But it's also allowed vastly more people to struggle in poverty, too. (The discovery of novocaine for use by dentists may alone justify the whole enterprise!)

The post-WWII era seemed to foster lots of basic science. But in the US, the National Science Foundation and other institutions poured money into science largely, at least at first, in response to fears that the Soviet Union, whose space program was far ahead of ours, might gain on us in world prominence. So there was a recurring pragmatic drive behind support for science.

Universities' dependence on research grants was one product of this drive. We think this has been awful for science, since faculty careers come to depend on generating money, and that leads to safe, short-term thinking, even if more funds mean more opportunity. The intellectually thin Reagan administration's demand that research translate into corporate opportunity was just a continuation of this materialistic element of support for science.

In a way, we're lucky that basic science, disinterested science actually got done, and lots of it at that! Human society probably can't be expected to put resources into things so abstract as basic science, with no promise or obvious way to lead to better pencils, medicine, or bombs. So it's no wonder that universities, bureaucracies, and scientists alike hype their personal interests in terms of the marvels to be returned to the funders.

Such a system understandably leads to entrenched vested interests who ensure their own cut of the pie. We routinely write about these vested interests and the effect we believe they have on the progress of knowledge. But, as anthropologists, we have to acknowledge that the self-interest that is part of the package is not a surprise. After all, why should we be able to feed off the taxpaying public without at least promising Nirvana in exchange? Human culture is largely about systematized resource access and distribution, and this is how we happen to do that these days.

Under these conditions science may not be as efficient or effective as it might otherwise be. A few MegaResearchers will, naturally, acquire an inordinately large share of the pie. Much waste and trivia will result. The best possible science may not be done.

Nonetheless, it's clear that knowledge does progress. A century hence, it will be our descendants who judge what resulted from our system that was of real value. The chaff in science, as in the arts, sports, or any other area of life, will be identifiable, and will be the majority. But the core of grain will be recognized for its lasting value and impact.

BUT that doesn't mean we should resign ourselves to the way the system works, to its greed, waste, hierarchies, and its numerous drones who use up resources generating incremental advance (at best). That is part of life, but only by the pressure of criticism of its venality and foibles can the System be nudged towards higher likelihoods of real innovation and creativity in knowledge.

It's remarkable that blobs of protoplasm, evolved through molecules of DNA and the like from some primordial molecular soup, understand the universe that produced them as well as we actually do. And we will continue to build on what we know; empirical, method-driven activity is a powerful approach to material gain. Embedded in inequity, vanity, venality, and other human foibles, we nonetheless manage to manipulate our world in remarkable ways.

The history of the Royal Society and other scientific societies, which mirrors the growth of society generally and is traced in these BBC programs, is a fascinating one. But that doesn't change our belief that, in principle at least, we could make better use of our knowledge and abilities to manipulate our world toward less inequity, vanity, venality, and so on.