Are there social determinants of malaria infection?
If you're a social scientist, you might be quick to say yes, but if you understand the biology of the disease, the question may not make much sense to you.
A female anopheline mosquito feeds on someone carrying the sexual stage of the parasite. The blood meal gives her the nutrition necessary for laying her eggs. Assuming that the parasite has successfully undergone another transformation in the mosquito gut, and that the mosquito feeds on another person, she may transfer the infection. Mosquitoes probably don't care about the socio-economic status of the people on whom they feed (though they do seem to prefer people with stinky feet and pregnant women). It is probably safe to say that, all other things being equal, mosquitoes really don't care whom they bite. But are all other things equal? Not even close…
Let's consider our not-too-distant history with malaria in the U.S., since it was a plague of non-trivial proportions for a large swath of our nation. During the 1880s a prominent scientist (one of the first to publicly suggest that malaria may come from mosquitoes) argued for having a giant screen placed around Washington D.C., which was a swampy, malaria-infested city up until the mid-1900s.[1] Several of our presidents seem to have suffered from the disease. George Washington suffered throughout much of his life with bouts of fever that were likely malaria. Presidents Monroe, Jackson, Lincoln, Grant, and Garfield also may have suffered from malaria. On a personal note, both of my grandparents contracted malaria growing up in modern-day Oklahoma (at that time it was still Indian Territory). To this day my grandmother drinks tonic water, which contains the antimalarial quinine, when she feels a headache or chills. The following maps (I apologize for the poor resolution) come from a CDC webpage about the history of malaria in the U.S.
[Maps: CDC Malaria History]
A question, then, is: how were we so successful at eradicating malaria here? Furthermore, why didn't we do the same everywhere else?!
A favorite story for many anti-environmentalists is that it was all, or mostly, because we used DDT. And beginning in the 1940s we did use the hell out of DDT. Apparently it was common practice for parents in the Southern U.S. to encourage their children to run behind DDT fog trucks as they drove down streets. (See this blog post for some related stories.) But some real problems with DDT are that it doesn't just target mosquitoes: it probably also kills the predators that would feed on mosquitoes and other pests, and it can cause all sorts of trouble (through bioaccumulation and biomagnification) as it works its way up trophic levels. A few people noticed this could be a problem (see Silent Spring by Rachel Carson), and DDT was banned for most uses in the U.S. in 1972. (Soon after, there were global efforts to ban its use for agricultural purposes.)
But DDT wasn't the only thing that changed in the U.S. during and after the Second World War. The U.S. was just coming out of the Great Depression, and there were some interesting demographic shifts under way too. For example, lots of working-age men were away for the war, returned en masse, and then some major baby-making ensued. The economy was rebounding and suburbia was born, meaning that many of those baby-makers could afford houses (increasingly with air conditioning units) that wouldn't have been possible in previous years. There were also major public works projects aimed at building and improving drainage systems and sanitation.
During this same period chloroquine, a major antimalarial drug with some important improvements over quinine, went into widespread use (mostly in the 1940s), but by the 1950s there were drug-resistant parasite strains in Southeast Asia and South America. This isn't a surprising occurrence. Antimalarials exert a heavy selective force on the parasites. Furthermore, those parasites undergo both clonal and sexual reproduction, meaning they can potentially generate a lot of novel variants and strains. This has been the curse of antimalarials ever since: soon after they are rolled out, the parasites develop resistance, and resistant strains quickly spread globally.
Eradication of malaria in the U.S. occurred during a time when we were using heavy amounts of DDT, when we had access to relatively cheap antimalarials, and when we were undergoing some major socio-economic, structural, and demographic changes. However, DDT was becoming a problem in its own right and wasn't working as well as it once did. The antimalarials weren't working as well as they once did either. Despite all of this, and despite the fact that mosquito vectors for malaria still exist in the U.S., we still don't have a real malaria problem. And while it is almost impossible to tease out all of the contributors to our current malaria-free status, I argue that the social and economic factors that changed during this period are the main reason why malaria is no longer a problem for us here in the U.S. If that weren't the case, we'd be back to using insecticides and antimalarials to try to eradicate it all over again.
I'm certainly not the first to notice such things. A study of dengue fever (a mosquito-borne viral disease) in a South Texas/Northern Mexico community split by the international border (los dos Laredos) found that people without air conditioning units had more dengue infections than people with them.[2] Poorer people, living on the Mexican side of the border, tended to leave their largely unscreened windows open, since they didn't have AC units to combat the sometimes brutal heat in that part of the world. This is a clear example of how socio-economic factors can influence mosquito-borne disease transmission, and it plays out in other ways in other environments and parts of the world.
In Southeast Asia, where I do malaria research, many if not most of the people afflicted with malaria are poor ethnic minorities and migrants who have been marginalized by governments and rival ethnic groups.[3] Constant, low-grade warfare in Myanmar (Burma) over the last half century has left many of that nation's residents in a state of public health crisis. And, since pathogens don't normally respect international borders, malaria remains a problem for neighboring countries such as Thailand (which is mostly malaria free outside its border regions). The story is the same along China's border with Myanmar in Yunnan Province. Mosquitoes don't target people because they're poor, disenfranchised ethnic minorities. But many of those ethnic minorities do happen to live in conditions that allow malaria to persist, and the mosquitoes that pick up the parasite there go on to feed on other potential human hosts, regardless of their economic status. This means that your neighbor's poverty can actually be bad for you too.
Arguably, most (not all!) public health advances can be largely attributed to socio-economic change (google: McKeown hypothesis). Raising the standard of living for entire populations tends to improve the health of those populations too. In Asia, nations such as Taiwan, Japan, and Singapore are malaria free, as is most of South Korea (excluding its border zone with North Korea). Obviously, it isn't always an easy task to raise the standard of living for a population, but the benefits go far beyond putting some extra cash in people's pockets and letting them have nice homes. The benefits include decreases in diseases of many types, not just malaria, and that is good for everyone.
Consider, now, the amount of money that is dumped into attempts at creating new antimalarials or that ever-elusive malaria vaccine. Consider the amount of money that has been dumped into genome sequencing and countless other really expensive scientific endeavors. And then consider whether they actually hold much promise for eliminating or controlling malaria in places that are still plagued by this disease. Sure, sequencing can provide insight into the evolutionary dynamics associated with the emergence and spread of drug resistance (and that is really exciting). Some people believe that genomics will lead to personalized medicine, but even if that is true, I am skeptical that it will ever trickle down to the people who most need medical attention. New antimalarials and new combinations of antimalarials may work for a while. But it seems pretty obvious to me that what actually works over the long term, regardless of parasite evolution and genetics, is what we did right here in the U.S. So, at the risk of jeopardizing my own future in malaria research, I've got to ask:
From a public health standpoint, is it possible that it's cheaper to attack socio-economic problems in malarious places than to have thousands and thousands of labs spending millions and millions of dollars on cures that always seem to be short-lived?
Wouldn't we all get more bang for our buck if we took an approach that addresses more than one specific parasite?
1. Charles, S. T. Albert F. A. King (1841–1914), an armchair scientist. Journal of the History of Medicine and Allied Sciences 24, 22–36 (1969).
2. Reiter, P. et al. Texas lifestyle limits transmission of dengue virus. Emerging Infectious Diseases 9, 86 (2003).
3. WHO. Strengthening malaria control for ethnic minorities in the Greater Mekong Subregion (2008).
This is a terrifically thoughtful contribution, Dan! It shows that even in something where causation should be strong and clear-cut, we can't ignore either biology or environment (in this case, including culture) in understanding, much less manipulating, the phenomenon.
This makes it a very fine topic for knowledgeable anthropological study, as you are doing.
Somewhere, Frank Livingstone is smiling.
Very nice contribution to MT. Thanks.
For those MT readers who don't know who Frank was (besides being my PhD advisor at Michigan!), he was one of the most truly interdisciplinary anthropologists in our history.
His 1958 paper in the American Anthropologist, now largely forgotten, combined linguistics, culture, genetics, epidemiology, evolution, and history in a single analysis that accounted, to a considerable extent, for the diversity and high frequency of human genes conferring resistance to malaria.
He was inspirational to study under, always thinking, never a prisoner of the grant or publishing rat-race system, and while technically sophisticated, not a narrow technocrat.
His work has stood up well under more recent studies of malaria-associated genetic variation in Africa.
Unfortunately, but typically for him, he was still playing hockey into his dotage, and he never did anything at half-speed. As I understood it, he was checked quite hard and decked onto the ice, and he never really recovered.
But he probably would have had no regrets. He always resented his alma mater, Harvard, for saying he was too small to play on their hockey team.
Frank was one of the great ones in anthropology's heyday.
And there's more about Frank. He did not just train protégés who were clones: graduate students taken on board to do his research for him, partitioned into bits so he could pad his CV with papers.
I may be forgetting somebody, but I can't think of any of his students who did malaria work. He stimulated us each to do our own thing.
Thank you all for the compliments and for the chance to contribute to the blog.
And in my opinion, Livingstone was an excellent model for anthropology to follow. Many people talk about interdisciplinary work, but few actually do it. Even worse, anthropological research that works across anthropology's own subfields seems rarer still. I think that is a shame...