Miasma, Malaria, and Method
Global fears of infectious disease have taken on renewed urgency during the past year. As Jeanne Guillemin points out, such fears are as old as the practice of medicine itself, and the struggle to confront the rational threat of disease (as well as the irrational fear) is, likewise, nothing new.
The notion that miasma (i.e., the stench of decaying matter) caused epidemics was for centuries a powerful theory in Western medicine. It particularly dominated explanations of malaria outbreaks, from the fifth century BC, when Hippocrates wrote his treatise On Airs, Waters, and Places, until the end of the nineteenth century, when the mosquito vector hypothesis was verified. Along the way, “scientific” theories—about noxious conjunctions of the planets, impending storms or precipitous weather changes, and even the effects of floating mushroom spores—embellished the central idea that invisible dangers lurk in the air.
Bad smells in particular signified air as a threat. Entire marshy areas in Europe, especially in Italy and France, emanated putrid odors that were held responsible for seasonal malaria among nearby inhabitants who made their living as fisherfolk and farmers. The name we use for the disease, malaria, literally means “bad air” (mal'aria) in Italian, whereas the French term is paludisme, from the word for marsh (palud). Whole communities could seasonally count on children and adults wasting away from the debilitating sickness. Giovanni Verga, author of Cavalleria Rusticana, describes these families in haunting detail in the stories of his native Sicily (1). To outsiders, these areas were “landscapes of fear,” as the historian Yi-Fu Tuan describes them, to be approached at risk (2).
With the growth of densely populated European cities, malaria also felled thousands of urban dwellers. Outbreaks of malaria, cholera, and fevers in general were attributed to the miasma from open sewers, slaughterhouse offal, dumps, graveyards, and generally stench-filled areas where the poor lived. For wealthy city-dwellers, solutions were sought in enclosed city gardens, retreats to country estates, and big investments in perfume.
The miasmic theory of disease, however wrong-headed, had some good consequences. It spurred early urban public health efforts to eliminate squalor and make water potable. As a side effect, it reduced the reservoirs of stagnant water in which mosquitoes could breed. Malaria epidemics, however, still occasionally devastated European cities, especially when new canal and road projects disrupted waterways and land. The 1865 rebuilding of Paris by Baron Haussmann, for example, inadvertently created water-filled ditches and caused a major malaria epidemic. Nevertheless, Europe's problems with malaria were ending.
In the latter part of the nineteenth century, at the height of European imperialism, malaria became recognized as a tropical disease and as a threat to imperialist interests in foreign territories. The protection of administrators from malaria was less a problem than the protection of soldiers. The powdered form of cinchona, or Peruvian bark, had been known as a medicine since the seventeenth century. In the first half of the 1800s, its four basic alkaloids were isolated, paving the way for industrial production of various salts of quinine. In 1854, the Dutch started their cinchona plantations in Java, and within a few decades, with improved bark yield from Cinchona ledgeriana, the Netherlands was the world's supplier of quinine. The saying that “Africa was colonized by quinine” rings true. But a major hurdle was the effective distribution of quinine to troops, which was expensive. Furthermore, soldiers were not trusted to take the proper doses, particularly because the effects were sometimes erratic, for reasons that could only be guessed.
“…[F]ew dappled-winged mosquitoes could be found, as the rainy season had not yet commenced; while to my grief I discovered that the plague-scare was, if anything, stronger here [in Darjeeling] than in Calcutta. So terrified were the natives, that on one occasion, when one of my men shot a sparrow for me in the village, all of the coolies in the neighbourhood ran away for miles into the jungles…”
Ronald Ross, Researches on Malaria
The modern science of malaria begins with a sick soldier posted in a French colony, Algeria. In 1880, the French military physician Alphonse Laveran observed the motile flagellum of the malaria plasmodium in a sample of the soldier's blood. Although he did not recognize the flagellated organism as the parasite's male form, he was sure it was a new “animal,” and not a bacterium (3). Researchers at the Pasteur Institute in Paris, deeply invested in germ theory, attacked Laveran, protesting that they could not find his plasmodium under their microscopes. Fortunately, Laveran's former pupil in military school was Émile Roux, Louis Pasteur's prized associate. Roux arranged for Laveran to demonstrate his exact method of observation. Pasteur and his researchers were converted. Inspired by Laveran, the Russian biologist Danilevskiy searched for and discovered a variety of plasmodia in birds.
In 1898, nearly twenty years after Laveran's discovery, the life cycle of the plasmodium was elaborated. This discovery, too, depended on two physicians with experience in tropical medicine, a specialty that was then generating much new knowledge about insect-borne diseases. In his Asian research, Sir Patrick Manson, later known as the “father of tropical medicine,” had demonstrated that the parasite of filariasis (the disease that causes elephantiasis) was drawn from human blood by female mosquitoes and continued its growth within the mosquito's abdomen. Manson (derisively labeled “Mosquito Manson” by his detractors) conjectured that humans caught the disease by drinking water polluted by dead mosquitoes and their eggs. In 1896, Ronald Ross, on leave from medical service in India, visited Manson in London. Ross wanted to discuss his pet idea that malaria was caused by intestinal germs. Instead, Manson showed Ross the flagellated organism under the microscope, and persuaded Ross to embark on mosquito–malaria experiments in India.
Early on, Ross jettisoned Manson's polluted water idea, which Alphonse Laveran also mistakenly shared (4). In 1898, Ross conclusively demonstrated the mosquito's role as vector and, in careful drawings, illustrated how mosquitoes that fed on sick birds infected healthy birds with malaria. He was able to trace the life cycle of an avian plasmodium from its asexual forms in the warm-blooded vertebrate through its proliferation in the gut of the mosquito until it accumulated in the salivary glands of the female mosquito, ready for the “inoculation” of its avian host. Ross thought his bird research stood firm without human proof of the mosquito vector. Patrick Manson thought otherwise. In July 1900, with his medical student son as the guinea pig, Manson set up a test with Ross's Italian competitor, the physician and zoologist Giovanni Grassi, to show that diseased mosquitoes from the Roman Campagna, shipped by diplomatic pouch to London, could infect a healthy human with malaria (5). In this case, the patient was quickly and successfully treated with quinine. Ross never forgave his mentor for nailing down this final piece of the puzzle.
Ross's innovative work earned him a Nobel Prize in 1902. Laveran's Nobel, for his life work on protozoan diseases, came in 1907. Together, with help from Manson, from Italian researchers such as Grassi and Amico Bignami, and from the Canadian physician W. G. MacCallum, the two physicians laid the groundwork for understanding the plasmodium life cycle and, ultimately, how chemotherapy could attack the parasite.
Ross went on to a position at the Liverpool School of Tropical Medicine, where he advocated broad public health campaigns to fight malaria. In September 1898, two months after his famous discovery laid waste to the notion of miasma, he wrote a colleague in Germany (6): “The prevention of malaria promises to be an easy matter, at least [a]round houses, camps, etc. Mosquito nets must be used, and the mosquitoes got rid of.” Ross felt that prevention of standing water in any vessel, ditch, or puddle was key, and that those with malaria “should always sleep in mosquito nets for fear of spreading infection.” In Italy, Grassi personally mounted vigorous and successful efforts of just this kind. But Ross found the British colonial administration balky. As long as its administrators and medical service felt protected by quinine, could recruit sufficient healthy plantation and mine workers, and had no large garrisons to maintain, they showed little enthusiasm for preventive measures. In colonial Africa and Asia, the landscapes of fear—regions and villages whose inhabitants were stigmatized as malaria sufferers—were often too remote and insignificant to justify grand, humanitarian programs.
The wars of the twentieth century brought urgency to programs against malaria. During World War I, soldiers from the colonies imported the disease even to northern Europe, where mosquitoes proliferated in the trenches and throughout the ravaged landscape. Both Ross and Laveran worked diligently to educate overseas troops to use mosquito nets at night and to take their quinine doses. Educating army physicians and officers to enforce new behaviors proved difficult, but the stakes were high.
Meanwhile, cut off from Dutch and British supplies of quinine, the Germans concentrated on synthesizing antimalarial chemical therapeutics. Using canaries for their experiments, they pushed the chemotherapy frontier forward after the war with the 1928 discovery of Plasmochin (pamaquine). More effective against avian plasmodia than human ones, and ultimately too toxic, it was nonetheless a start.
In World War II, the Allies' loss of access to the Japanese-occupied Indonesian plantations also forced more exploration of quinine alternatives. In 1942–1943, tests on nearly a thousand Australian army volunteers showed that Atebrin, a compound developed and tested by the Germans in the 1930s, could be taken for months without ill effects. As Mepacrine, the new compound was produced in volume in the UK and US and routinely used by troops in Southeast Asia and the Pacific. As the malaria expert Leonard Bruce-Chwatt concluded (7): “There is no exaggeration in saying that this probably changed the course of modern history.”
In 1942, after the Allies smuggled a sample of the newly invented DDT (dichlorodiphenyltrichloroethane) out of Switzerland, this insecticide provided an efficient means for soldiers to invade malarial areas, especially in the Pacific theater. DDT was simultaneously a boon to the American South, where malaria still occurred.
For a short while, in the late 1950s and 1960s, it looked as if malaria might be conquered worldwide by residual insecticides like DDT. But their misuse, fears of their long-term destructive effects on the natural environment, and evidence that some mosquito vectors were resistant to them shifted the emphasis back to chemical compounds, especially the cheap and effective chloroquine, a product of US wartime research involving the army, scientific institutions, universities, and pharmaceutical firms. Then, in 1960, news of resistance to chloroquine in patients infected with P. falciparum, the most lethal of the four malarial species of plasmodia, began to come in from South America, Malaysia, and Southeast Asia. United States involvement in wars in Korea and Vietnam added new impetus to the chemical battles against malaria. In 1963, the US Army Research Program on Malaria began an enormous effort to find alternative compounds to protect soldiers against P. falciparum malaria. Later, sulfonamides or sulfones were added to the mixtures, with some good effect against resistance. A malaria vaccine project was funded by the US in 1965 but later foundered.
In 1898, the discovery of the mosquito vector for malaria seemed to indicate an easy solution. The present suggests a more complex picture. The new nations that have grown from the old colonial structures are beset by enormous problems and enjoy few resources. Sprawling urbanization, widespread migration, violent revolutions and wars, the AIDS epidemic, and general poverty shape the context for malaria epidemics, in which one million people die each year and at least 300 million are debilitated (8).
Mitigating malaria—few talk of eliminating it—will require diverse technological approaches, old and new, from mosquito nets soaked in insecticide to transgenic mosquitoes, from the elimination of standing water to new chemical compounds and effective vaccines. Political will is essential. The international community, national leaders, local administrators and, above all, local inhabitants will have to be convinced of the importance of a combined common-sense and technology-driven approach.
© American Society for Pharmacology and Experimental Therapeutics 2001
Jeanne Guillemin, Ph.D., is Professor of Sociology at Boston College and Senior Fellow of the Security Studies Program at the Massachusetts Institute of Technology. She has spoken and written extensively on the sociopolitical exigencies of health care. Her 1999 book, Anthrax: The Investigation of a Deadly Outbreak, became available as an eBook in October 2001 through the University of California Press.