In an earlier post, I outlined our main weapons against infectious disease, including vaccines, antibiotics, antiseptics, pest control, sanitation, and general hygiene. These technologies (in a broad sense, even hand-washing is a technology) have largely eliminated lethal diseases such as smallpox, malaria, cholera, tuberculosis, and polio, at least in the developed world.
But which of these technologies mattered most? Which should we highlight in a history of health and medicine, and which should we hold in our minds as major examples of human progress against disease?
Most histories of medicine give the spotlight to vaccines and antibiotics. They’re the most effective medical treatments; prior to their introduction, there was little a doctor could do for an infected patient.
But to really answer the question, we should look at mortality rates over time, by disease where possible, and correlate reductions in mortality to specific interventions effective against specific classes of disease. Otherwise we run the risk of assuming that just because something appears in all the histories, it must be the most important (leading me to feature it prominently in my history, thereby perpetuating the cycle).
So I started looking into the data and interpretations of it. And the surprising thing I found is that infectious disease mortality rates have been declining steadily since long before vaccines or antibiotics.
First, let’s look at the data that clearly shows that something was going on prior to any effective medical treatment for most diseases.
Here’s a chart of 20th century mortality in the US [1] (the spike around 1918 is a major worldwide influenza epidemic, sometimes known as the Spanish Flu):
And here’s a similar chart I made for England and Wales:
Mortality rates in both regions fell, and most of the improvement came from the infectious disease mortality rate, which was reduced by more than an order of magnitude.
Extending the analysis further into the past is difficult. Data is available back to the mid-1800s, but disease classification changes over time as scientific and medical knowledge advances, so it’s not always possible to trace the mortality rate from a single disease; and prior to the mid-1800s, no countries that I know of kept reliable cause-of-death records. We do have overall mortality rates from some countries stretching back into the 1800s, and in some cases even the 1700s, and from these we can see that mortality rates in Europe have been dropping for a long time. [2]
To estimate population and mortality figures prior to these datasets, demographers turn to increasingly limited and unreliable sources, such as parish records of births and deaths, or the London Bills of Mortality, which began running continuously in 1603. These estimates are rough, but they generally show that death rates began to fall in some parts of Europe by 1740 (and in some parts possibly as early as 1670), [3] and that declines in disease mortality were a significant part of this.
In contrast to this timeline, very few effective medical treatments were in widespread use until the late 1930s. Before that time, only a handful of vaccines for major diseases were in use (most notably for smallpox); and there were only a couple of effective pharmaceuticals (most notably for diphtheria, tetanus, and syphilis).
The role of some major factor other than medical treatment is even clearer if you break out the mortality rates by specific diseases. In 1900, the most deaths came from tuberculosis, influenza/pneumonia, and gastroenteric diseases such as dysentery. [1][4][5] All of these were effectively conquered by antibiotics in the 1930s and ’40s, but all had been declining since at least the beginning of the century. Some charts that illustrate this in the US: [4]
A similar chart for England & Wales [5] (“chemotherapy” here means antibiotics):
Indeed, digging further into the UK data from the late 1800s, we can see that TB had been declining since at least 1850, and gastroenteric disease since the 1870s. [6] Similar patterns hold for lesser killers such as measles, which didn’t have a vaccine until the 1960s, but whose mortality by then had already declined by more than 90% from its 1900 level. [1][4]
So what was going on? If you read my survey of technologies against infectious disease, you know that other than drugs and immunization, there is one other way to fight germs: cleaning up the environment.
I was surprised to learn that sanitation efforts began as early as the 1700s—and that these efforts were based on data collection and analysis, long before a full scientific theory of infection had been worked out. James Riley, in “Insects and the European Mortality Decline”, writes: [3]
In the later decades of the seventeenth and early decades of the eighteenth century, a number of internationally renowned physicians … formulated specific measures of intervention. Relying on Hippocratic tradition, specifically, on its suggestion that endemic and epidemic diseases are caused by forces in the environment, and influenced by Renaissance efforts at urban sanitation, these physicians proposed to discover the meteorological and topographical forces that might be blamed for the onset of epidemics. Toward this end, they and their followers embarked on a vast campaign to assemble qualitative and quantitative data about epidemics, climate and weather, geographical and topographical signs, and other features of the habitat. Their aim was to find conjunctures or correlations in the data, occasions when epidemics occurred after the same complex of environmental forces. Early signs of such a complex would offer warnings and allow the adoption of measures of prevention and avoidance. This body of medical theory failed to produce a coherent list of correlations, but it did provide a specific body of measures of avoidance and prevention.
In particular, they proposed (each bullet quoted from the article):
- to drain swamps, bogs, moats, and other sites of standing water
- to introduce hydraulic devices that would circulate water in canals and cisterns
- to flush refuse from areas of human habitation
- to ventilate living quarters and meeting places and to burn sulfur sticks or apply other insecticidal measures in houses, hospitals, prisons, meeting halls, and ships
- to inter corpses outside the city
- and by other measures, including refuse burial, to detach humankind from organic waste
These reforms were implemented starting in the 1740s, some by local and central governments, others by “humanitarians acting on private initiative”.
What broad changes were actually implemented, and is it plausible that they had a significant impact?
To have had a significant effect on insect numbers, the measures proposed by the environmentalists [the physicians advocating environmental cleanup] would have had to have been broadly applied across western Europe. Two measures, lavation and drainage, are particularly important in insect control, and we can focus on examples of their application. Lavation combines programs taking three forms: flushing filth from urban sites, collecting and disposing of refuse, and introducing devices to agitate or circulate standing water. By these means, which would cleanse streets, industrial sites, and buildings, and transform standing water in canals and cisterns into moving water, the environmentalists argued, the city might be made as healthy as the countryside. One model for these proposals was the naturally washed site of the town of Chester, England, where rain periodically flushed refuse into a subterranean drainage network cleansed by tidal action. The objective of the environmentalists was to introduce the same action by hydraulic engineering. Another model was the program followed in Hamburg to collect and dispose of refuse outside the city each day. A third was the improvement of streets by paving and widening, and of urban drainage networks by constructing or expanding sewage systems. Measures of one or another variety were adopted in many British cities and towns in the Improvement Acts of the 1760s and thereafter, and observers, such as William White in York, attributed declines in mortality specifically to them. In Paris, the drainage system was improved in 1740, and later in the century, other measures, including the emptying of cesspits and the installation of sewers, followed. In the Austrian Empire, Johann Peter Frank directed a broad campaign of medical policing, which included projects for refuse collection and disposal.
These efforts affected not only diseases such as malaria, where insects are the primary vector of infection, but also others such as dysentery in which insects (especially flies and cockroaches) can distribute the disease throughout the environment, e.g., from waste to food. Pest control thus provides the best explanation I’ve found for reductions in the mortality rate from the mid-1700s to early 1800s.
After that point, there were some significant shifts in the population and in disease.
As insects became better controlled in the countryside, people began migrating in greater numbers to cities, which worsened disease due to crowding and polluted water. Malaria was on the wane; cholera was on the rise. The mortality rate, after declining for decades, actually plateaued in the mid-1800s, [3] and mortality rates were higher in cities than in the country. [7]
The worsening conditions in the increasingly crowded cities caught the attention of sanitary reformers throughout Europe, such as Edwin Chadwick in Britain and Max von Pettenkofer in Germany, who campaigned in particular for improvements in water and sewage. Starting in the mid-1800s, cities in Europe and the US sought cleaner sources of water, further upstream, or from less polluted rivers. The water was piped into homes, which reduced reliance on wells and surface water. Later, many cities built water filtration systems—the earliest method was simply to allow the water to pass through sand. They also built or modernized sewer systems to transport human waste outside the city and dump it downstream or into the sea, instead of allowing it to collect in pit latrines or cesspools in town, or run through the streets or in open trenches. And crucially, they made sure to keep these systems separate, so that sewage didn’t contaminate drinking water.[7][8]
These early efforts, however, were sometimes for aesthetics as much as they were for health, and even to the extent they were aimed at health, they could only be guided by smell, taste and color. [9] But in the late 1800s, the germ theory of disease was established by scientists including Louis Pasteur and Robert Koch. By the 1880s, specific bacteria had been identified as the cause of certain diseases, such as tuberculosis and cholera. Along with this came techniques to grow bacteria in culture and to identify them under the microscope.
These new ideas and methods gave sanitation efforts new tools and targets: instead of aiming to improve sensory qualities, they could aim to eliminate harmful bacteria. Filtration was improved, and chlorine was added to kill germs—first in drinking water, and then in sewage itself. [10]
Cutler & Miller estimate [10] that
the introduction of water filtration and chlorination systems led to major reductions in mortality, explaining nearly half of the overall reduction in mortality between 1900 and 1936. Our results also suggest that clean water was responsible for three-quarters of the decline in infant mortality and nearly two-thirds of the decline in child mortality. The magnitude of these effects is striking. Clean water also appears to have led to the near eradication of typhoid fever, a waterborne scourge of the 19th and early 20th Centuries.
They chart mortality from typhoid in several major cities, of which Pittsburgh is the most dramatic:
Food handling improved as well. Milk is a case in point: In England, as of the late 1800s, milk was transported warm in open containers, making it a literal breeding ground for tuberculosis and other bacterial diseases. Pasteurization was introduced around 1900, along with sealed tins and bottles for transporting and storing milk. Condensed and evaporated milk also became popular around this time, and since these products were sterile they also reduced diseases. All these innovations contributed to the rapid decline in infant mortality seen here: [11]
Finally, the germ theory led to public health efforts to educate the populace on good general hygiene. A policy brief by Samuel Preston says: [9]
Enlightened public health officials were quick to recognize how the germ theory should guide their practice. Furthermore, by the time of the first White House Conference on Infant Mortality, held in 1909, they realized that rapid advances in longevity required that public officials go beyond their normal domain of public works and attempt to change the personal health practices of individuals. The germ theory provided a number of powerful weapons for doing so. These included boiling bottles and milk, washing hands, protecting food from flies, isolating sick children, and ventilating rooms. Public health officials launched massive campaigns to encourage these practices. In New York City, milk depots were established with the ostensible purpose of distributing milk to indigent mothers but with the real purpose, according to the director, of instructing mothers in hygienic practices. The New York City Department of Health produced one of the nation’s first motion pictures, entitled The Fly Pest. At the national level, the new Children’s Bureau adopted a primary focus on child health. Its pamphlet called Infant Care became the largest selling volume in the history of the Government Printing Office, with some 12 million copies sold by 1940. By the 1920s, the bureau was receiving and answering over 100,000 letters a year from parents seeking child care advice.
Thus the germ theory, long before it led to medical treatments, drove down mortality rates by revolutionizing sanitation and hygiene.
So the mortality data points to a large and easily underappreciated role for pest control, water sanitation, food handling, and general hygiene.
However, it also shows the extreme effectiveness of antibiotics, when they were finally invented. In the US, during the golden age of antibiotics from 1937 to ‘52, the infectious disease mortality rate fell by 8.2%/year, compared to an average of 2.8%/year during 1900–36 and 2.3%/year during 1953–80. [1] The inset in the following chart illustrates this by overlaying two exponential curves representing the two slower declines before and after; the antibiotics era is in between, where the blue line rapidly falls from the higher curve to the lower:
Just based on this timing, it’s not unreasonable to estimate that antibiotics alone were responsible for a decrease in the mortality rate of something like 5.4%/year for 15 years, or an overall reduction of over 56%.
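A quick sketch of that compounding arithmetic: the 5.4%/year figure is the excess of the antibiotics-era decline over the earlier baseline (8.2% − 2.8%), and a steady percentage decline compounds like interest. (The function name here is mine, just for illustration.)

```python
# A steady decline of r%/year over n years leaves (1 - r/100)**n of the
# original mortality rate; the overall reduction is the complement.

def overall_reduction(annual_pct: float, years: int) -> float:
    """Total percent reduction from a steady annual percent decline."""
    return (1 - (1 - annual_pct / 100) ** years) * 100

# Excess decline attributable to antibiotics: 8.2 - 2.8 = 5.4%/year,
# sustained over the ~15-year window 1937-52.
print(round(overall_reduction(5.4, 15), 1))  # → 56.5
```

That is, 0.946^15 ≈ 0.435, consistent with the "over 56%" figure in the text.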
What about vaccines? Vaccines eradicated smallpox, which was a leading cause of death in the 1700s, although we can’t know precisely how much it contributed to 18th-century mortality, since the disease had been greatly diminished by the time causes of death were widely and reliably recorded. A vaccine was also the solution for polio, which caused few deaths compared to many other diseases, but many cases of paralysis. For other diseases, though, vaccines mostly came along late in the game and mopped up what was left after improved sanitation and hygiene had done most of the work.
The other reason that vaccines don’t rack up as many points is that, by coincidence, they weren’t a good fit for the most common and lethal diseases. The flu has too many strains, and mutates too fast, for a highly effective vaccine; the BCG vaccine for tuberculosis also has varying efficacy (for reasons still not fully understood [12]). Pneumonia and gastroenteric diseases are opportunistic infections that can be caused by any of multiple types of germs; a vaccine only protects against specific germs, and so the vaccines we have for these diseases can only target the most common causes.
However, judged by effectiveness, vaccines score very well, having reduced morbidity for several important diseases by over 99% [13] (see original for notes and caveats):
Disease | Baseline morbidity | 1998 morbidity | % decrease |
---|---|---|---|
Smallpox | 48,164 | 0 | 100.0% |
Diphtheria | 175,885 | 1 | 100.0% |
Pertussis | 147,271 | 6,279 | 95.7% |
Tetanus | 1,314 | 34 | 97.4% |
Poliomyelitis (paralytic) | 16,316 | 0 | 100.0% |
Measles | 503,282 | 89 | 100.0% |
Mumps | 152,209 | 606 | 99.6% |
Rubella | 47,745 | 345 | 99.3% |
Congenital rubella syndrome | 823 | 5 | 99.4% |
Haemophilus influenzae type b | 20,000 | 54 | 99.7% |
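The percent-decrease column follows directly from the other two: (baseline − 1998) / baseline. A minimal sketch reproducing a few rows of the table:

```python
def pct_decrease(baseline: int, recent: int) -> float:
    """Percent decrease in annual cases from baseline to recent."""
    return (baseline - recent) / baseline * 100

# A few rows from the table above [13]:
for disease, baseline, recent in [
    ("Pertussis", 147_271, 6_279),
    ("Tetanus", 1_314, 34),
    ("Mumps", 152_209, 606),
]:
    print(f"{disease}: {pct_decrease(baseline, recent):.1f}%")
# → Pertussis: 95.7%, Tetanus: 97.4%, Mumps: 99.6%
```

Note that measles rounds to 100.0% in the table even though 89 cases remained, since 89/503,282 is less than 0.05%.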
In my previous survey of anti-disease technologies, I also mentioned antiseptics and sterilization. Where do these techniques show up? Primarily in the hospital, it turns out. These techniques are critical for surgical and maternal mortality, reducing post-operative infections and childbed (aka “puerperal”) fever. But diseases of surgery and childbirth are relatively rare compared to those of everyday life, and so they don’t figure prominently in overall mortality rates.
The bottom line is that sanitation—pest control, water filtration and chlorination, safe sewage disposal, milk pasteurization and other food safety, and public education about general hygiene—probably did more than anything else to reduce mortality rates, if only because these techniques were available decades, and in some cases centuries, before anything else. Antibiotics were dramatically effective when they were finally introduced, but by this point a lot of the work had already been done. Vaccines too were extremely effective, but merely delivered the coup de grace for many diseases. Other techniques, while very important in limited spheres, simply addressed problems that were too small to show up on any of the top lists.
What strikes me about all this is a similar pattern to what I’ve previously written on science & the Industrial Revolution:
- A naive or cursory look at the history gives a simplistic account: Medicine reduced disease! Science saves lives!
- A closer look reveals that disease mortality was dropping long before antibiotics or vaccines. So (some hastily conclude) medicine didn’t really matter after all—so much for better living through science!
- An even closer look shows that actually, the germ theory led to sanitation and hygiene improvements decades before we had specific treatments. So, as with the steam engine, it turns out science was relevant, just not in the obvious first place one might look.
- Finally, the galaxy-brain take looks not only at direct influences but indirect/cultural ones: The Scientific Revolution led to new ways of experimenting and collecting/analyzing data that led to practical improvements (in waste disposal and insect control) long before we had a fundamental scientific theory.
Thanks to Tyler Cowen, Matt Bateman, Andrew Layman, Sean Pawley, Ben Landau-Taylor, and Michael Goff for reviewing drafts of this post.
References
- Armstrong, G. L., Conn, L. A. & Pinner, R. W. Trends in Infectious Disease Mortality in the United States During the 20th Century. JAMA 281, 61–66 (1999).
- McKeown, T., Brown, R. G. & Record, R. G. An interpretation of the modern rise of population in Europe. Population Studies 26, 345–382 (1972).
- Riley, J. C. Insects and the European Mortality Decline. The American Historical Review 91, 833–858 (1986).
- Cutler, D. & Meara, E. Changes in the Age Distribution of Mortality Over the 20th Century. http://www.nber.org/papers/w8556 (2001) doi:10.3386/w8556.
- McKeown, T., Record, R. G. & Turner, R. D. An interpretation of the decline of mortality in England and Wales during the twentieth century. Population Studies 29, 391–422 (1975).
- McKeown, T. & Record, R. G. Reasons for the decline of mortality in England and Wales during the nineteenth century. Population Studies 16, 94–122 (1962).
- Szreter, S. The Importance of Social Intervention in Britain’s Mortality Decline c.1850–1914: a Re-interpretation of the Role of Public Health1. Social History of Medicine 1, 1–38 (1988).
- Burström, B., Macassa, G., Öberg, L., Bernhardt, E. & Smedman, L. Equitable Child Health Interventions. American Journal of Public Health 95, 208–216 (2005).
- Preston, S. H. American Longevity: Past, Present, and Future. (1996) doi:10.2139/ssrn.1824586.
- Cutler, D. M. & Miller, G. The Role of Public Health Improvements in Health Advances: The 20th Century United States. http://www.nber.org/papers/w10511 (2004) doi:10.3386/w10511.
- Beaver, M. W. Population, Infant Mortality and Milk. Population Studies 27, 243–254 (1973).
- Dockrell, H. M. & Smith, S. G. What Have We Learnt about BCG Vaccination in the Last 20 Years? Frontiers in Immunology 8, 1134 (2017).
- CDC. Achievements in Public Health, 1900–1999: Impact of Vaccines Universally Recommended for Children–United States, 1990–1998. MMWR 48, 243–248 (1999).
https://www.goodreads.com/book/show/57906379-extra-life tackles similar questions and claims that fertilizers, sanitation, and vaccines each saved billions of lives. See also:
https://www.workersliberty.org/story/2022-01-25/20000-days-history-life-expectancy