Without the modern tools of surveillance, or the ability to develop a national vaccination campaign, local health departments were often on their own in preparing for and combating the spread of the disease during the influenza epidemic of 1918. This article reviews the state of public health before the epidemic, seeking to place the reaction to the disease in the context of the evolution of public health. The epidemic struck at a critical time in the history of the nation and of public health, and we must explore not only the tools and technologies that were available to practitioners at the time, but also the authority provided to local and state public health practitioners to apply these tools. Much of public health was rooted in the experiences and practices developed over the previous century in responding to often dramatic outbreaks of cholera, yellow fever, typhoid, and a host of other infectious diseases.
The order from the Archdiocese of Chicago was clear: “In compliance with the directions of the state and city Departments of Health,” it began, “. . . all evening services are suspended. . . . The faithful may visit the Church during the day for private devotion,” the directive continued, “but there are to be no public devotions in the afternoons or evenings.” No Masses were to last more than 45 minutes, long sermons were prohibited, and Sunday instruction was to take no more than five minutes. Between Masses, “. . . the church is to be thoroughly ventilated for ten to fifteen minutes while the people are out of the building” and was to be cleaned with disinfectants; parishioners who coughed, sneezed, or were “showing indications of having contracted the disease” were asked to leave. Confirmations and other ceremonies were suspended. Finally, the church members were asked to “. . . commend to the prayers . . . particularly the children, the speedy recovery of all those on whom sickness has laid its hand and the early termination of this epidemic.”1 The church was not alone in drastically altering its practices and routines during the epidemic of “Spanish flu” in the fall of 1918. Throughout Chicago and the nation, state and local health departments promoted voluntary and compulsory activities to lessen the impact of the epidemic. In many cities, movie theaters and vaudeville halls were closed or dramatically altered their schedules. Work schedules were staggered to lessen crowding on public transportation. Ordinances were passed to limit spitting. In Chicago, Health Commissioner John Dill Robertson sought to educate the public about the new sanitary practices dictated by the recent discoveries of bacteriology: “Many people cough or sneeze in their hands; then, without washing, shake hands with others, thus passing to their friends the germs they had in their noses and throats.”2
This article reviews the state of public health before the influenza epidemic of 1918, seeking to place the reaction to the disease in the context of the evolution of public health in the previous two centuries. The epidemic struck at a critical time in the history of the nation and of public health and, to grasp the meaning of the epidemic, we must explore not only the tools and technologies that were available to practitioners at the time, but also the authority provided to local and state public health practitioners to apply these tools. Much of public health was rooted in the experiences and practices developed over the previous century in responding to often dramatic outbreaks of cholera, yellow fever, typhoid, and a host of other infectious diseases.
By the time the epidemic struck, a number of characteristics of the American response to disease had emerged. First, the use of sanitarian techniques of cleaning and altering the physical environment through street cleaning, removal of refuse, provision of clean water, removal of waste through public sewer systems, removal of night soil and regulation of privies, and inspection of milk and other food products had been established as legitimate activities of the state through local health departments. Further, the identification of the ill through surveillance, voluntary and legally enforced quarantine, or isolation had also gained widespread legitimacy. Finally, the growing authority of the expert and the state gave the public health department a new status that allowed it to institute measures that in earlier periods of American history might have faced stiff resistance from individuals, commercial enterprises, and community institutions.
Despite a lack of understanding of the virology of the epidemic or its biological “cause,” public health practitioners were active in finding means to limit the impact of the disease. In fact, the activities of the practitioners on the ground—quarantine, isolation, public propaganda, warnings, anti-spitting campaigns, legal restrictions on commercial activities, inspection, surveillance, and mandated (often public) identification and (perhaps) stigmatization—were all employed. With no ability to see the virus and no vaccines available to prevent its spread, the public health community's ability to fight the epidemic depended on its moral, political, and legal authority.
To understand the response made to the epidemic, we need to situate it within the context of America's experience with disease. We begin by tracing the broad development of public health practice and the epidemiology of disease in urbanizing and industrializing America.
In the United States, as elsewhere, the health of urban populations has been shaped by the shifting nature of the country's economic, social, and political life. From early in U.S. history, health status and social development have been intimately connected. In the 16th and 17th centuries, most Americans lived in rural areas. Many studies of Colonial New England written in the 1970s reveal an extraordinarily successful experience with disease as measured by available statistics on average length of life, population growth, infant mortality, and other measures. According to Philip Greven3 and other colonial historians, men living in the first Andover settlement and elsewhere lived into their sixties, seventies, and eighties.4 The white population increased from fewer than 25,000 in 1700 to more than 5.3 million a century later as infant and childhood mortality plummeted.5
In Virginia, by contrast, yellow fever and malaria, both mosquito-borne diseases, contributed to high morbidity and mortality. While these diseases were both widely reported in 17th-century New England, their impact on the colonists of Jamestown was inordinately greater. The first Virginia colonies were plagued by starvation that led to susceptibility to malaria, yellow fever, and other epidemics. Early death, constant infirmity, and infertility marked the experiences of these settlers. The differing experiences with disease in New England and Virginia were probably related to the distinct social and economic bases for these colonies. The New England colonies were settled by families seeking to establish stable, self-sufficient communities based on sustainable agriculture. The Virginia colonies, in contrast, were settled largely by a male population. The dearth of women among the first generations of colonists, and their inability to establish successful economic and social institutions, led to an inability to avoid early death and susceptibility to epidemic diseases.6–8 According to Herbert Klein, “Although figures are partial . . . there now seems to be general agreement that the crude death rate in the Southern Colonies in the 17th century was in the range of 40 deaths per 1000 residents, compared with a range of 20 deaths per 1000 in rural New England for a population of roughly the same age structure.”5
Similarly, differing populations' experiences with health and disease reflected the social circumstances that defined their position. As David Jones has noted, “. . . the decimation of American Indian populations that followed European arrival . . . was one of the most shocking demographic events of the last millennium,” with the Native American population declining “. . . by as much as 95 percent. . . .” But, in contrast to others who argue that this was a seemingly inevitable event brought on by the exposure of “virgin” populations to smallpox and other previously unknown diseases, Jones argued that without acknowledging the role of European expansion, we naturalize what is otherwise a very unnatural disaster.9 The destruction of the Native American population in the Northeast has been linked to their vulnerability in the face of massive imperial expansion of Europeans, while the high mortality rates among slaves subjected to the Middle Passage were clearly linked to the massive involuntary subjugation of Africans. Likewise, the relationship of the imperial experience to extraordinary mortality in the Caribbean is well documented.10
The early experience with disease was altered by the transformation of the economy, transportation, and demography during the 18th and early 19th centuries. By the end of the 18th century, an extensive commercial economy combined with a growing, increasingly urbanized and poor population to make epidemic diseases a much greater threat. Epidemic disease, once a local phenomenon circumscribed by the relative lack of mobility among self-sufficient and isolated rural communities, began to sweep through the nation along the well-established trade routes as the nation's boundaries and population expanded, displacing local, relatively isolated farming economies that had predominated earlier.7,11 By the middle of the 19th century, the highly crowded and increasingly poor cities experienced death rates that were as high as those of European cities.12
Waterborne and airborne diseases began to sweep through poor communities in the growing cities and ports of the nation.13 Cholera, a disease that causes severe dehydration through acute diarrhea, had a dramatic and fearsome impact in cities from New Orleans to New York and Boston in 1832, 1848, and 1865.14 In the absence of sewer systems, pure water, systematic street cleaning, pure or fresh food or milk, and decent methods for preserving or freezing meats, diphtheria and whooping cough, typhoid, typhus, and any number of fevers and influenzas became constant threats to babies and young children in the filthy urban trading centers of the nation.15,16 By the second half of the century, death and disease rates in U.S. cities had increased substantially and Americans' average length of life was by then no better than that of Europeans.11
Along with crowding came a decided decline in the quality of life for many. In the nation's growing cities, it is estimated that there was one horse for every 10 to 20 residents, and each horse deposited between 30 and 50 pounds of manure and two quarts of fresh urine a day on city streets.17 In the largest cities, this meant literally thousands, and in New York, hundreds of thousands of horses carrying people, goods, and construction materials into and out of towns. As the railroad systems spurred the centralization of commerce in the cities of the East Coast and urban economies became dependent on moving goods from terminals and ports to distribution points and storage facilities, the horse population grew dramatically. By the second half of the century, according to public health reports from around the country, thousands of dead horses, goats, pigs, and cattle lay embedded in uncollected filth, often for days and weeks. The streets of Boston, Chicago, New York, New Orleans, and other growing communities were filthy with accumulations of manure from the horses that traversed the area, as well as dead dogs, cats, and rats, and household and vegetable refuse. In some cities, public health officials estimated that in winter, refuse accumulated to depths of two to three feet.18
Every aspect of a city's life was affected by the disruption of the urban environment. Moving around streets that rose between two and three feet during the winter months because of the accumulation of thousands of pounds of organic waste was immensely difficult. The wealthy could sometimes escape epidemics by moving to enclaves farther from the commercial heart of cities, into housing with high stoops that allowed them to enter their houses on the second floor, literally rising above the filth. But the poor had little choice but to stay downtown, often in dank basement dwellings, close to the commercial districts, ships, and ports that provided them with jobs.18 Throughout the 19th century, the once relatively integrated urban environments where shopkeepers lived above their workplaces and laborers lived in or near their homes, were replaced by cities segregated by class, ethnicity, and race.19,20 Merchants and physicians alike complained that “pestilential diseases” laid bare “the impotence of the existing sanitary system.”21 Disease, increasingly, could be measured in dollars and cents. Infectious disease not only devoured the lives of the poor, but also threatened anyone who had a maid, employed a laborer, or came in contact with people on the street.18 The public health model fundamentally accepted the inevitability of the broad social and economic disruptions. But it sought ways to reconcile the empirical relationship between disease outbreaks and the diseased by imposing order through isolation, quarantine, and physical distancing of the healthy and sick.
The attention paid to the “conditions of the poor” was not new in America's cities. In Boston in 1850, Lemuel Shattuck published a devastating portrait of the health of the poor, the “Report of the Sanitary Commission of Massachusetts.”22 In New York City, John Griscom published Sanitary Condition of the Laboring Population of New York (1845).23 But by the post-Civil War period, there was a generalized sense that poor health was becoming a permanent aspect of urban life and that the poor were a threat to others as well as themselves. Not only was communicable disease obviously spreading between classes, but those most susceptible now appeared the most physically threatening. According to a report by the Citizen's Association of New York, “The mobs that held fearful sway in our city during the memorable out-break of violence in the month of July, 1863, were gathered in the overcrowded and neglected quarters of the city. . . . The high brick blocks and closely-packed houses where the mobs originated seemed to be literally hives of sickness and vice. . . . It was wonderful to see, and difficult to believe, that so much misery, disease, and wretchedness can be huddled together and hidden by high walls, unvisited and unthought of, so near our own abodes. . . . To walk the streets as we walked them, in those hours of conflagration and riot, was like witnessing the day of judgement, with every wicked thing revealed, every sin and sorrow blazingly glared upon, every hidden abomination laid before hell's expectant fire.”18
Amid this atmosphere of alarm over the “conditions of the poor,” civic leaders throughout the nation launched major investigations into the social, environmental, and individual causes and consequences of disease, and began to pave the way for the public health movement and its advocacy of improved sanitation in the United States. In Chicago, social reformers in Hull House focused on living conditions as the reason for the declining health and well-being of workers, women, and children. In Boston, charity workers looked at the slums in which the Irish lived as the cause of social unrest and the spread of disease. In Philadelphia, New York, and Boston, reformers focused on housing as a cause of the city's physical, social, and moral decline. The observation that housing, politics, morals, and health were all intertwined underscored reformers' perceptions of what needed to be done in the coming decades.
Underlying the social geography of disease documented by sanitary investigations were urban economic development and land use patterns that had created some of the world's worst crowding and most depressing health statistics. Although epidemics were relatively minor contributors to overall death rates, the highly visible and often dramatic experience of seeing people literally dying in the streets had an enormous impact, affecting where and how cities developed. In the late 18th century, yellow fever had caused elites to flee from cities to relatively distant suburbs, beginning a spatial segregation of the rich and poor that would develop over the next century.11,24
The “sanitarians” who led reform efforts generally saw themselves as more than technical experts or professionals trained in a specific skill. Some had come from elite merchant families and others had been trained in the ministry.23 They defined their mission as much in moral as in secular terms and believed that illness, filth, class, and disorder were intrinsically related. Individual transgression and social decay were equally at fault for poor health, they posited. In this period before the widespread acceptance of the notion that there was a specific pathogen for a particular disease entity, public health workers, medical practitioners, and lay people alike understood disease in highly personal and idiosyncratic terms. Much of public health practice as well as medical therapeutics rested on the belief that disease was a reflection of individuals' special social, personal, hereditary, and economic circumstances. Maladies were based, in part, on the peculiarities of the individual and his or her life. The special relationship between an individual and a complex, highly particularized environment was at the root of illness.25
With the turn to bacteriology that followed the discoveries of Louis Pasteur, Joseph Lister, and Robert Koch in the later decades of the 19th century, a new faith in laboratory science emerged among physicians and public health workers. “Bacteriology thus became an ideological marker, sharply differentiating the ‘old’ public health, the province of untrained amateurs, from the ‘new’ public health, which belonged to scientifically trained professionals,” Elizabeth Fee points out.26 Despite the different professional mandates of public health workers and physicians, those who identified themselves with the science of medicine and public health began to share a common faith in the significance of the disease-specific germ entity. A new model was gaining greater acceptance: a bacillus made people sick. The slums of large cities were “breeding grounds” that were “seeded” with tuberculosis bacilli waiting to infect the susceptible victim. Tuberculosis was understood as a disease that could be transmitted to susceptible individuals by means of air impregnated with bacteria from dried sputum or breathing. The dusting of furniture could throw into the air the “dried sputum” of tuberculars. Crowded public spaces and unclean homes with moist, warm, and stagnant air were seen as the most likely conduits for the disease.27,28
A vibrant urban reform effort fundamentally transformed housing conditions for city dwellers in the decades at the turn of the century.29 New housing was more likely than not to have indoor plumbing and connections to the new water and sewer lines that were replacing wells and privies.30 Tenement laws were passed that mandated that all rooms in newly constructed buildings have windows that opened to the outside so that relatively fresh air and light could filter in. Restrictions on housing density and new nuisance laws that required property owners to keep their properties clean began to have an effect on rates of tuberculosis and other devastating diseases. Laws requiring the refrigeration of foodstuffs, meat-inspection laws, pure-milk laws, and the like began to be reflected in improvements in overall health, and with these changes came a decline in the prevalence of infections of all kinds. Outbreaks of smallpox and influenza still occasionally hit communities, but (with the exception of the 1918–1919 epidemic) seemed to wane in intensity and duration.31 Former scourges such as cholera and some diarrheas subsided as major threats during summertime.31,32
Despite decades of agitation, however, the nation still faced daunting environmental hazards and public health practices remained essentially the same, even as the germ theory gained intellectual cachet. In Richmond, Virginia, for example, a survey of the city revealed that as late as 1908, there were still an estimated 5,000 houses without sewers or public water supplies. Backyard “dry closets” were ubiquitous throughout the city and the sporadic, uneven, and prohibitively costly removal of human waste meant that it often overflowed into common areas where children played, laundry was washed and hung, and wells for drinking water were located. Continued reliance on the horse for the transport of goods and people, standing pools of water where mosquitoes bred, and animals that roamed the streets all posed serious threats to the health of the population. Further, while the health department was developing inspection systems to guarantee the safety of the food chain, farmers sought to avoid regulations to lower their costs. For example, “swill,” the waste materials from local breweries, was fed to cows, leading to impure milk, sometimes “thickened” with chalk powder, through which a host of diseases could be carried.33 In 1912, for example, New York's Public Health Department issued an annual report that, in dispassionate language, detailed the continuing environmental problems that New Yorkers faced. The Department of Health picked up more than 20,000 dead horses, mules, donkeys, and cattle from the city's streets during the year and recorded 343,000 complaints from citizens, inspectors, and officials about problems ranging from inadequate ventilation and leaking cesspools and water-closets to unlicensed manure dumps and animals kept without permits. It also removed nearly half a million smaller animals, such as pigs, hogs, calves, and sheep.34,35
While such environmental hazards had by then become familiar, somewhat startling to officials was the emergence of changing patterns of death in the city. The infectious diseases of the 19th century, such as smallpox, typhoid fever, diphtheria, and pulmonary tuberculosis, appeared to be claiming fewer and fewer children and young adults. But cancer, heart disease, and pneumonia were claiming larger and larger numbers of the elderly. To public health officials, these findings were significant because, on one hand, they showed measurable progress in the battle against infectious diseases. But on the other hand, the statistics suggested the need to broaden the focus to reduce mortality from diseases increasingly associated with middle and old age and industrialization. Vaccination against smallpox, diphtheria antitoxin, and Salvarsan were harbingers of a future that promised medical interventions that could eliminate the old scourges of epidemic disease. Bacteriology held the promise that the discovery of a particular pathogen could lead to a specific medical intervention that would cure the disease by destroying the agent that caused it.26 The early 20th century began with enormous optimism about personal interventions that could eliminate the age-old scourges of the human race.
If the 19th century was marked by an intensive use of land and a concomitant crowding in tenements of poor, often malnourished, and powerless immigrants in the cities of the East Coast, the new century saw fundamental change in land use and transportation that improved health in many respects but created new hazards and new diseases for urban America. In particular, the rapid growth of the Midwestern industrial cities in the late 19th and early 20th centuries created distinct housing patterns and new types of problems that, during the latter decades of the century, preordained the creation of chronic diseases. Exposures to toxins and synthetic materials; an increasing life-span; air, water, and soil pollution; and the creation of a huge marketing industry that promoted toxic materials for consumer uses—such as lead paints and tobacco—led to an epidemiological revolution as older infectious diseases declined in importance and major killers and new chronic conditions emerged. If epidemics were a hallmark of the crowded, centralized cities of the East Coast during the 19th century, then cancers and other chronic conditions became the paradigmatic conditions that plagued those in the 20th century.36
Better nutrition, housing reforms, the introduction of pure water supplies and sewer systems, and improvements in street-cleaning technology led to a generally cleaner, more sanitary urban environment for children. The horse was replaced by the electric streetcar and trolley in the 1890s and the automobile in the early 1900s, leaving the city streets looking and smelling better. The numerous granaries needed for the maintenance of hundreds of thousands of horses in urban communities began to disappear, making it easier to control the huge rat and rodent problem that was linked to the spread of lice and tick-borne diseases.32 Similarly, the creation of public health stations that provided pasteurized milk, settlement houses that provided emergency shelter, visiting nurses, and educational programs for mothers and their children also improved the chances of childhood survival. The development of maternity hospitals as well as pediatric and foundling hospitals further improved conditions for children.37 While infectious diseases continued to be important problems, particularly for the poor living in large cities, the causes of death as noted in various reports from the first half of the 20th century slowly shifted to chronic conditions, many of which appeared associated with new lifestyles and the new industrial and consumer economies.
The transformations occurring in public health departments were paralleled by a transformation of the political institutions that governed an increasingly urbanized America. By the turn of the 20th century, the United States had emerged as among the most highly industrialized countries in the world. An agrarian nation only a few decades earlier, the United States now had about half of its population residing in cities. The Progressive movement, the broad political reform effort that sought to “clean up” urban politics and to create a more efficient and effective political bureaucracy, saw in the health department a potential model for professional control of critical public functions, from street cleaning to inspection of the food and water supply. Hence, the decades before the influenza outbreak were marked by dramatic improvements in the nation's health and, with it, a growing stature for medical professionals.32 Ironically, the successes of public health officials in stemming disease soon fed their decline in status as clinical medicine gained in stature and public health practitioners found themselves increasingly marginalized within the political system. The epidemic struck at the zenith of public health authority.
Between 1880 and the epidemic, the death rate from tuberculosis was cut in half, the death rate from diphtheria declined substantially, and the effects of cholera, yellow fever, malaria, smallpox, and a host of other dreaded diseases were reduced substantially or even vanished from some communities. The bacteriological “revolution” created a practical and ideological struggle over disease, how it would be addressed, and who would be in control. Those in the forefront of this new “scientific” model saw in it a means of avoiding the longstanding dangers of older sanitarian models that linked disease to the special circumstances of poverty, poor housing, social dislocation, and even capitalist exploitation. For others, primarily elite practitioners, the improvements in health status reflected the new understanding of the nature of disease and, specifically, the germs that produced it.26 But for still others, public health's success was rooted in its alliance with social reform movements, citizen action, and practical environmental reformation of living conditions, child labor laws, regulation, and inspections. Public health practitioners' tools were empirical, based on a long history of experience with contagions and the means to address the perceived relationships among poverty, filth, and disease spread.38
The history of disease and public health was reflected in the activities surrounding the influenza epidemic of 1918. At first, many administrators sought to allay concern by denying there was any reason to worry. In Minneapolis, for example, the city's health commissioner, Dr. H.M. Guilford, calmly pointed out that “Spanish influenza does not exist in Minneapolis and never has,” while at the same time noting that “. . . it probably will reach here during the fall.”39 Similarly, in Newark, New Jersey, local officials initially seemed unconcerned, despite the fact that Fort Dix, one of the epicenters of the disease, was close by.40 Even in New York City, home of the most prestigious of local health departments, the initial reaction appeared aimed at calming, rather than alerting, the public about the possible dangers from the approaching epidemic. After “. . . Eleven more cases of Spanish flu, or whatever it is, were reported at Quarantine yesterday,” The New York Times announced, “. . . the board proclaimed that ‘this was done as a precautionary measure and not because the board believed there was the slightest danger of an influenza epidemic breaking out in New York.’”41 Public health officials tried to reassure the populace: “Many thousands of cases resembling influenza have occurred in Spain and Germany,” the department spokesman reported. But “. . . on this side of the Atlantic . . . not a single death resulted.”42 In Atlanta, Boston, and Philadelphia—all cities that would experience the impact of the disease acutely—officials seemed intent on assuaging fears as they prepared to combat what would be a swift-moving, but devastating disease.43
While calming the public seemed to be the first official reaction, in fact, many communities did initiate efforts to stem the disease, albeit with differing degrees of enthusiasm and initiative. The Minneapolis health department quickly isolated patients in the City Hospital, suspending visiting in general and establishing an isolation ward for influenza patients in particular. As the epidemic approached, a de facto division of care developed as the private hospitals took the non-influenza patients and the City Hospital became the influenza center. Further, the health officer requested the city's physicians to begin a voluntary program reporting suspected cases and asked those stricken by the disease to voluntarily avoid contact with others. More generally, as happened in many communities around the country, public health authorities called for voluntary efforts to avoid crowds and public gatherings.44,45 Across the river in St. Paul, city authorities depended less on voluntary compliance and more on the power of the state: plans were made to close schools, churches, and places of amusement.46
By September, outbreaks of influenza among U.S. Army and U.S. Navy personnel were reported by newspapers in Boston, Chicago, Newport (Rhode Island), and Detroit, and among the public in ports and cities throughout the country. By mid-September, 30,000 cases were reported nationwide and the epidemic had visited 26 states.47
In New York, Herman M. Biggs and William H. Park developed public health laboratories to produce vaccines and antitoxins to control smallpox, diphtheria, and other infectious diseases while going to court to incarcerate Mary Mallon, widely known as “Typhoid Mary,” for nearly three decades. In Massachusetts, the department of health developed screening programs to identify bacterial contamination of the milk and water supply while simultaneously passing factory regulations aimed at forcing manufacturers to develop safer and cleaner workplaces. And in public health texts and classrooms in the new schools of public health and hygiene that were being organized in Baltimore, Boston, and New York, public health students learned of new staining techniques used to identify strains of bacteria while at the same time learning how to install water pipes and septic tanks.
Public health crusaders such as Charles Chapin of the Providence, Rhode Island, Department of Health and Haven Emerson of Columbia University published widely, calling for a new orientation that would transform public health practice into a new, efficient, science- and laboratory-based profession. In Chicago, Chapin spoke to the public health community a few months before the epidemic first struck and laid out new models of scientific and economic efficiency that “. . . measured the importance of various health department activities, and . . . the money and effort which should be directed to each.” Every health officer, he pronounced, “. . . is confronted with the problem of what is worthwhile and what will bring the greatest returns for the money and effort expended. In other words, what are the fundamentals.” Chapin argued that progressive thought demanded an efficient public health establishment that focused on finding the germ, not reforming the environment that made people sick.2
For Chapin and many others, the fundamentals had been transformed over the previous two decades from sanitation and environmental control to bacteriology and point-specific interventions aimed at identifying the sick and controlling the infectious. Hibbert Hill, a public health official and author, summarized the new rhetoric of public health in his provocative book, The New Public Health: “The old public health was concerned with the environment; the new is concerned with the individual. The old sought the sources of infectious disease in the surroundings of man; the new finds them in man himself. The old public health sought these sources in the air, in the water, in the earth, in the climate and topography of localities, in the temperature of soils at four and six feet deep, in the rise and fall of ground-waters; it failed because it sought them, very painstakingly and exhaustively, it is true, in every place and in everything where they were not.”48 As William Sedgwick, the eminent turn-of-the-century microbiologist, remembered, “. . . before 1880 we knew nothing; after 1890 we knew it all. It was a glorious ten years.”26 In the early years of the 20th century, the trick was to translate science into practice and to define what local health officials considered the “fundamentals.” For the most part, however, the everyday activities and tools of departments of health remained “quarantine, isolation, immunization and disinfection.”49
In city after city, although no vaccines, antitoxins, or other technologies were available, older, traditional tools such as quarantine, isolation, suspension of school and church meetings, and limits on public gatherings were used to lessen contacts that might lead to the spread of the epidemic.2 “The only special measures to be adopted to prevent crowding were the closing of places of public amusements,” pointed out Health Commissioner Robertson while lauding Chicago's “low death rate from influenza and pneumonia as compared with practically all eastern, southern and western cities.”50 In Richmond, the long history of racism and reaction had left the city in desperate straits by the summer of 1918.51 “No amount of forethought . . . could have prepared us for the tidal wave of disease and death that all but overwhelmed the city,” noted the commissioner of health.51 The city asked that all dances be postponed, and the residents were told to “. . . avoid crowds, streetcars and place[s] of poor ventilation.” In the coming weeks, the department suggested closing churches and theaters and avoiding “unnecessary travel,” and even suggested that kissing “. . . should be stopped. . . .” The State Fair soda parlors were closed, and emergency hospitals were opened.51 In Minneapolis, the city hospital was effectively quarantined and specific wards were dedicated to influenza cases. Other private hospitals took on non-influenza patients, leaving the city free to turn the entire institution into a quarantine facility. St. Paul, across the river, planned to close schools, churches, and other gathering places. At the University of Minnesota, the administration postponed the opening of the fall semester. Streets were flushed and swept because of fears that “germs lurked in dust.” When school officials hesitated at closings, the health officer argued that he “intend[ed] to use the police force if necessary. . . .” When the epidemic did not abate, “. . . saloons, cafes, and soda fountains” and other forms of entertainment were banned.52
In general, despite the attention to the value of the germ theory in transforming the disease experience, the influenza epidemic appeared to confirm the value of traditional sanitary measures. For the most part, public health department personnel, as well as those involved in the direct care of patients, credited the isolation and distancing of disease victims, general sanitation, and the police powers of the health department with the short-lived nature of the epidemic and with the decline in death rates that followed these interventions. In 1920, shortly after the second epidemic of influenza had passed, the acting commissioner of the Tenement House Department of New York City wrote a brief note to Lillian Wald, then at Red Cross headquarters. He asked that “. . . all the forces in the City should be prepared to combat [another] epidemic” by preparing means “. . . to see that the general sanitary conditions of the City are maintained at a high standard.”53
Today, laboratory science has made enormous strides in identifying what were once the invisible viruses that caused the 1918 outbreak. Yet, for the most part, despite our advances, the basic means of addressing influenza remain much the same as they were nearly a century ago. Public health education, isolation, sanitation, lessening congestion, closures, and surveillance are essential tools. These elements of public health practice were developed over the long course of American history and have protected us in myriad ways. The Institute of Medicine has argued that public health is defined by “What we, as a society, do collectively to assure conditions in which people can be healthy.”54 This may seem like an overly broad, amorphous definition that can include virtually every human activity. Yet, as the history of public health shows, it is not necessarily inaccurate.
In his massive history of public health, George Rosen began his final two chapters on the bacteriological revolution of the previous half century with a short hypothetical: “A sniffling, coughing New Yorker who turns to a friend in the subway and says, ‘Gee, I have a virus!’ is expressing colloquially a theory of infection that has momentous, even revolutionary, and certainly unanticipated consequences,” he began. “Outstanding among these consequences is a virtual eradication . . . of communicable diseases. . . . Once-dreaded diseases . . . are a thing of the past.”
What accounted for this dramatic change? “These effects stem directly from the incontrovertible demonstration toward the end of the 19th century that specific microscopic creatures rather than vague chemical miasmas produce infectious diseases.”55 But the past was more ambiguous than Rosen and others would have it, and the nation's experience with the influenza epidemic was a major case in point. Coming as it did at the end of a decline in the importance of infectious disease, and at a moment when the nation expected the laboratory and public health to deliver on their promise, the epidemic was an unwanted reminder of the limits of the new public health. But the epidemic held another message: that the public health techniques of a previous era still had relevance and even the power to stem disease.