“[There is] a significant need for the discussion, from an interdisciplinary perspective, of the ways that infectious diseases have played a substantial role in shaping human societies and continue to pose a threat to their survival.”—Frank M. Snowden, Professor Emeritus of History and History of Medicine at Yale University[1]

Despite the modern definition of pandemic only becoming clearly established around the 1900s, pandemics and their less severe counterparts, epidemics, have troubled humanity for as long as we have had written records.[2] Used to describe a dangerous disease prevalent throughout an entire country, continent, or the whole world, the term pandemic and the diseases it describes have long been feared by the public as global disasters. Epidemics, diseases that affect many people at the same time, trigger equal terror in the minds of the public. While epidemics and pandemics describe different levels of the spread of a disease, both serve as instigators of panic in their affected communities. A list of the most notable, or infamous, outbreaks in history would be incomplete without the Black Death, Spanish Influenza, cholera, smallpox, and HIV/AIDS, but it is impossible to list every major disease outbreak. Even the diseases that, comparatively, did not become mass killers, such as polio, left painful scars on human history that will not soon be forgotten.[3] The historical impacts and influences of widespread disease on a societal level are often left unstudied; in particular, the other side of disease history remains an untold story—namely, the unsuspected positive outcomes of pandemic tragedies.[4]

To be explicitly clear, pandemics and epidemics are devastating in nearly every aspect despite varying in fatality—and they are perceived as wholly devastating by communities at large. There is a general consensus that large-scale epidemics are harbingers of prolonged calamity, at least until the disease in question is handled or ends naturally; Yale Professor Frank Snowden, in one of his interviews, even calls epidemics “devastating challenges to our humanity.”[5] However, the notion that epidemic and pandemic diseases have created only unmitigated disaster needs to be reevaluated. Disease outbreaks have in fact encouraged some surprisingly positive effects on human society. By delving into some of history’s notable pandemics and epidemics, a more complete picture of the impact that these disease outbreaks have on society can be seen, and with it the need to place more emphasis on the positives as well as the negatives of historical pandemics. To analyze the positive aspects of epidemics and pandemics in search of a generalizable pattern, this research will tilt towards a constructive perception of an outbreak. While it is undeniably important to reflect upon the devastating consequences of pandemics, the positive outcomes they have garnered should not be ignored in the grand scheme of history.

The Economics of Disease

Humanity generally views a pandemic as a guarantor of future economic recession, depression, and even further disaster. The International Monetary Fund, for example, notes that “infectious diseases … such as influenza, fluctuate in pervasiveness and intensity, wreaking havoc in developing and developed economies alike.”[6] Though this assessment is not wrong, a pattern of relative economic benefit has still emerged from such plagues and diseases, one of the most well-known examples being the Bubonic Plague.

The Bubonic Plague, often referred to as the Black Plague or Black Death, devastated global society, caused mass panic and fear, and at some point in history has struck nearly every part of the world; for many, the Black Plague is the epitome of a pandemic and remains firmly rooted in history as one of the worst diseases to have ever existed. The plague’s most notable outbreaks occurred in three waves, beginning in 541, 1347, and 1855, but the mass 1347 outbreak in Europe was by far the most destructive and is the most well-known—in fact, the term Black Death is most frequently used now to refer only to this period.[7] This outbreak is famous for its devastating elimination of much of the European population: from 1348 through 1350 alone, Europe lost, even in the most conservative estimate, one-fourth of its people to the plague.[8] Yet despite its disastrous consequences and understandably infamous reputation, it cannot be ignored that a significant portion of the population fared far better economically in the period that followed this catastrophe.

         The claim that the lower class of the medieval era benefited economically from the plague is not an unfamiliar one. With casualties in the hundreds of millions, the death toll of the plague produced greatly varied economic consequences, but perhaps the most immediate effect was the skyrocketing price of goods due to extreme shortages in the labor force.[9] As a result of these shortages, the value of the workers themselves increased, leading to rapidly rising wages: wages in Paris, for example, quadrupled in just four years, from 1351 to 1355.[10]

While workers benefited from these increased wages, many governments of the time desperately attempted to push the wages of lower-class workers back down through various approaches. One notable example was Castile’s introduction of a new policy threatening anyone who did not trade goods at lower, pre-plague prices with corporal punishment of up to sixty lashes. Such stringent measures employed by many monarchs and ruling classes, however, were most likely futile in stopping wages from rising. As one Florentine decree in the midst of the plague stated, “while many citizens had suddenly become the poor, the poor suddenly became rich.”[11] Although some historians, such as A. R. Bridbury, are skeptical of the claim that the Black Death depleted the workforce to such an influential degree, others, including Samuel Cohn, allow the possibility that the labor force made at least “modest gains” through the rise in wages.[12] In addition to increased wages, many survivors likely gained wealth through other mechanisms as well: many inherited great amounts of land or wealth from deceased relatives, which came as another form of economic relief.[13]

Despite such relief, it is also crucial to point out that not all people may have received such opportunities for economic growth; there were great differences in the extent to which cities and rural communities benefited.[14] A majority of agricultural communities with relatively small populations migrated en masse into cities, leaving their former lands deserted. Many of the migrating peasants were offered work under the relatively prosperous townsmen and lords in cities.[15] Yet the deserted lands incentivized landlords to reorganize the overarching structure of the economy in these regions, focusing on capital-intensive agriculture, which later resulted in rural areas transforming into proto-industrial economies.[16] In both cities and rural communities, however, many landlords implemented a variety of assistance programs for the lower class. Though there are conflicting explanations as to why the landlords were willing to lower or even excuse rent, one suggestion is that their tenants, mainly laborers, threatened simply to leave, incentivizing the landlords to create policies that would retain as many of their tenants as possible.[17] Such incentives came in the form of rent exemption for laborers for at least two years and relatively low rents for the remainder of the century.[18]

While scholars differ on the extent to which the lower classes benefited economically after the Black Plague, it appears that even in the midst of such adversity and disaster, the lower classes were not completely destroyed and in fact benefited at least moderately in the wake of the second wave of the Black Plague, which began in 1347.

         Probing further into the surfacing economic benefits of pandemics, it is not difficult to find a more recent historical case of economic development emerging from a crisis. Smallpox, a disease of unknown origin that can be dated back to the fourth century, has troubled humanity repeatedly, playing crucial roles from the 16th century onwards and affecting almost every continent.[19] But the virus, described by historian Bob H. Reinhardt as a “fearsome killer,” more recently caught President Lyndon B. Johnson’s—and the world’s—undivided attention once again in the 1960s.[20] The Center for Disease Control (CDC) launched the Smallpox and Measles Eradication Program (SMP) in 1966, and the World Health Organization (WHO) promptly followed with its own Intensified Smallpox Eradication Program in 1967.[21] The exhaustive efforts of both the United States and the international community resulted in the eradication of the virus in 1980.[22]

         The mechanism by which the virus encouraged economic benefit seems simple: humanitarian aid and economic development. Though its motives may have been questionable, the United States diverted significant funding towards stemming the smallpox crisis in many underdeveloped countries in Africa and, over the course of eradication, encouraged natural economic development in these nations. America’s justification for dedicating such effort to stamping out the virus, however, is not without debate. Its intentions are most likely explained by what Reinhardt deems Washington’s adoption of a “non-communist manifesto,” which largely governed the country’s economic foreign policy at the time.[23] The United States aimed to intercept the infiltration of communist ideals into then third world countries and instill American-style capitalism instead.[24] However, such concealed motives in the United States’ approach to foreign policy do not change the fact that smallpox and its crisis most likely stimulated the foreign aid and development that led to greater economic prosperity in the receiving countries.

         Even prior to Johnson’s administration and its stance on smallpox, the United States Agency for International Development (USAID) had already been pouring billions of dollars in aid into countries escaping colonialism in the early 1960s. Unsatisfied, President Kennedy pointed out that such methods were “bureaucratically fragmented, awkward and slow.” A revolution in approach, however, came under Johnson’s administration, as he believed in a strong connection between international health and economic development.[25] The United States, working in close conjunction with the WHO, had plans to lift people out of poverty in less economically developed regions, and one tactic in particular reflects Johnson’s belief in the link between healthcare and the economy: the eradication aid programs funneled money into epidemiological training for local health officials and workers.[26] Some of these African vaccination assistants later worked in the national ministries of health in their own countries to further their training and research, advancing both economic and medical development in some African nations. The smallpox virus provided a practical crisis to be solved, and such tangible triumph—defeating a killer virus—brought attention, focus, and enthusiasm to humanitarian efforts conducted by the United States.[27] This sense of urgency, likely drummed up by the smallpox crisis, contributed greatly to the trend of donation and humanitarian aid at the individual and government level, as the Smallpox Eradication Program received a total of 98 million dollars from international donors. This trend has continued, with a 126-billion-dollar increase in global foreign aid donations from the 1960s to 2016.[28]

Yet some historians remain doubtful of the extent to which the smallpox eradication efforts provided any horizontal aid, meaning domestic infrastructure and long-term solutions. Reinhardt, for example, points out that the CDC and WHO focused more on the “technical and logistical challenges of eliminating smallpox, not on developing local economies.”[29] However, as authors Lucy Page and Rohini Pande stress, invisible infrastructure, including healthcare systems and government stability, rather than simply hospitals and roads, is the true necessity in developing nations.[30] Given this, the notion that the smallpox pandemic brought positive attention to the underdeveloped world and provided some structural and economic development toward longer-term healthcare solutions cannot simply be dismissed, even if the motives of the nations involved also deserve reassessment.

Advancements in Medical Technologies and Strategies

         With the massive levels of illness and death that pandemics inevitably incur, perhaps a more expected benefit these diseases can encourage is medical innovation. In the face of an epidemiological crisis, people become incredibly eager for a solution, whether that solution be curative or preventative. Looking back on these innovations from a modern perspective, they can at times appear blatantly obvious, and such progress is therefore taken for granted. However, the medical innovations brought about by times of epidemic and pandemic necessitate a serious reexamination of the impact of a crisis, as such innovations not only provide a solution to the given crisis but also serve as a springboard for further humanitarian and medical revolutions.

         Not the least of these medical innovations is the development of epidemiology resulting from the London cholera outbreak. Cholera first arrived in nineteenth-century London in the midst of an industrial revolution, and early encounters with the disease spurred fear throughout the population that this was a return of the Black Death.[31] Europeans described cholera as a disease of society; others dubbed it the “blue death,” clearly hearkening back to the terror of the Bubonic Plague.[32] Alluding to one of the worst pandemics in human history further illustrates the contemporary belief that the cholera epidemic was yet another fiasco for all humanity.[33]

In fact, Britain’s attempts to contain the outbreak failed for decades, until the now-famous John Snow made his contribution to the study of the disease.[34] During the second outbreak, from 1829 to 1851, Snow traced cases of infection throughout the city, which led him to conclude that cholera was likely caused by the contamination of drinking water with sewage.[35] This discovery is still memorialized in London today, as a handle-less water pump remains in Golden Square partly in remembrance of John Snow’s contributions to the understanding of the disease.[36] The reason for his memorialization is quite clear: his methods of tracing and locating the source of the disease contributed to an entirely new field of study known as medical geography or medical cartography. This technique of logical correlation and statistical analysis has only been further reinforced and developed over time, as medical cartography has been used, and will continue to be used, to solve a diverse array of public health crises. Professor Candice Welhausen asserts, for example, that “as public health continues to move toward a global health perspective in the 21st century, understanding how mapping constructs and shapes knowledge about disease, illness, and health will become increasingly important.”[37]

Yet medical cartography alone is not the only reason to memorialize Snow; his work with cholera also contributed greatly to epidemiology and our modern understanding of the spread of diseases. Even up to the turn of the twentieth century, diseases were still largely understood as the result of exposure to poisonous, harmful auras, or miasmas, diffusing from the ground.[38] Such a narrative, one that characterized sickness as almost supernatural, neither aided the development of practical, logical solutions nor offered hope that illnesses could be resisted. Snow’s contributions, however, pointed only to logical causes of the disease outbreak, demonstrating that diseases could be traced and therefore prevented. For these advancements, he is credited with making great headway in founding and developing epidemiology as a legitimate field of study; Professor Wayne Melville goes so far as to call Snow the father of modern epidemiology.[39] Since these initial steps taken by John Snow, epidemiological techniques have only expanded further; as Harvard Professor Dade W. Moeller notes, modern techniques include examining “the effects of a variety of chemical and physical agents within the environment.”[40]

And Snow was not the only one to contribute to medical innovation during the time of cholera. Many physicians and chemists were spurred on by the crisis, leading to the development of intravenous fluid therapy and oral rehydration therapy to combat the actual symptoms of cholera.[41] Such advancements show that though cholera was, and still can be, an incredibly virulent disease, we should not underestimate the medical progress it likely catalyzed.

Cholera is also not the only disease that resulted in great medical advancement—in fact, many major disease outbreaks have led to some development in medical technique or understanding. Smallpox, which also served as a stimulant for economic development in some nations, is perhaps even better known for completely transforming the way we approach disease prevention by way of the vaccine. Given the virulence of smallpox, it is not at all surprising that medical innovations followed its multiple historical pandemics.[42] The smallpox vaccine began with inoculation, first popularized in England by Lady Mary Wortley Montagu, a smallpox survivor herself, after she observed the practice in Turkey, or what was then the Ottoman Empire.[43] In 1721, when Lady Montagu first returned to England, inoculations were highly controversial. The hesitance to implement inoculation on a broad scale largely stemmed from the relatively high death rate among inoculated patients, reaching one to two percent.[44] This changed completely when, in 1798, Edward Jenner introduced a modern form of vaccination, using cowpox to inoculate patients rather than the actual smallpox virus.[45] Despite having made vaccination much safer through the use of cowpox, Jenner’s technique involved arm-to-arm transfer of the live virus and once again incurred widespread objection.[46] Yet Jenner did not give up in his efforts to establish vaccination as a regular practice, and he, along with other physicians, created a vaccination institution to better implement it.[47]

Their lasting efforts are evidenced in contemporary sources, such as an 1804 issue of the Gentleman’s Magazine arguing that it is “impossible to preserve the child from [smallpox] except [by] inoculation.”[48] Even Napoleon himself viewed Jenner’s research as influential enough to negotiate and allow for the release of two Englishmen upon Jenner’s personal request.[49] The lasting significance of Jenner’s discovery and medical revolution can be explained simply by the fact that smallpox remains the only human disease ever eradicated. Though it is impossible to prove with any certainty, it seems that without Lady Montagu’s advocacy and Jenner’s response to the smallpox pandemic, freeze-dried vaccines, refrigeration, and air-powered jet injectors may have taken much longer to develop.[50] The critical role that vaccination technology now plays in modern medicine is demonstrated by something as simple as the seasonal flu, which unfortunately kills tens of thousands of people every year; our only real defense against it is vaccination.[51] The vaccination technology developed initially by Jenner saves lives every day, having transformed from a controversial technique into an essential medical technology.

Moreover, the smallpox virus itself still provides modern science with medical benefits even after its eradication, as samples of the virus remain in storage to this day in Atlanta and Moscow.[52] The argument for destroying these last remaining samples gained traction in recent years but was quickly dismissed on the grounds that the virus samples may still have utility, providing information for further study.[53] One possibility is that the molecular structure of the virus itself could be put to human use; in 1993, several scientists, including Wolfgang K. Joklik and Bernard Moss, petitioned against the destruction of the stores, arguing that “retaining the smallpox virus stocks in Atlanta and Moscow and studying in detail their molecular pathogenesis would be of enormous benefit to humanity.”[54] This argument for retention was further bolstered by an independent committee of experts from the WHO, which advised that the virus be kept securely in storage in the hopes that it will become the key to solving some future health crisis.[55] Even diseases themselves, studied at the molecular and biochemical level, can become opportunities for medical advancement.[56]

From cholera to smallpox, from medical cartography to the development of vaccines, this broad range of evidence indicates that dire catastrophes do often allow humanity to take a leap forward in advancing medical technologies and research. The destructive nature of the diseases that served as the stimulus for these advancements should not be ignored, but in the same vein, the benefits that they have brought about for future generations who will undoubtedly face further pandemic disease cannot be removed from the overall picture either.

Municipal Reform and Strategies Against Diseases

         Medical technologies themselves are not the only medical advancements often made during an outbreak or pandemic; a series of municipal reforms frequently occurs during such crises, suggesting a strong correlation between an outbreak and structural changes or developments in public health policy. Given this, pandemics may in fact contribute to lasting positive changes within healthcare systems.

One of the earliest recorded examples of strategies developed to prevent disease can be found in the methods employed throughout Europe against the further spread of the Black Plague. In both the second and third waves of the Black Death, individuals looked to migration as a way to escape the disease: the plague incentivized mass migration in fourteenth-century Europe, and once again in the seventeenth century, large numbers of Londoners evacuated to rural areas to avoid what they saw as a hotbed of disease.[57] This strategy has since been repeatedly applied throughout history. During the American polio outbreak of the early 1900s, parents frequently restricted their children’s outdoor activities, and during the cholera outbreak of the 1800s, India banned mass gatherings such as pilgrimages.[58] These intuitive individual strategies, coupled with government encouragement or enforcement, have proven an effective public health defense measure.

Beyond simply avoiding contact with other people, however, governments have at times had to expand these policies to include education on the purpose of maintaining social distance. This necessity was highlighted by the HIV/AIDS outbreak in America: after it became clear that keeping distance alone would not be enough, explicit sexual health education was implemented to address the spread of the disease, including instruction on the safe use of condoms and the need for frequent health screenings. By incorporating education regarding the most common transmission mechanism for HIV/AIDS, the United States government, along with many health organizations around the country, helped significantly stem the spread of AIDS.[59] One unanticipated additional benefit of this education was its ability to limit the spread of other sexually transmitted diseases as well. The sense of urgency that follows a pandemic crisis can therefore encourage behavioral changes that lead directly to effective strategies against mass outbreaks as well as routine diseases. When faced with a disease crisis, governments often implement structural educational reforms that include instruction in basic hygiene practices such as handwashing, and this education can greatly improve overall hygiene even after the crisis has ended.[60] While the emergence of such practices involving isolation, quarantine, and educational reform may seem simple, the effectiveness of these simple strategies should not be disregarded.

But these strategies cannot be implemented on a national or global scale without centralized, sophisticated public health organizations, whose existence can also be attributed in part to historical pandemics. As early as the second wave of the Black Death, governments began to establish public-health-oriented bodies known as health magistrates, which administered public health activities with nearly unlimited legal power, enforcing quarantines and other early forms of preventative measures.[61] Centuries later, in the 1800s, epidemics “provid[ed] a catalyst for municipal reform and the development of public health,” as described by Oxford professor David Arnold. In his studies of the 1800s cholera outbreak, Arnold notes that although India’s medical board had already been established, it was further strengthened during the outbreak through public health reform.[62] One of the most well-known global public health organizations, the World Health Organization, developed programs to monitor and study the influenza virus immediately after its founding in the 1940s—a virus that had killed more than thirty million people at the start of the century.[63] Organizations such as the WHO and the CDC have since extended their work beyond simply studying and containing specific diseases to supporting countries developing health infrastructure, advising nations on health policies, and leading the world in medical research.[64] Such extensive efforts from specialized public health organizations continue to benefit global society, in some ways mitigating the harm caused by the very pandemics that contributed to their establishment in the first place.

Sociocultural Influence of Diseases

         Perhaps the most surprising result of a disastrous disease outbreak, however, is the extent to which these diseases wield influence over seemingly unrelated aspects of our culture and society. Unlike economic impact or progress in medical technology and policy, which are perhaps easier to anticipate, the sociocultural impact brought about by an epidemic or pandemic is subtler and more difficult to predict. The nature of the disease itself, along with the toll it takes on society in death, injury, and economic damage, all shape its social impact. When these factors are considered, however, some patterns of societal or cultural influence seem to emerge.

         One such social impact that an epidemic can contribute to is widespread social awareness of the plight of certain groups of people. When smallpox struck Dublin, Ireland, in the late nineteenth century, the Irish government ceded control over the city’s sanitary policies to the Dublin City Council in 1898. Severely lacking the legitimacy to implement public health policies, the council was further plagued by distinct public disinterest in the healthcare system despite the severity of the outbreak. Desperate to encourage active participation in policy creation, the council turned to groups that had historically been politically disenfranchised as a new source of public opinion. In a perhaps unanticipated turn of events, the Dublin City Council lowered the minimum wealth requirements for voting and granted women—if they possessed private property—the right to vote; the number of citizens eligible to vote almost quintupled in some districts as a result.[65] Considering that universal women’s suffrage was granted at the national level in Ireland only in 1922, the influence that smallpox held over Dublin, and the earlier consideration of underprivileged groups it prompted, becomes more apparent.[66] In a sense, the outbreak of the disease seems to have forced the government to pay greater attention to the interests of all subjects under its power.

         Other groups of underrepresented minorities also saw improved conditions after the outbreak of epidemic disease; this phenomenon of increased awareness and elevation of oppressed minorities did not end in Dublin. Obscured by other disastrous pandemics such as smallpox and Spanish Influenza, polio at first went unnoticed and underestimated before slowly becoming the center of attention in the United States.[67] Thousands of people were infected with polio annually from 1910 until an effective vaccine was developed in the 1950s.[68] The symptoms of polio range from none at all to complete paralysis or even death.[69] Of the thousands infected annually, many suffered paralysis and other permanent disabilities, directly contributing to the Disability Rights Movement by drawing attention to disabled people trying to pursue normal, full lives in a society that was not designed for them. Some courageous polio patients who were completely or partially paralyzed became role models for other polio patients, and even for people with other disabilities.[70] Edward V. Roberts was among these disability activists, portrayed as a “Champion of the Disabled” by the New York Times following his death in 1995.
Earning a master’s degree in politics, Roberts tirelessly dedicated his life to improving disability rights; his work includes helping establish the World Institute on Disability and the Center for Independent Living, which advocated for safe access to public transport and nondiscrimination against disabled employees.[71] Roberts was not the only disability advocate, however; former president Franklin Roosevelt also suffered from the effects of polio through the final days of his presidency.[72] Although he was portrayed as a strong leader and a role model for patients with polio, his recovery alone did not capture the public’s attention or encourage greater support for disability rights.[73] Roosevelt’s prominent position in the public eye, however, may have greatly contributed to the growing number of political activists. The media attention and the movements enacted by the rising number of disabled people in the 1960s contributed majorly to the Disability Rights Movement.[74]

Although less than one percent of patients with polio experience permanent full or partial paralysis, it is an undeniable fact that people infected with polio are both psychologically and physically burdened.[75] When a disease strikes society, it is understandable that those directly affected focus their efforts on preparation, to ensure that such tragic diseases affect as few people as possible; however, those who were left with more difficult conditions as a result of polio, and who shared their stories of living with partial or complete paralysis, should also never be forgotten.[76] Because these people shared their experiences, garnered media attention, and fought for their right to live full lives, they helped elevate the lives of the disabled in general, not just those who had suffered at the hands of polio. Partly as a result of their efforts, Section 504 of the Rehabilitation Act was signed into law in 1973, stating that “no otherwise qualified handicapped individual in the United States shall solely on the basis of his handicap, be excluded from the participation, be denied the benefits of, or be subjected to discrimination under any program or activity receiving federal financial assistance.”[77] Although it is unfortunate that such societal advancements are sometimes spurred by tragic or painful events, the societal benefit granted by such laws should still be acknowledged.

         Increased social awareness of minority groups is therefore one sociocultural effect that may result from an epidemic or pandemic, but mass disease outbreaks can also change philosophical perceptions and the way that societies view the world around them, especially in regard to life and death. One prominent example of such a change appears as early as the Black Death. One of the most disastrous pandemics in history, the plague of the fourteenth century killed around a third of Europe’s population by the most reasonable estimates, likely claiming at least one person in every family.[78] As the mechanism of the disease’s spread was not well understood, author Philip Ziegler argues that, to a certain extent, the plague led many members of society to feel abandoned by God, forcing them to take on a different view of both death and the world around them. Prior to the Black Death, there was a widespread belief that God would spare innocent children from such a horrendous plague.[79] That belief was shattered as the bubonic plague discriminated against neither age nor social class, and for many it became increasingly difficult to view the plague as merely a punishment handed down by an outraged God to claim the lives of sinners. Constantly surrounded by death and events that seemed inexplicable to them, a growing number of people may have opened up to the possibility that there was no God after all. Still, few would have openly discredited or entirely renounced belief in God, as fourteenth-century Europe remained highly religious. In fact, though historians like Ziegler acknowledge that some at the time may have doubted God, or at least God’s benevolence toward humankind, the Black Death also sparked even more devout religious fervor among others, such as the Flagellants. These pious individuals travelled throughout Europe and publicly beat themselves in the hope of appeasing God and ending societal suffering.[80]

         More recently in history, the transformation of public perception as the result of a major epidemic can be seen again during the American HIV/AIDS crisis. HIV, short for Human Immunodeficiency Virus, destroys the immune system by attacking important defense cells, leaving an individual’s body unable to ward off other diseases; Acquired Immunodeficiency Syndrome (AIDS) is the most alarming stage of HIV infection, in which a majority of the body’s defenses have been compromised, exposing the individual to opportunistic illnesses. This debilitating infection and its resulting condition only came to the attention of the CDC at the start of the 1980s.[81] Though the illness can be transmitted by a person of any sexual orientation, in the 1980s it spread rapidly through communities of homosexual men, and HIV/AIDS remains a global issue, continuing to affect communities in America and Africa.

Given the community most heavily affected by the AIDS epidemic, there are two competing theories about its impact on societal perceptions of marginalized groups. The first, and more familiar, argues that the epidemic emboldened already existing discrimination against members of the gay community and drug users, as these communities saw disproportionately high rates of infection; to some, the disease seemed to justify discrimination, worsening prejudicial narratives. The competing idea is that, at least within the affected communities themselves, the crisis stimulated more rigorous volunteer organizing and demanded that the voices of these communities be heard more readily. For example, the epidemic gave birth to community-based organizations (CBOs) such as New York City’s Gay Men’s Health Crisis and the Chicken Soup Brigade in Seattle.[82] In many regions these CBOs cooperated closely to improve the lives of people affected by the epidemic.[83] The significance of these CBOs stems both from the crucial role they played in improving the lives of the gay community and from the perception that the US government was, at times, incapable of handling the crisis alone. Recalling how polio sparked a similar movement, the AIDS epidemic likely did foster voluntarism and more personalized care for affected people within the community; on the nationwide scale, however, the crisis changed public perception of gay people, though perhaps not for the better.

Yet another influence of the AIDS epidemic harkens back to one also wielded by the Black Death: it changed perceptions of death. Before the epidemic, a disturbing societal trend of avoiding any discussion of death persisted. Even medical doctors were cautious about revealing critical health issues to patients directly, with only ten percent of doctors preferring to disclose such information openly. By 1977 that number had risen to ninety-seven percent, and the AIDS epidemic reinforced this shift toward candor as the number of critically ill patients continued to increase and doctors grew accustomed to informing them.[84] Societal acceptance or avoidance of death at large is a much more difficult topic to approach, but at a minimum, a more honest healthcare industry is an undoubtedly positive change.

For all the devastation they cause, pandemics undoubtedly wield a powerful influence over the way humans interact with the society around them. What is perhaps less anticipated is just how closely pandemics are tied to societal structures beyond healthcare and the economy, and that this influence is not always destructive in nature. Given the unpredictable nature of diseases and the inevitability with which humanity will continue to face them, this effect ought not to be ignored, despite the tragedy that disease outbreaks so often bring.

Diseases Affecting Global Politics

In the broader context, however, one tragedy that may be seen as even more catastrophic than a global pandemic is a global conflict. In the early twentieth century, as the war between global powers fell into stalemate and casualties reached the millions, the Great War, or World War I, became a disaster for an entire generation of young soldiers and a cataclysmic event in human history.[85] Despite the United States’ initial reluctance to join the war, growing public support led the US to declare war on the Central Powers in 1917, truly elevating the conflict to a global scale. Many accounts argue that the stalemate ended with American involvement; historians such as Mark T. Calhoun and Raymond Callahan contend that America’s presence was invaluable.[86] The contention that the Spanish Influenza also contributed to the end of the Great War, however, is largely overlooked.

To almost the same devastating degree as the war itself, the Spanish Influenza killed millions of people between 1918 and 1919; with such a lasting impact, the epidemic certainly had negative consequences for society as a whole.[87] Professor Robert A. Clark, however, also poses the question, “Did the Spanish Flu outbreak help bring World War I to an otherwise premature end?”[88] Though at first glance this may sound extreme, the answer to Clark’s question appears to be yes. The way in which the Spanish Influenza tipped, or at least helped tip, a tight stalemate in World War I in the Allies’ favor is two-fold. First, contemporary evidence suggests that armies were often too physically burdened to continue fighting in the ever-escalating conflict. The 1918 Sun article “Influenza Epidemic Delays German Drive” describes how the influenza outbreak in the trenches severely hindered German forces from launching an aggressive offensive operation.[89] Erich Ludendorff, the German general leading a massive spring offensive at the time, substantiates this idea in attributing his failure to the outbreak of the disease.[90] Historian Kathleen M. Fargey even goes so far as to call the Spanish Influenza the “Deadliest Enemy”; she also notes the strict sanitary measures the United States Army imposed on its soldiers, such as gargling twice a day and spraying soldiers’ noses and throats with an unspecified liquid, likely a disinfectant.[91]

Further illustrating how the disease disrupted the war effort, another 1918 article in The Sun notes that a ship of British soldiers was turned back because many aboard were diagnosed with the Spanish Influenza.[92] It is understandable, then, that the participating countries felt additional pressure to bring the war to a close as casualties mounted on a different battlefield altogether. Second, the Influenza Pandemic eroded governments and war support from the inside out. Historical scholar J. E. Mueller points out that “the public withdraws its support for war when the number of soldiers killed in the war increases,”[93] which during World War I included those killed by the Spanish Influenza. Governments therefore logically sought measures to minimize casualties and found it increasingly difficult to justify the war effort as soldiers died both in battle and of disease. Some governments, such as that of France, went so far as to ban reports on Spanish Influenza casualties to limit the epidemic’s effect on morale at home.[94] Despite governments’ extensive efforts to limit, or sometimes even systematically hide, news coverage of the spreading virus, public support for the war quickly waned, in part due to mass fear of the virus.[95] Without the support of the people, governments have much more difficulty sustaining a war effort, because public support is crucially linked to a nation’s ability to recruit troops and maintain morale among those already drafted or enlisted.[96]

The Spanish Influenza did not take sides in the war. It cannot be argued that the epidemic brought one specific side of the conflict to defeat; the outbreak, however, likely accelerated the end of the war by forcing governments to redirect some of their efforts toward epidemic relief. In that sense, the war among nations was only further compounded by the war against the virus.


         Ultimately, there is no one definitive answer as to how pandemics and epidemics shape the world. In the broadest sense, however, epidemics and pandemics seem to bring to the surface a pre-existing problem or fragility in society’s infrastructure through the alarming nature of the crisis itself. Such a large-scale problem effectively seizes the attention of society and often forces both the public and the government to address structural issues, thereby catalyzing improvements. Throughout history, various epidemics and pandemics may have contributed to the end of a global conflict, helped produce significant advancements in medical technology and public health policy, and even elevated the status of disenfranchised groups of people. Although it may not be possible to predict that these impacts will hold true for every severe outbreak, it is at least clear that epidemic disease does far more than simply cause mass casualties.

As we consider recent events and take stock of the fact that epidemics are not only a part of human history but also part of our current global landscape, the mechanism by which an outbreak can highlight weaknesses in society and offer an opportunity for improvement ought not to be ignored. Viruses like influenza constantly mutate and evolve to infiltrate human society, and other, lesser-known diseases remain largely unpredictable.[97] More recent epidemics, such as SARS, MERS, and the still ongoing threat of COVID-19, and their overarching impact on society have yet to be fully analyzed, but that they did and will have great influence is without question: SARS killed hundreds of people and Ebola thousands, and COVID-19 is likely to kill more than both combined in the United States alone, so the impact of modern pandemics is undeniable.[98] With such catastrophic and truly heart-wrenching outcomes in mind, it must be explicitly noted that this is not to argue that epidemics or pandemics are on the whole a positive experience or something society should hope to see more often. Rather, when society is faced with such a disaster, the pandemic can be viewed as an opportunity to “build a different world, a better world, a world where [future generations] can live better,” as Snowden describes of potential responses to the COVID-19 outbreak.[99] His point, to a certain degree, seems valid. Our goal when faced with a potential outbreak ought to be disease eradication and the development of mechanisms that minimize the negative impact of the illness while magnifying the potential benefits that may arise as a result.

Regardless, perhaps we as a society should take note of these historical patterns and examine why it so often takes such an extreme crisis to highlight societal and structural weaknesses.

Bibliography / Reference

[1] Frank M. Snowden, “Preface,” in Epidemics and Society: From the Black Death to the Present (New Haven, CT: Yale University Press, 2019), ix.

[2] David M. Morens et al., “What Is a Pandemic?” The Journal of Infectious Diseases 200, no. 2 (Oct 2009): 1019.

[3] Sophie Ochmann and Max Roser, “Polio,” Our World in Data, Nov 9, 2017.

[4] Snowden, “Preface,” ix.

[5] Isaac Chotiner, “How Pandemics Change History,” New Yorker.

[6] David E. Bloom et al., “New and Resurgent Infectious Diseases Can Have Far-reaching Economic Repercussions,” Finance & Development 55, no. 2 (June 2018): 46.

[7] Frank M. Snowden, “Overview of the Three Plague Pandemics: 541 to ca. 1950,” in Epidemics and Society: From the Black Death to the Present (New Haven, CT: Yale University Press, 2019), 28-38.

[8] William L. Langer, “The Black Death,” Scientific American 210, no. 2 (February 1964): 114.

[9] Langer, “The Black Death,” 115; Snowden, “Overview of the Three Plague Pandemics,” 37.

[10] Samuel Cohn, “After the Black Death: Labour Legislation and Attitudes Towards Labour in Late-Medieval Western Europe,” The Economic History Review 60, no. 3 (Aug 2007): 462.

[11] Samuel Cohn, “After the Black Death: Labour,” 459-480.

[12] A. R. Bridbury, “Before the Black Death,” The Economic History Review New Series 30, no. 3 (Aug 1977): 403; Cohn, “After the Black Death: Labour,” 481.

[13] Langer, “The Black Death,” 118.

[14] Cohn, “After the Black Death: Labour,” 479.

[15] Langer, “The Black Death,” 118.

[16] Cohn, “After the Black Death: Labour,” 459.

[17] Philip Ziegler, The Black Death (London, United Kingdom: Faber and Faber LTD Bloomsbury House, 2011).

[18] David S. Gillespie, “The Black Death and the Peasants’ Revolt: A Reassessment,” Humboldt Journal of Social Relations 2, no. 2 (Spring/Summer 1975): 6.

[19] Frank Fenner, “Smallpox and Its Eradication, 1969 to 1980,” in Nature, Nurture and Chance: The Lives of Frank and Charles Fenner (Canberra, Australia: The Australian National University E Press, 2006), 140; Frank M. Snowden, “The Historical Impact of Smallpox,” in Epidemics and Society: From the Black Death to the Present (New Haven, CT: Yale University Press, 2019), 97.

[20] Bob H. Reinhardt, The End of a Global Pox: America and the Eradication of Smallpox in the Cold War Era (Chapel Hill, NC: The University of North Carolina Press, 2015), 52; Bob H. Reinhardt, “The Disappointment of Smallpox Eradication and Economic Development,” in Global Transformations in the Life Sciences, 1945–1980, ed. Patrick Manning and Mat Savelli (Pittsburgh, PA: University of Pittsburgh Press, 2018), 47.

[21] Reinhardt, “The Disappointment of Smallpox Eradication and Economic Development,” 47; Frank Fenner, Nature, Nurture and Chance, 137.

[22] S. Bhattacharya, “The World Health Organization and Global Smallpox Eradication,” Journal of Epidemiology and Community Health (1979-) 62, no. 10 (October 2008): 909.

[23] Reinhardt, The End of a Global Pox, 60.

[24] Reinhardt, “The Disappointment of Smallpox Eradication and Economic Development,” 48.

[25] Reinhardt, The End of a Global Pox, 59-61.

[26] Bhattacharya, “The World Health Organization and Global Smallpox Eradication,” 909-910; Reinhardt, “The Disappointment of Smallpox Eradication and Economic Development,” 55.

[27] Reinhardt, The End of a Global Pox, 98-121.

[28] Lucy Page and Rohini Pande, “Ending Global Poverty: Why Money Isn’t Enough,” The Journal of Economic Perspectives 32, no. 4 (Fall 2018): 183.

[29] Reinhardt, “The Disappointment of Smallpox Eradication and Economic Development,” 57.

[30] Page and Pande, “Ending Global Poverty,” 180.

[31] Frank M. Snowden, “Cholera,” in Epidemics and Society: From the Black Death to the Present (New Haven, CT: Yale University Press, 2019), 234.

[32] Asa Briggs, “Cholera and Society in the Nineteenth Century,” Past & Present, no. 19 (Apr 1961): 76; Snowden, “Cholera,” 234.

[33] Laura Ball, “Cholera and the Pump on Broad Street: The Life and Legacy of John Snow,” The History Teacher 43, no.1 (Nov 2009): 105.

[34] Ball, “Cholera and the Pump,” 106.

[35] Stephen W. Lacey, “Cholera: Calamitous Past, Ominous Future,” Clinical Infectious Diseases 20, no. 5 (May 1995): 1410.

[36] Ball, “Cholera and the Pump,” 105.

[37] Candice A. Welhausen, “Power and Authority in Disease Maps: Visualizing Medical Cartography Through Yellow Fever Mapping,” Journal of Business and Technical Communication 29, no. 3 (July 2015): 257–83.

[38] Frank M. Snowden, “Plague as a Disease,” in Epidemics and Society: From the Black Death to the Present (New Haven, CT: Yale University Press, 2019), 44.

[39] Wayne Melville and Xavier Fazio, “The Life and Work of John Snow: Investigating Science as Inquiry through Snow’s Work Involving Cholera,” The Science Teacher 74, no. 7 (October 2007): 41.

[40] Dade W. Moeller, “Epidemiology,” in Environmental Health, Third Edition (Cambridge, MA: Harvard University Press, 2005), 50.

[41] Lacey, “Cholera: Calamitous Past, Ominous Future,” 1411.

[42] Frank M. Snowden, “The Historical Impact of Smallpox,” in Epidemics and Society: From the Black Death to the Present (New Haven, CT: Yale University Press, 2019), 104.

[43] Diana Barnes, “The Public Life of a Woman of Wit and Quality: Lady Mary Wortley Montagu and the Vogue for Smallpox Inoculation,” Feminist Studies 38, no. 2 (Summer 2012), 330.

[44] Snowden, “The Historical Impact of Smallpox,” 105-106.

[45] Jose G. Rigau-Perez, “The Introduction of Smallpox Vaccine in 1803 and the Adoption of Immunization as a Government Function in Puerto Rico,” The Hispanic American Historical Review 69, no. 3 (Aug 1989), 394.

[46] Snowden, “The Historical Impact of Smallpox,” 108.

[47] E. Ashworth Underwood, “Edward Jenner: The Man and His Work,” The British Medical Journal 1, no. 4611 (May 21 1949), 883.

[48] National Intelligencer and Washington Advertiser (Washington City), “From the Gentleman’s Magazine,” Chronicling America: Historic American Newspapers, Feb 20, 1804.

[49] Underwood, “Edward Jenner: The Man and His Work,” 884.

[50] Snowden, “The Historical Impact of Smallpox,” 109-110.

[51] Jürgen Maurer et al., “Support for Seasonal Influenza Vaccination Requirements among US Healthcare Personnel,” Infection Control and Hospital Epidemiology 33, no. 3 (March 2012), 1; “Disease Burden of Influenza,” Centers for Disease Control and Prevention.

[52] Wolfgang K. Joklik et al., “Why the Smallpox Virus Stocks Should Not Be Destroyed,” Science New Series 262, no. 5137 (Nov 19 1993), 1225.

[53] Reinhardt, The End of a Global Pox, 192.

[54] Joklik, “Smallpox Virus Stocks,” 1225.

[55] Reinhardt, The End of a Global Pox, 192.

[56] Joklik, “Smallpox Virus Stocks,” 1225.

[57] Frank M. Snowden, “Responses to Plague,” in Epidemics and Society: From the Black Death to the Present (New Haven, CT: Yale University Press, 2019), 59.

[58] Shell, “Remembering Roosevelt,” 103; David Arnold, “Cholera and Colonialism in British India,” Past & Present, no. 113 (Nov 1986), 143.

[59] Aran Ron and David E. Rogers, “AIDS in the United States: Patient Care and Politics,” Daedalus 118, no. 2 (Spring 1989), 42-48.

[60] Andrew Weeks et al., “Hand Washing,” BMJ: British Medical Journal 319, no. 7208 (Aug 21, 1999), 519.

[61] Snowden, “Responses to Plague,” 69-70.

[62] Arnold, “Cholera and Colonialism,” 118-137.

[63] Adam Kamradt-Scott, “The Politics of Medicine and the Global Governance of Pandemic Influenza,” International Journal of Health Services 43, no. 1 (2013), 107; Adolfo Garcia-Sastre and Richard J. Witley, “Lessons Learned from Reconstructing the 1918 Influenza Pandemic,” The Journal of Infectious Diseases 194 (Nov 2006): 127.

[64] “What WHO Does in Countries,” World Health Organization.

[65] Ciarán Wallace, ed., “Feverish Activity: Dublin City Council and the Smallpox Outbreak of 1902–3,” in Healthcare in Ireland and Britain 1850-1970: Voluntary, Regional and Comparative Perspectives (London, United Kingdom: University of London School of Advanced Study Institute of Historical Research, 2014), 200.

[66] Catriona Crowe, “How Irish Women Won the Right to Vote in 1918,” The Irish Times, Dec 10, 2018.

[67] Julie K. Silver, “Polio—A Look Back,” in Post-Polio Syndrome: A Guide for Polio Survivors and Their Families (New Haven, CT: Yale University Press, 2001), 1-2.

[68] Sophie Ochmann and Max Roser, “Polio.”

[69] Silver, “Polio—A Look Back,” 2.

[70] Amy L. Fairchild, “The Polio Narratives: Dialogues with FDR,” Bulletin of the History of Medicine 75 (Feb 2001), 500; J. Michael Elliott, “Edward V. Roberts, 56, Champion of the Disabled,” New York Times, March 16, 1995, Obituaries.

[71] Elliott, “Edward V. Roberts.”

[72] Silver, “Polio—A Look Back,” 8-9.

[73] Fairchild, “The Polio Narratives,” 500.

[74] Marc Shell, “Remembering Roosevelt,” in Polio and Its Aftermath: The Paralysis of Culture (Cambridge, MA: Harvard University Press, 2005), 199.

[75] J. Hamborsky, A. Kroger, and S. Wolfe, eds., “Poliomyelitis,” in Epidemiology and Prevention of Vaccine-Preventable Diseases, 13th ed. (Washington, D.C.: Public Health Foundation, 2015); Silver, “Polio—A Look Back,” 5.

[76] Fairchild, “The Polio Narratives,” 488.

[77] Kitty Cone, “Short History of the 504 Sit In,” Disability Rights Education & Defense Fund.

[78] Gillespie, “The Black Death and the Peasants’ Revolt,” 4.

[79] Ziegler, The Black Death.

[80] Snowden, “Overview of the Three Plague Pandemics,” 31-32; Snowden, “Responses to Plague,” 65.

[81] “About HIV/AIDS,” Centers for Disease Control and Prevention; Ron and Rogers, “AIDS in the United States,” 42; “HIV Basics,” Centers for Disease Control and Prevention.

[82] A. R. Jonsen and J. Stryker, eds., “Voluntary and Community Based Organisations,” in The Social Impact of AIDS in the United States (Washington: National Academies Press, 1993).

[83] Ron and Rogers, “AIDS in the United States,” 50.

[84] Jonsen and Stryker, eds., “Voluntary and Community Based Organisations.”

[85] Dominic D.P. Johnson, “World War I,” in Overconfidence and War: The Havoc and Glory of Positive Illusions (Cambridge, MA: Harvard University Press, 2004), 83.

[86] Mark T. Calhoun et al., “World War I,” in General Lesley J. McNair: Unsung Architect of the U.S. Army (Lawrence, KS: University Press of Kansas, 2015), 46-49.

[87] Jeffery K. Taubenberger, “The Origin and Virulence of the 1918 ‘Spanish’ Influenza Virus,” Proceedings of the American Philosophical Society 150, no. 1 (March 2006): 86.

[88] Robert A. Clark, “Spanish Influenza 1918–19 Overview,” in Business Continuity and the Pandemic Threat (Cambridgeshire, United Kingdom: IT Governance Publishing, 2016), 91.

[89] Sun (New York), “Influenza Epidemic Delays German Drive,” Chronicling America: Historic American Newspapers, June 27, 1918.

[90] Clark, “Spanish Influenza,” 92.

[91] Kathleen M. Fargey, “The Deadliest Enemy: The U.S. Army and Influenza, 1918-1919,” Army History, no. 111 (Spring 2019): 30.

[92]  Sun (New York), “Troop Ship Turned Back by Influenza,” Chronicling America: Historic American Newspapers, July 7, 1918.

[93] Dukhong Kim, “Affect and Public Support for Military Action,” SAGE Open (October-December 2014): 1.

[94] Fargey, “The Deadliest Enemy,” 30.

[95] Clark, “Spanish Influenza,” 92-93.

[96] Kim, “Affect and Public Support for Military Action,” 10.

[97] Adolfo Garcia-Sastre and Richard J. Witley, “Lessons Learned,” 129-130.

[98] Julia Ries, “Here’s How COVID-19 Compares to Past Outbreaks,” Healthline, March 12, 2020; Jim Edwards, “Today’s Country-by-Country Coronavirus Stats: The US Struggles to Get Past the Peak as Australia and New Zealand Approach Zero New Cases,” Business Insider, April 23, 2020.

[99] Roge Karma, “Coronavirus is Not Just a Tragedy. It’s an Opportunity to Build a Better World,” Vox, Apr 10, 2020.

Sang Yoon (Fred) Lee

member of NLCS Jeju