The Covid-19 pandemic has precipitated a historical autopsy to unearth teachable lessons. While some characterize it as an unprecedented catastrophe against which we have few defenses, others are excavating the records of plagues past, certain that we must have learned something from them. The web is awash with unenlightening parallels: entire societies took to mask-wearing during the 1918 influenza pandemic; local authorities concealed information about the bubonic plague in 1665 London. I have to admit to reaching for the delete key during the last decade as each successive epidemic produced appeals to study the “lessons learned” from Ebola, Zika, the Black Death, or even the Plague of Athens. But this pandemic demands an answer: Do governments actually “learn”? Have societies gleaned anything useful from their prior encounters with disease?

Trapped in the past

Of course, learning from history can mean different things. For some serious scholars of contagion, like Peter Baldwin,[1] it entails recognizing that nations develop inimitable ways of being and doing, so that their experience with disease comes to shape the strategies they choose and their sense of the possible when confronted with new outbreaks. If every infectious disease triggers a response molded by the institutions and practices formed in an earlier period, then what we can “learn” from the past is that officials and politicians will inevitably resort to familiar measures, be they quarantinist and restrictive or largely civil libertarian in character. History in this sense is something akin to a deeply ingrained culture, or template, whose dictates it may not be possible to escape, and “learning” would mean leaders accepting that they cannot remake past practices or press new methods on populations accustomed to older ones. Those populations, in turn, may discover that their governments cling to well-honed techniques of disease containment despite scientific evidence that they are ineffective.

The US affection for “travel bans” is an example. In 1987, AIDS was added to the list of “dangerous and contagious diseases” with the passage of a Supplemental Appropriations Act, effectively creating a nationwide ban on the admission into the United States of individuals with HIV/AIDS. Despite arguments from medical experts that it was unenforceable and dangerous, the ban remained in place for 22 years. In 2014, congressional hearings saw the same demands for a travel ban against individuals from three Ebola-afflicted West African countries. In the end the United States settled for restrictions, requiring travelers from the affected countries to enter through US airports with screening procedures in place. The World Health Organization (WHO) cautioned against such travel restrictions for Covid-19, on the grounds that they were ineffective and counterproductive in the long run, but President Trump initially barred travel to the United States from China for non-US citizens, claiming that he was the first to impose a “China ban.” This was factually inaccurate: the US ban was not the first, it exempted several categories of travelers, and many other countries joined the United States in imposing restrictions on travel from China and, as time wore on, on one another. However, longstanding practice suggests that the United States is unlikely to abandon this strategy or change its approach to containing infectious disease at its borders.

In similar terms, a nation-state’s response may be constrained by its political institutions. John Barry, author of The Great Influenza, about the 1918 Spanish flu pandemic, was reportedly uneasy about what George W. Bush gathered from his account: in the event of an avian flu pandemic (which seemed imminent in 2005), the United States should acquire the legal power to quarantine areas where outbreaks occurred and use the military to execute those powers. But what is interesting about the efforts to impose quarantines and other restrictive measures in 1918 is that they were rarely or only intermittently effective. In the decentralized US federal system, the curtailing of activities and public gatherings and the closing of schools tended to vary state by state and sometimes even town by neighboring town. The lesson in this instance is that the powers granted to US states make it extremely difficult for the nation to pull together when it comes to containing disease. The enduring significance of federalism has been apparent again recently as states found themselves trying to outbid one another for the purchase of protective equipment and ventilators.

Sweden exemplifies the effects of deep-rooted traditions on policies. Its efforts to manage Covid-19 have followed its “template” for tackling social problems. Instead of imposing the stringent lockdowns of some other European nations, it has depended on the high levels of trust in civil society and the public authorities, characteristic of Swedish political culture, to persuade its citizens to adopt protective measures without closing down the economy. It has permitted daycare centers to stay open, freeing more people to go to work, and it did not close restaurants, bars, or parks. To some extent, this has been possible because Swedes are used to deferring to expert authorities in a nation where many government agencies have high degrees of autonomy. Decisions and briefings about Covid-19 have been made by public health bureaucrats and epidemiologists rather than by politicians. And alternative approaches are discouraged by a set of constitutional constraints that rarely permit the imposition of emergency measures. Initially, Sweden kept its death toll below that of some other European nations, but that toll has since risen, and the country’s strategies are coming under fire from some of its own experts. The lesson here is not that the Swedish approach toward Covid-19 was problematic, but that it should not surprise us, and that it may be difficult for the country to change course.

As this case suggests, longstanding legal arrangements also affect how government edicts are received in some regimes. We see this reflected in how concerns about privacy evoked by various strategies to contain Covid-19 have played out in different political and legal settings. The European Union’s General Data Protection Regulation (GDPR), in force since 2018, for instance, places stringent requirements on member states with regard to the use and processing of personal data. If the United States had the kind of protections inscribed in this regulation, its residents might not feel so anxious about the prospect that mobile tracking devices may be used to trace the contacts of the infected.

At its limit, this version of history’s lessons suggests that, for better or worse, nations are prisoners of their pasts, Clio’s grip is unrelenting, and we cannot flip a magic switch to alter the underlying structural and cultural realities that shape a nation’s response to an epidemic.

Policy learning

But what are we to make of the other way in which history might be seen as relevant to the current epidemic, namely as a source of lessons on which decision makers might draw to shape their response out of a self-conscious accounting of past successes and failures? Large literatures on policy learning and diffusion explore the conditions under which officials may acknowledge mistakes and change course, model their strategies after those of other nations, or alter their practices in the face of new knowledge. Do we see any evidence of governments themselves learning from history in this way?

Here there is both good and bad news. Governments do learn from experience but, because history does not yield unambiguous lessons, different conclusions can be drawn from the same sets of events, and lessons can sometimes be learned too well. Take, for example, the growth of public understanding about zoonotic diseases. Many if not most human infectious diseases are thought to have originated as zoonoses, transferred to humans from animal herds and domesticated species. This attracted little public attention until books like Laurie Garrett’s The Coming Plague informed an unsuspecting public in the 1990s that a raft of such diseases lurked everywhere, threatening all nations. For the most part they were “vector-borne,” requiring an intermediary, an animal or an insect, to carry a pathogen to a human.

As one epidemic followed another in the 1980s and 1990s, the world’s attention became focused on the suspected vectors. The science of disease transmission was often uncertain, and without vaccines or other surefire remedies the initial response was to slaughter the carriers. In the United Kingdom, cattle were infected with Bovine Spongiform Encephalopathy (BSE), or “mad cow disease,” in the late 1980s. The epidemic reached a peak in 1992 with 100,000 confirmed cases. At first, this was a disaster only for the cattle industry, but a human form of the disease, variant Creutzfeldt-Jakob disease (vCJD), was identified in 1995 and became associated with eating BSE-contaminated meat. In an attempt to wipe out BSE, 4.4 million cattle were slaughtered. An outbreak of H5N1 (avian influenza) in Hong Kong in 1997 killed only six people but, since it was the first major outbreak among humans, the government’s response was to cull all chickens, ducks, and geese across the territory—1.4 million birds in total. In April 1999, cases of febrile encephalitis, which caused at least 100 deaths, were attributed to the previously unrecognized Nipah virus in Malaysia. The apparent source of infection was exposure to pigs, and the subsequent slaughter of approximately 890,000 swine caused the near collapse of the pig-farming industry.[2] In 2004–2005, H5N1 returned and spread to eight countries in eastern Asia, where it was transmitted to humans (mainly children) in Vietnam, Cambodia, and Thailand. By the end of 2005 it had killed more than 65 of the roughly 130 people diagnosed. There was little person-to-person transfer of the disease, but the fear that viral mutation might bring this about caused global panic, and hundreds of millions of chickens and ducks were culled.

Various consequences have followed the recognition that animal and human health are interconnected. On the positive side, there is increased awareness that environmental mismanagement has had problematic effects, including encroachment of wildlife into human habitats. This has been one of the precipitating factors for the formation of the “One Health” movement, which argues for closer collaboration between the medical profession, veterinarians, and wildlife specialists to identify sources of new illnesses in animals and humans. Scientists believe that a bat is the most likely source of the novel coronavirus responsible for Covid-19, but that an intermediary animal transmitted it to humans. People infected early in the current outbreak worked in a live animal market in the Chinese city of Wuhan, but tests of coronavirus samples there have not identified a source. On the basis of genetic analysis, Chinese scientists have suggested that a pangolin was the prime suspect. This ant-eating animal is prized for its meat and what are believed to be the medicinal properties of its scales, and it is therefore protected, though illegally traded. But experts now think that, while pangolins are possible candidates, they “are not proven to be the key intermediary,” the identity of which may never be known.

However, speculation about the pangolin’s role in the epidemic has raised fears that it may be the next target for elimination. The SARS epidemic illustrates the dangers of such a rush to judgment. In 2004, after it was suggested that this earlier coronavirus was similar to one carried by civet cats, Chinese authorities ordered the mass slaughter of the animal, considered a local delicacy. A 2008 study subsequently identified bats as the source of SARS: “We see how the virus used different hosts, moving from bat to human to civet, in that order. So the civets actually got SARS from humans.” In the meantime, some 10,000 civet cats were drowned, electrocuted, and otherwise destroyed. In April 2020, minks on Dutch fur farms were infected with the novel coronavirus, SARS-CoV-2, apparently by their handlers. The minks then passed it back to two people—the world’s first documented cases of animal-to-human transmission in this pandemic. The Netherlands subsequently killed more than 500,000 minks on infected fur farms. Now, domestic and wild cats are on the radar as a potential reservoir for the virus. Eventually, the world will have to reckon with the impossibility of eliminating every animal species that can harbor pathogens harmful to humans.

The culling of animals to prevent disease has raised but not resolved the thorny question of compensation for families and farmers dependent on those animals for food or income. Societies have been slow to grasp that, without clear and adequate compensation guidelines, producers are often reluctant to reveal infection among their animals. The destruction of poultry to prevent the spread of avian flu had a devastating effect on small-scale commercial and backyard producers in Asia. Some governments did provide compensation for culled poultry, but it was often far below market value. Other countries, such as Cambodia, provided nothing to farmers, and people hid their ducks and chickens. The lack of an adequate compensation plan in the United Kingdom initially hampered efforts to determine the full scale of BSE. Farmers hesitated to admit they had BSE-infected cattle when they knew they would receive only 50 percent of their worth. Only in 1990, when BSE was made a notifiable disease and compensation was raised to the full market value, did more farmers come forward.

In short, some useful lessons have been drawn from the popular discovery of the zoonotic origins of disease, but not all of those “lessons” have been correct, and many have proved costly.

Lessons for the future?

Historians have remarked that the 1918 influenza pandemic was soon forgotten, a collective amnesia based perhaps on the fact that the catastrophic scale of suffering made it too painful to remember. There is no danger of that for this pandemic: when it is “over,” whatever that means, there will be no one who has not tweeted, broadcast, written, sung, blogged, or otherwise opined on Covid-19. But what will our recovery look like? What does history tell us about whether our societies will be marked and scarred?

Again, the lessons drawn from history have been mixed. Some commentators have looked back as far as the Black Death and drawn the bizarrely cheerful conclusion that plagues are good for workers, because wages rose in its aftermath. It is certainly true that manpower shortages after the plague of 1346 increased the bargaining power of those who survived, but this overlooks the fact that a better living for a few was made possible by the death of around a third of Europe’s population. William McNeill, the master chronicler of plagues and peoples, notes that the positions in demand required relatively high skills. And the effects were not uniform across Europe; in some developed regions the economic effect of the plague was to produce “harsh collisions between social classes” as the socioeconomic pyramid was altered. Overall, his assessment is that the Black Death did not produce economic well-being and a sense of optimism. Instead, “darker climates of opinion and feeling became as chronic and inescapable as the plague itself.”[3] Economists, too, have drawn pessimistic lessons from history, based on the length and depth of economic stagnation in the wake of older pandemics and the increase in inequalities that have followed more recent ones in the twenty-first century.

It turns out that pandemics can inspire forgetting as well as remembering. The painful irony in the United States is that this epidemic is occurring during a period when collective sentiments of unity and purpose have been severely eroded. The effects of Covid-19 have been so overwhelming that we have all but forgotten that only a few months ago we were absorbed by a completely different “epidemic” linked to opioid addiction. Drugs were only one source of what was being described as a “colossal health crisis” based on “deaths of despair,” from alcohol and suicide as well as overdoses. Case and Deaton[4] document the many reasons for the despair: the disappearance of middle-skill jobs overseas, the growing number of people employed in the gig economy, and badly paid service jobs have pushed many into poverty. This situation has been particularly bad for white men without bachelor’s degrees, in part because, as Helen Epstein puts it, they drew their self-worth and sense of community from their jobs. When those jobs disappear, “their sense of personal worthlessness can be profound.” Family life, too, has been eroding: marriage rates among young people aged 25 to 34 have been dropping among those with less than a high school education. This is not to say that marriage in itself is a bulwark against death and despair, but Émile Durkheim’s observation at the end of the nineteenth century that it protects men against suicide better than it protects women still seems to hold true.[5]

As Richard Evans[6] documented, using the case of cholera’s ruthless sweep through Europe in the nineteenth century, epidemics tend to exacerbate the fault lines in societies. Ironically, it took this pandemic to reveal what social epidemiologists have long understood: disease follows a significant racial fault line in the United States. People of color have always faced greater threats to health. It remains to be seen whether the fact that African Americans are disproportionately infected by, and dying from, Covid-19 will lead to policies that address the underlying causes of racial health disparities. These social divisions are not going to go away, because deaths from Covid-19 are being superimposed on deaths of despair and racial inequalities. Thus, in many ways, Covid-19 could not have come at a worse time for the United States. It arrived at a moment when the nation’s reserves of civility and solidarity were already under pressure, making it more difficult for people to look beyond their immediate needs to the common good.

What this means is that recovering from this pandemic is going to take more than a vaccine or the other remedies that medical science can supply. There are some small practical steps we could take: it is worth remembering that most nations did not publicize deaths from the 1918 influenza for fear of destroying the morale of a nation at war (Spain, being neutral, was not so constrained; hence the name “Spanish flu”). In the service of bolstering the collective spirit, the media might begin to publish the numbers of those who are known to have recovered, instead of or alongside the relentless daily scorecard of deaths, both local and global. We could also work to ensure that funding streams are not diverted from enduring infrastructural issues, both social and material, at the root of existing health disparities—an all too common pattern in recent epidemics.

But it will require concerted efforts to repair the social fault lines that this virus intensifies. Shoring up the economy is obviously vital, but so is attending to the social fabric. As in many other respects, history does not tell us how to do that, but it warns us about the challenges to come. It shows us that epidemics are not merely matters of health, but social experiences with deep implications for the character of social relations, whether by virtue of the imagery they promote or the ways in which they distribute the costs of disease and reconstruction. Thus, even when the lessons it yields are unclear or contested, history is indispensable. In this case, it tells us not to hearken back to some unrecoverable past but to attend to the conditions that have led us to this point and left us unprepared in many ways to withstand and contain this pandemic. It is only by conserving and building on the social resources we possess, on which the capabilities of many people depend, that a route to recovery can be fashioned.

References:

[1] Peter Baldwin, Contagion and the State in Europe, 1830–1930 (Cambridge: Cambridge University Press, 1999).
[2] Looi Lai-Meng and Chua Kaw-Bing, “Lessons from the Nipah Virus Outbreak in Malaysia,” Malaysian Journal of Pathology 29, no. 2 (2007): 63–67.
[3] William H. McNeill, Plagues and Peoples (New York: Anchor, 1977), 180.
[4] Anne Case and Angus Deaton, Deaths of Despair and the Future of Capitalism (Princeton, NJ: Princeton University Press, 2020).
[5] Nick Danigelis and Whitney Pope, “Durkheim’s Theory of Suicide as Applied to the Family: An Empirical Test,” Social Forces 57, no. 4 (1979): 1081–1106.
[6] Richard J. Evans, “Epidemics and Revolutions: Cholera in Nineteenth-Century Europe,” Past & Present, no. 120 (1988): 123–146.