Coronavirus and the Danger of Disbelief

Why was it so hard even for doctors to see what was coming?

Randi Hutter Epstein
Image of the COVID-19 virus. Graphic by Bianca Ibarlucea.

In late February, I was at a dinner party on the Upper East Side of Manhattan. The media had reported the first death from coronavirus in the United States only two days earlier. Naturally, the conversation turned to COVID.

One of the guests, a librettist, mentioned that he heard that 40 to 70 percent of the world would become infected. An economist rattled off doomsday statistics and predicted imminent lockdowns. Then the host, an author, turned to me—I have an MD, though I’m not a practicing physician, let alone an infectious disease expert—and said, “Let’s hear what the doctor has to say.”

I can’t remember my exact words, but I know I offered a sanguine assessment of the situation. An uptick of cases, perhaps. But not quarantines, school closures, or economic collapse.

Weeks later, in my restless quarantine half-sleep, I’d rehash my comments, cringing at the memory. How could I have gotten it so wrong? I had never been on the frontlines of disease fighting, but I knew far too much about the history and epidemiology of pandemics to give such a misguided prognosis. It was almost as if the woman speaking with such confidence that evening was a different person from the one who had learned about infectious diseases in medical school. It was as if I lived in two realities: an intellectual self that could digest and dispense data, and an emotional self that couldn’t grasp—couldn’t take in—the import of that information.


In the 1980s, when I was a medical student at Yale, I spent many afternoons talking with Drs. Robert Shope and Wilbur Downs, professors of epidemiology and public health, and renowned virus hunters. We’d pack brown bag lunches and sit around a conference table down the hall from the microbiology labs, home to a vast collection of viruses. The doctors were filled with stories of adventure and lessons in public health, lessons I was eager to learn. They taught me that more soldiers were sidelined by germs during World War II than by battle injuries. They told me about tracking down Lassa virus in Nigeria, then studying the deadly germ back in a New Haven lab where a few escaped droplets nearly killed a colleague. They taught me that, ironically, a victory in a public health campaign can be the reason for its demise. In the 1950s, in Sri Lanka, for instance, an anti-malaria project reduced the number of afflicted people from more than a million to fewer than twenty. Funding agencies incorrectly deemed malaria vanquished for good, so aid was cut. Within a decade, the germ resurged. The same thing happened in India and Bangladesh.

I came away from these discussions with an understanding of the urgent need for local teams around the world to surveil continuously for lurking infections. Upon discovering an outbreak, these teams would alert a global network of experts who could help prevent the germ’s spread.

In 1992, not long after I graduated from medical school, Dr. Shope published an alarming National Academy of Sciences report, “Emerging Infections: Microbial Threats to Health in the United States.” The report, which once again warned about the increasing possibility of global pandemics if precautions were not taken, got a lot of attention in the media. In one news conference, Dr. Shope warned that the world was still vulnerable to “something along the line of the 1918–1919 influenza pandemic that killed 20 million people worldwide.”

Around the time of the Ebola outbreaks of the mid-1990s, pandemics became fodder for literary blockbusters with Laurie Garrett’s bestselling The Coming Plague (1994) and Richard Preston’s The Hot Zone (1994), which helped inspire the Hollywood film Outbreak, starring Dustin Hoffman and Morgan Freeman. And just as Dr. Shope predicted, new viruses emerged: one after another, the world saw outbreaks of H5N1, West Nile virus, SARS, H1N1, MERS, Ebola (again), and Zika.

But none of this translated into a sustained policy response in the United States.

Clinton, who held the presidency during the H5N1 and West Nile epidemics, created the National Pharmaceutical Stockpile (now the Strategic National Stockpile), but it languished under the administrations that followed. George W. Bush fired his biodefense czar during his first year in office (only to hire him back mere months later to lead bioterrorism preparedness programs after the anthrax attacks following 9/11). Obama infused funds for a vaccine after H1N1 struck, but vaccines weren’t widely available until after the peak of the outbreak. In response to criticism, Obama’s administration created a 69-page “Playbook for Early Response to High-Consequence Emerging Infectious Disease Threats and Biological Incidents.” The Trump administration threw it out. Overall, the past twenty years saw a gutting of each initiative put in place to protect us.

Yet, even given all this, I’m embarrassed to admit, I assumed that if a dangerous germ arrived on our shores, we’d have the infrastructure and organization to mobilize accurate testing, administer safe and effective treatments, and produce and distribute vaccines. Though I knew about the threat of a pandemic—and about our fractured and reactive healthcare system—I had also read of triumphs. With vaccines, we conquered smallpox and nearly eradicated polio. We developed drug cocktails that turned HIV infection from a death sentence to a chronic, undetectable disease. In the early days of coronavirus, these were the memories that surfaced in my brain. They were more comforting than the story my intellect told.


The philosopher William James wrote that our concepts of the world are based not only on knowledge but also on our own lived experiences. He believed that no one grasps the full and true complexity of reality; each of us creates our own version without fully comprehending everyone else’s. “We mutilate the fullness” of reality, he wrote, by confusing it with our own piecemeal experiences.

Perception is based on experience; we accrue memories by soaking in the sensory details of the world around us. These details seep into our brains and get categorized according to facts (stored in the hippocampi, seahorse-shaped organs on each side of the brain) and emotions (stored in the amygdalae, at the tips of the hippocampi). Data (the hippocampus stuff) without emotions (the amygdala part) is less sticky. If there’s no memory, there’s nothing to conjure up. Many of those in Wuhan, during the initial days of the outbreak, could recall the traumas of SARS, which hit southern China in 2002. That’s what propelled Li Wenliang, a Wuhan ophthalmologist, to post warnings on social media of a potentially new and dangerous virus. (Dr. Li was then reprimanded by police for speaking out and died weeks later of COVID-19.) That collective memory may also have helped East Asian countries respond more quickly and successfully to the pandemic than their Western neighbors. Recalling the emotional burden of the past shaped the way they responded to the present.

Similarly, our lack of experience with pandemics shaped how America responded—or failed to respond. Certainly, it shaped how I responded. The disbelief that results from a lack of experience is different, I think, from doubt, which James called the “true opposite” of belief. Doubt arises when the mind is in unrest, struggling to perceive a complicated world; in that way it’s related to denial, which the twentieth-century psychoanalyst Anna Freud characterized as a way of blocking out unpleasant realities. I wasn’t a pandemic denialist, nor did I doubt the facts. But without an image-bank to draw on, I was—like many others—merely living in a state of suspended belief.

Of course, the hedging language of medicine may also have added to our collective incredulity. Although most of us crave certainty, scientists use qualifying, conditional words such as “may” or “most likely.” They never promise. The ifs and buts of the medical discourse around an emerging virus leave a lot of wiggle room for an optimist like me. Maybe the bad stuff won’t happen after all; it’s “probable”—not “guaranteed.”

These days, when I Zoom with doctor friends, I hear a lot of “we told you so’s.” I nod, but I feel like a fraud. On a recent episode of NPR’s “Fresh Air,” John Barry, the author of The Great Influenza—a gripping narrative of the 1918 pandemic chronicling the many ways that both doubt and disbelief left an earlier nation unprepared—told Terry Gross that “intellectually understanding it is one thing and having it hit you is something quite different.” When asked if he expected it, he replied, “Yes, and no.” And I felt solace.


A doctor once told me that everything is clearer with a “retrospectoscope”: that it’s much easier to see where things went astray in the past than to speak with any kind of certainty about the future. When it comes to public health issues, we may never be able to cultivate that full sense of reality, an ideal that William James considered unattainable anyhow. But we do need to make decisions based on data, shelving our personal sentiments of disbelief.

We need systems to assess the information and to prepare an infrastructure to combat diseases. We need to fine-tune our scientific communication so people believe the experts and heed their advice. Whether we can emotionally prepare may not matter. Whether we are wearing masks because we are afraid of spreading disease to loved ones or out of an abstract sense of civic duty may not matter. What matters is that we learn the lessons of a larger history—that we believe in them and prepare accordingly, even if we haven’t lived through them.

I might be overthinking my own incapacity to grasp what was about to hit America. Last week, I emailed my friend who hosted the dinner party. Perhaps I wanted to make light of my errant predictions; perhaps I wanted to apologize. He remembered the librettist reciting statistics, but not the dire words of the economist. And, to my relief, he had no recollection of my comments—luckily, they hadn’t made an impression.

I’ve been walking around my neighborhood taking photos: masked shoppers on masking tape stripes stuck six feet apart outside store entrances; masked and gloved Duane Reade employees behind blockades of plexiglass and empty crates. I’ve shot the “LIMIT: 2 EACH” placards leaning on toilet paper stacks and also the shuttered stores with apology signs.

I want a visual record. I worry that what was once shock will become the new normal and vanish from my amygdala. Photographic proof may jiggle a few neurotransmitters. The act of taking pictures, I hope, will sear the feeling of the empty streets into my brain cells. I want to commit this time to memory so that it might inform my perceptions of the future.

I want to remember that I was warned, but I was stunned all the same.

Randi Hutter Epstein is Writer in Residence at the Program for Humanities in Medicine at Yale School of Medicine, a lecturer in the English Department at Yale College, and an Adjunct Professor of Journalism at Columbia University Graduate School of Journalism. She is the author of Get Me Out and Aroused, both published by W. W. Norton. She lives in New York City with her husband and is quarantining with three of her four grown children.
Originally published: June 16, 2020
