Viral Load

There have been anti-vaccine movements for as long as there have been vaccines, but our current information ecosystem leaves all of us vulnerable to misinformation. Brown’s Information Futures Lab aims to arm public health students and practitioners with next-generation communication tools to meet the growing information crisis and its public health impacts.

By the spring of 2021, vaccine uptake across the country had slowed much sooner than anticipated, and in communities of color especially, vaccination rates were falling rapidly behind the national average.

This troubling vaccine inequity was the result of many obstacles—lack of access to pharmacies and to time off from work to get shots—as well as a history of mistreatment and mistrust. But something else was at work, too: While authorities relied mostly on outdated, top-down communications approaches such as press conferences, media interviews and broad campaigns, vaccine misinformation was flourishing online. Social media channels provided fertile ground for manipulative, emotionally charged and hyper-targeted content to dominate the conversation and confuse millions. People of color were especially targeted with disinformation that invoked historic examples of racism in healthcare to deepen doubts about the safety and efficacy of vaccines.

To understand the barriers to vaccination faced by communities of color, Houston in Action, a network of community organizations and grassroots leaders in Houston, Texas, held a series of listening sessions. The goal: to hear directly from their community members—and to tailor responses and communications strategies accordingly.

In addition to identifying structural barriers such as job, food and housing insecurity, the Houston in Action team noticed the deep impact misinformation had had on the community: Participants cited concerns about side effects such as infertility, echoing stories going viral on social media. “I saw the chip on the news,” said a participant in one listening session. “It can only be seen through a microscope, and I saw it.”

False reports about microchips, infertility, magnetism and other fears circulated widely during the vaccine rollout in 2021—the latest crisis in a growing “infodemic,” enabled by new communication technologies, which has made it increasingly difficult to parse fact from fiction, often with deadly consequences.

To boost vaccine uptake, Houston in Action partnered with members of the Information Futures Lab (IFL), a new initiative at the Brown School of Public Health working to mitigate the health impacts of misinformation and other forms of information disorder. Claire Wardle, Ph.D., and Stefanie Friedhoff, faculty members at the school and co-directors of the IFL, say that the lab’s name reflects the forward-facing goal of the team: to address the information crisis that threatens to undo major achievements in public health.

“Technology has dramatically changed how information is shared and how people make sense of the world,” says Friedhoff, a former journalist who helped lead the Nieman Foundation for Journalism at Harvard University. “In 2003, when SARS-CoV-1 threatened a global pandemic, there was no iPhone, no WhatsApp, no Facebook and no TikTok. Twenty years later, the ways in which people engage with and make sense of the world—our information ecosystem—have changed.”

Misinformation is as old as human communication, but in recent years there has been increasing awareness of the growing threat it poses. In 2016, misinformation and disinformation played critical roles in the election of Donald Trump and the British “Brexit” campaign. In 2018, Dictionary.com declared “misinformation” its word of the year. Unlike misinformation, which is shared by people in good faith, disinformation is spread knowingly and deliberately. In practice, however, this distinction often collapses: What begins as Russian disinformation gets repackaged on Facebook and reappears as one of many justifications for your uncle’s unwillingness to get vaccinated.

The team at the IFL wants to prepare public health students and practitioners to meet the growing information crisis and its public health impacts. Three years ago, the decision to launch such an initiative at a school of public health might have raised eyebrows, but the COVID-19 pandemic has made the health consequences of misinformation tragically apparent. A study conducted with Microsoft AI Health and authored by Friedhoff, associate professor of the practice of health services, policy and practice, found that vaccines could have prevented around a third of U.S. deaths due to COVID since February 2021, when vaccines became widely available.

The astonishingly quick development of COVID vaccines was the result of an unprecedented effort by scientists, but there was no corresponding effort to develop new and necessary communication strategies to help people understand the safety and importance of getting vaccinated. As a result, by late spring 2021, vaccine supply exceeded vaccine demand in many parts of the United States.

Reframing the Challenge

For Wardle, professor of the practice of health services, policy and practice, the flood of false vaccine reports in 2021 was only the latest front in a longstanding battle. “I’ve been working in the misinformation space for over a decade,” says Wardle, who grew up in the United Kingdom before completing a Ph.D. in communication at the University of Pennsylvania. In 2015, Wardle co-founded First Draft, a nonprofit coalition working to stem the tide of misinformation on social media. “We’ve had a long time to describe the problem, to say ‘it’s a really big problem.’ We need to start working on ‘what does the future look like?’”

“To make progress, we need to reframe the challenge,” says Friedhoff. “Misinformation and its impact on social stability are a downstream effect of upstream design and policy choices. We need to figure out how to make our information spaces align with our democratic values. It will take time, but we can’t just stick our heads in the sand.”

Participants learning about app data collection and surveillance at an IFL-produced HestiaLabs Sandbox this fall. Photo: Kung Chen.

Communities confronting poverty, racism or language barriers can be particularly vulnerable to disinformation actors. In 2017, an outbreak of measles struck the Somali-American community in Minnesota after the community was targeted by anti-vaccine activists, including the author of a retracted study linking vaccines to autism. There have been anti-vaccine movements for as long as we have had vaccines, but changes to the information ecosystem have left all of us more vulnerable to misinformation. In 2014, a measles outbreak linked to Disneyland was attributed to growing “vaccine refusal” among visitors to the park from many different communities, whose views on vaccination had been shaped by a rapidly evolving, transnational information system.

The rapid adoption of new social media platforms in recent years demands new approaches to content moderation and other editorial questions, and platforms and regulators have struggled to keep up. “It’s like we’re all driving Ferraris, but cars were only invented 20 years ago,” says Friedhoff. “There are no rules, there are no traffic lights, no seat belts, no airbags. That’s how we’re managing our information ecosystem.” In this dizzying and dangerous environment, people often don’t know whom to trust. When individuals, regardless of their credentials, can share information instantly with their followers at the click of a mouse, public health authorities are often reduced to one slow-moving voice in the clamor. Without the accountability of centralized editorial structures, claims can be revised repeatedly and new evidence asserted instantaneously, so that fact-checking becomes a game of whack-a-mole at broadband speed. Fact-checking can even have the opposite effect, drawing additional attention to spurious claims.

“It’s a wicked problem,” says Wardle, using the term for policy challenges that resist solutions because of their social and practical complexity. “Everybody in 2016 was like, ‘Can’t we just tweak the Facebook algorithm? Can’t we just change Section 230?’ But there’s no magic bullet. It’s going to take 30 years.”

Infrastructures of Trust

It can seem like misinformation actors—the clickbait self-promoters, trolls, Facebook sharers—have the upper hand. But one way to catch up is to learn from them. “They play to people’s experiences and emotions, they make it engaging, they make it relevant, they make it about your real concerns and issues,” Friedhoff says. “So it’s not farfetched to say that we want this type of engagement with public health messaging.”

To be effective in this rapidly evolving information environment, next-generation public health communications need to marry form and function to find new ways to reach people where they are. This is a key goal of the IFL: to help practitioners design and implement interventions that bolster our resilience to misinformation.

The attitudes and beliefs that influence individual health behaviors are social phenomena, shaped by our relationships, identities, history and social context. Empowering local leaders to counter misinformation is not only more effective within their communities than Twitter fact-checks; it can also build more robust immunity to a wider range of misinformation and other health threats.

Since the vaccine rollout began, surveys show that willingness to get vaccinated among Black Americans has grown faster than among white Americans, with Black Americans showing up for a growing share of first shots and making up a growing proportion of Americans protected by vaccination.

One reason for the growth in vaccine uptake in Black communities is the work of community and place-based organizations, many of which have shifted their work toward mitigating the impacts of the pandemic since it first arrived, including by responding to misinformation. Partnerships like the Equity-First Vaccination Initiative (EVI), a collaboration between the Brown School of Public Health and the Rockefeller Foundation that Friedhoff helped lead, work to share evidence of vaccine safety and efficacy, and to support community organizations in sharing approaches to combating misinformation. The IFL team published a report of “promising practices” that reflects fundamental principles of public health communication, including harm reduction and engaging with people where they are.

Starting a Dialogue

In the old “deficit model” of information sharing, public health authorities saw themselves as the sole source of health information for their constituents. In this model, community members are viewed as empty vessels to be filled with the information necessary to bring about behavior change. Under the deficit model, misinformation could be addressed through fact-checking and the provision of additional, evidence-based information. But social media and other innovations in communication technology mean that misinformation is produced faster than it can be debunked. “The deficit model never really worked,” says Friedhoff. “It was just convenient for those delivering it. It doesn’t work in the classroom, it doesn’t work in public health communications. That’s just not how human beings really engage with information, and it’s not how behavior change works.”

Rather than tackle individual instances of misinformation, the IFL and its partners are adopting public health approaches to managing its effects: upstream innovations designed to slow the spread of misinformation, and downstream interventions that bolster individual and community resilience to it.

Leveraging what Friedhoff calls the “dialogue model,” the IFL will support public health partners in navigating decentralized, multi-directional information flows, teaching them to tailor messages to the relationship between source and audience. The dialogue model recognizes that while facts are scientific, information is socially produced, in dialogue among different interpretations, understandings and worldviews whose standing depends on context.

The social dimension of information is both a vulnerability and an opportunity. By explicitly linking information to public health, we can invoke a sense of social responsibility in our information behaviors, similar to how we think about pollution or responsible driving. Doing so can foster resilience by empowering the public at large to be more skilled consumers of information, to check their own facts, and to be more responsible stewards and sharers of information.

New Tactics

As a first step, the IFL team is working to understand how well existing misinformation countermeasures work. Early in the pandemic, Twitter began rolling out a feature that prompted users to read articles before they retweeted them—a “friction” intervention that works like a speed bump on a road. Twitter reports that users were 33% more likely to open articles, and has rolled out a similar nudge asking users to reconsider tweets with offensive or abusive language. These voluntary tweaks by platforms like Twitter are encouraging, but are limited by platforms’ reluctance to share key data or collaborate more directly with researchers and policymakers. “The platforms are doing their own research and testing and interventions. But what we get from them is what they tell us,” says Rory Smith, research and investigation manager at the IFL. “There are no independent audits going on, which would be incredibly helpful in terms of what has worked, what has not worked, why has it worked, why has it not worked?”
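To make the “speed bump” idea concrete, the logic of such a friction nudge can be sketched in a few lines of code. This is purely illustrative, not Twitter’s actual implementation, which has not been published; the ShareRequest type and handle_share hook are hypothetical stand-ins.

    # Illustrative sketch of a "friction" nudge: before a reshare goes
    # through, check whether the user opened the linked article and, if
    # not, interrupt with a voluntary prompt. All names are hypothetical.
    from dataclasses import dataclass

    @dataclass
    class ShareRequest:
        user_id: str
        article_url: str
        opened_link: bool  # did this user open the article first?

    def handle_share(request: ShareRequest) -> str:
        if not request.opened_link:
            # The speed bump: a prompt, not a block. The user may still share.
            return "prompt: Want to read the article before sharing it?"
        return "share: posted"

    # A user reshares a story they never opened, then one they did read.
    print(handle_share(ShareRequest("u42", "https://example.com/story", False)))
    print(handle_share(ShareRequest("u42", "https://example.com/story", True)))

The design choice worth noting is that the nudge adds friction rather than censoring: the share still goes through, just after a moment of reflection.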

The IFL is taking first steps to reach the communities most vulnerable to misinformation and conspiracy theories. One potentially promising approach would work like a vaccination, teaching people to recognize misinformation and respond effectively by exposing them to harmless examples. The University of Cambridge’s Social Decision-Making Lab has developed games in which players must use misinformation strategies—mimicking authorities and playing on fear and emotion—to build online influence. These games mirror the effect of mRNA vaccines, which work by prompting our bodies to produce the spike protein of SARS-CoV-2, the virus that causes COVID-19, alerting our immune system so that it can recognize and fight the real virus. Similarly, by encouraging individuals and communities to practice the tools of misinformation in a harmless setting, games and other interventions could build resilience to the real thing.
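In the same spirit, a single round of such an inoculation game can be sketched as a toy simulation. This is a hypothetical illustration inspired by the description above, not the Cambridge lab’s actual game logic; the tactics and follower scores are invented.

    # Toy round of an "inoculation" game: manipulative tactics win more
    # followers, and the game names the technique so players learn to
    # recognize it in the wild. Tactics and scores are invented.
    TACTICS = {
        "impersonate_authority": 30,  # mimicking experts builds influence
        "play_on_fear": 25,           # emotional content spreads fast
        "post_plain_facts": 5,        # accurate but less engaging
    }

    def play_round(choice: str, followers: int) -> int:
        gain = TACTICS.get(choice, 0)
        print(f"You chose '{choice}' and gained {gain} followers.")
        if gain > 10:
            # The inoculation step: expose the manipulation technique.
            print("Notice: that tactic is a common misinformation technique.")
        return followers + gain

    followers = 0
    for choice in ["post_plain_facts", "play_on_fear", "impersonate_authority"]:
        followers = play_round(choice, followers)
    print(f"Final follower count: {followers}")

The pedagogical trick is that the game rewards manipulation while labeling it, so players come away primed to spot the same techniques in their own feeds.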

The IFL’s first Sandbox event brought together leading thinkers to support the rollout of the European Commission’s Digital Services Act. Future events will bring together researchers, policymakers and practitioners to collaborate on other innovative approaches to the information crisis. In spring 2023, the IFL will host its first class of fellows for six months of intensive mentorship and collaboration directed toward developing practical policy interventions and fostering a global community of public health practitioners tackling the misinformation crisis at all levels. Mentored by information veterans like Mark Scott, Politico’s chief technology correspondent and a visiting fellow at the IFL, the fellows will have an opportunity to trial real-world interventions in collaboration with community partners.

The IFL team is also piloting “inoculation” interventions, including a simulation in which Brown students were asked how they would respond to viral images of passengers being carried off an airplane by people in hazmat suits. Examining different possible responses let students see how social media platforms are designed to encourage engagement, not accuracy or responsible reporting. Likewise, an IFL trial in Kenya demonstrated that a text-message-based course on emotional resilience helped reduce people’s inclination to share misinformation.

Faced with such an urgent problem, the IFL team plans to iterate on these and other interventions as quickly as possible. “We’ve been playing catchup for too long, chasing after different false claims and rumors,” says Wardle. “It’s time we start focusing on how to make our information spaces healthier, safer, and more effective so people can find what they need to make decisions that impact themselves and those they love.”