For those who want to avert those consequences, it makes sense to try to correct misinformation. But as we now know, misinformation—both intentional and unintentional—is difficult to fight once it’s out in the digital wild. The pace at which unverified (and often false) information travels makes any attempt to catch up to, retrieve, and correct it an ambitious endeavour. We also know that viral information tends to stick, that repeated misinformation is more likely to be judged as true, and that people often continue to believe falsehoods even after they have been debunked.
Instead of fighting misinformation after it’s already spread, some researchers have shifted their strategy: they’re trying to prevent it from going viral in the first place, an approach known as “prebunking.” Prebunking aims to build people’s resistance to persuasion by misinformation before they ever encounter it. Grounded in inoculation theory, the approach uses the analogy of biological immunization. Just as weakened exposure to a pathogen triggers antibody production, inoculation theory posits that pre-emptively exposing people to a weakened persuasive argument builds their resistance against future manipulation.
But while inoculation is a promising approach, it has its limitations. Traditional inoculation messages are issue-specific, and often remain confined to the particular claim they target. For example, an inoculation message might forewarn people that false information is circulating which encourages drinking bleach as a cure for the coronavirus. Although that may stop people from drinking bleach, the message does nothing to pre-empt misinformation about other fake cures. As a result, prebunking approaches haven’t easily adapted to the changing misinformation landscape, making them difficult to scale.
However, our research suggests that there may be another way to inoculate people that preserves the benefits of prebunking: it may be possible to build resistance against misinformation in general, rather than fighting it one piece at a time.
In other words, instead of having to inoculate against every example of misinformation, our idea is to design interventions to build resistance against misinformation in general by focusing on the most common manipulation techniques used in misinformation. Prior research has shown that unveiling manipulation attempts and making people aware that they are vulnerable to attack is a good way to generate resistance to this kind of malicious persuasion.
For this purpose, we, at the Cambridge Social Decision-Making Lab, developed a series of free online games, in which players learn about various manipulation techniques. Research shows that games can be highly beneficial for achieving educational outcomes, and significantly enhance learning through engagement. Following inoculation theory, exposing people to a “weakened dose” of misinformation in a safe (gaming) environment induces a sense of threat and an awareness of how vulnerable one’s attitudes might be. The subsequent opportunity to learn by doing lets players engage in a simulated exercise that helps cultivate psychological resistance against manipulation techniques. So instead of telling people what to believe, we created these games to equip players with the skills necessary to identify, argue against, and prevent harmful misinformation from going viral.
So far, we have launched three separate games, in collaboration with DROG and Gusmanson Design, each with a different theme: Bad News, Harmony Square, and Go Viral!. Since there’s no better way to inoculate yourself than to walk a mile in the shoes of someone trying to dupe you, players take the perspective of the bad guy. In our first game, Bad News, players are exposed to weakened doses of six common misinformation techniques—for example, how to use emotional buzzwords like “horrific” or “terrifying” to increase the viral potential of their content.
Bad News has been translated into 19 languages and has been played over a million times around the globe. Harmony Square was produced with the Department of Homeland Security and is specifically about election misinformation. Your job as a player is to reduce the peaceful community of Harmony Square to metaphorical rubble by mounting a disinformation campaign aimed at dividing its residents and fuelling intergroup polarisation.
Go Viral! is a 5-minute game designed in collaboration with the U.K. Cabinet Office (with support from the U.N. and W.H.O.) and specifically targets COVID-19 misinformation. The game covers three techniques commonly used to spread misinformation about the virus: fearmongering, using fake experts, and coming up with conspiracy theories. By mastering these tactics and carefully curating their own (fictitious) group of online “truth tellers,” players watch their conspiracy theories spiral into fictitious nation-wide protests. Depending on the choices players make throughout the game, they receive a final score that shows how well they performed in comparison to other players. Because it encourages players to share the game and challenge others, Go Viral! is an easily scalable intervention that can immunize millions of people around the world against the techniques used to spread misinformation.
We recently conducted a large experiment to test the effectiveness of Go Viral! as a tool to build psychological resistance against COVID-19 misinformation. This study is currently under review, but the results were in line with our previous work, which demonstrated that people find manipulative social media content significantly less reliable after playing our fake news game. We’ve also found that playing these inoculation games boosts people’s confidence in their own judgements and reduces people’s self-reported willingness to share false content with their network. However, unlike most biological vaccines, we’ve discovered that without regular “booster shots” (e.g., retesting or replays), psychological inoculation decays over time. We also note that while our interventions help people spot fake news, this does not automatically imply they always help people identify high-quality, credible news (as even news that does not contain any misinformation can still vary widely in its quality). Lastly, an open question is how to best reach (vulnerable) subpopulations who will benefit most from the interventions.
Prebunking doesn’t need to be restricted to games. For example, TED-Ed recently released an animated video based on our research, explaining how to detect misinformation. Similarly, Twitter recently applied prebunking to warn all U.S. users about election misinformation they might come across on the platform. As with any other threat to your well-being, the key is to prepare yourself proactively before you face misinformation in the wild. As the world embarks on a COVID-19 vaccine rollout, make sure psychological inoculation is part of your 2021 starter pack.