If you were trying to picture a Misinformation Spreader, your first step probably wouldn’t be to look in the mirror. But in our work examining the psychology of misinformation, we discovered something surprising: not only are we all susceptible to becoming Misinformation Spreaders, but the methods many of us are using to stop the spread of falsehoods could be inadvertently propagating them.
Addressing this behavioral blindspot is urgent. With days left until Election Day and millions of Americans already heading to the polls, election officials, journalists, social media companies, and concerned citizens are working overtime to tighten the misinformation spigot. But without a deeper understanding of how people start to believe and share misinformation, many well-intentioned people could unwittingly accelerate the flow of falsehoods.
It’s vexingly easy to be a spreader of misinformation, but becoming someone who inhibits its spread can be intuitive, too, once you have a better understanding of the role that psychology plays. Below, we share three research-backed tips for being a smarter sharer of election information on social media—the equivalent of wearing a mask or physically distancing online.
Focus on the facts. Don’t repeat misinformation, no matter what.
Let’s take media outlets as an example. Journalists are dedicated to finding and sharing facts. And yet, even well-intentioned reporters can inadvertently spread misinformation, even as they seek to raise awareness around its dubiousness. For instance, the Associated Press produces Not Real News, a weekly roundup that consists of “some of the most popular but completely untrue stories and visuals of the week. None of these are legit, even though they were shared widely on social media.” Another common strategy to squelch misinformation, used by all kinds of communicators, is myth-busting, where myths are often bolded and highlighted, and the facts are detailed less colorfully (and less prominently) below. Even more pernicious are statements that attempt to negate a myth, such as “you shouldn’t vote twice.”
When misinformation circulates, it’s tempting to repeat the same falsehood, interact directly with the post, or harvest it for an eye-catching tweet. In each case, these good intentions can backfire. These strategies risk bumping the misinformation up in social media algorithms and exposing more people to the lie, or exposing them to it more than once. And that repeated exposure can be dangerous: behavioral science research suggests that people are more likely to believe, and also share, false claims after hearing them just once—or even to remember them as true. A sense of familiarity can be mistaken for veracity.
Instead of staying anchored to myths (which we know spread faster than facts) or attempting to negate them, rewrite posts and communications to focus on the facts, avoiding any repetition of falsehoods. In other words, rebuild a narrative around the truth. “You shouldn’t vote twice” becomes “you should vote one time.” Another example is this guide to voting at home developed by the National Vote at Home Institute with our support. The guide acknowledges misinformation without repeating it, explaining that “there are many myths about the vote-by-mail process, and information voters hear from politicians may not always be true. The truth is that voting by mail is safe, secure, and reliable.”
Make specific recommendations. Don’t share vague warnings.
Another well-intentioned strategy that can backfire: general warnings about misinformation. Research suggests that when voters lack a clear understanding of where misinformation can come from or what it might look like, they begin to distrust all sources of information. Instead, communicators can get more specific, including in their messages details about the specific falsehood they’re warning against.
For instance, a warning about misinformation around when and where to vote on Election Day could be replaced with posts that clearly share—you guessed it—the facts on where and when to vote on Election Day. Communicators can also go beyond the misinformation warning by sharing where voters can find facts, or by sharing tools to help voters recognize not only that misinformation exists, but also how it works.
This tweet from the city of San Rafael, California, which pairs a warning about misinformation with a link to where voters can go to find the truth, is a great example. So is this game, created by researchers at the University of Cambridge and the U.K. government, which shows people how misinformation metastasizes, and how all of us are vulnerable to spreading it.
Aim for repetition and redundancy. Don’t try to vary your messaging.
Part of learning how to fight misinformation more effectively could also mean putting certain social media best practices on a brief hiatus. Chances are, if you’re a social media user, you’ve been taught to vary the framing and language of your posts even if you’re saying something similar. Creativity is king, and drives more engagement. But in the fight against misinformation, redundancy reigns.
The key is repeating the facts—with the same language—over and over, across different platforms. One easy way to do this: retweet official information from election administrators who don’t have large social media followings. That way, you can help reduce the chances that voters will get conflicting messages from different sources.
Most of us, of course, are not Misinformation Spreaders. But some of us may be more like a little kid who sneezes in the face of an adult: they aren’t trying to spread their germs. They just haven’t learned that most crucial form of etiquette: cover your nose.
It’s not too late for all of us to learn how to be more responsible stewards of information online. The health of our democracy depends on it.
Full disclosure: Elizabeth Weingarten is Managing Editor of the Behavioral Scientist and a Senior Associate at ideas42, one of the Behavioral Scientist’s Founding Partners.