The Battle Between Disinformation and Fact-Checking: Who’s Winning?
On May 4, 2020, a 26-minute video titled Plandemic was posted on Facebook, YouTube, and other sites. It made a number of unproven charges against the scientific community and public leaders, such as that the flu shot can cause COVID-19 and that wearing a mask can cause people to infect themselves with their own breath. The video featured an extensive interview with Dr. Judy Mikovits, a retrovirus scientist who had already had a published paper retracted by the journal Science. By the morning of May 5, a Facebook group supportive of QAnon had shared this "Exclusive Content, Must Watch" video with its 25,000 members. By that afternoon, Christiane Northrup, a physician and vaccine skeptic with nearly 500,000 Facebook followers, had also shared it. By that evening, Plandemic had entered the political arena as it went viral on sites critical of vaccines and government. On May 7, BuzzFeed posted an article citing flaws in Plandemic, and that same day the video was removed from Facebook and YouTube.
While this would seem a success story, in which fact-checking took down disinformation, it is a failure story too. It took three days to debunk the video, and by that time it had spread to millions. This time lag persists today. While there are nearly a dozen respected online fact-checking sites, they must choose among all of the disinformation flooding the Internet and gather evidence before they can publish. As the saying often attributed to Mark Twain puts it, "a lie can travel halfway around the world while the truth is putting on its shoes." Even sites such as Facebook and Google that have their own fact-checking mechanisms will often alert users to disinformation but not remove it, allowing users to keep sharing it if they wish.
Even if the time lag were minutes rather than days, that would help only if three further conditions were met. First, fact-checking resources must be sufficient to address at least the most egregious false claims. Second, people must actually be seeking facts that can debunk disinformation. Third, people must be rational actors who, given facts, are willing to change their minds.
Two data points suggest the extent of the disinformation problem. As of February 2023, 42 percent of Americans reported seeing false information about politics in just the preceding week. As Internet traffic has grown, so has the disinformation problem. NewsGuard, which tracks disinformation and source credibility, found that in 2019, 8 percent of 8.6 billion tracked news interactions came from dubious sources. By 2020, 17 percent of 16.3 billion interactions, some 2.8 billion posts, came from dubious sources. NewsGuard also found that 800 websites use artificial intelligence to create stories with scant human supervision, and that these stories often contain false information and fabricated events.
Disinformation and fake news are now also propagated by phone, flyers, political campaign speakers, ads, and small and large group meetings. Arrayed against this volume are just 11 U.S. fact-checking organizations (e.g., FactCheck.org and PolitiFact), some state efforts, the staff and algorithms of platforms such as Google, Facebook, and YouTube, and small print and broadcast outlets, all of which have limited resources.
Even for the limited supply of fact-check stories that does exist, we don't have reliable data on how many people actually seek them out and read them. If Plandemic was viewed and shared by millions, how many actually saw FactCheck.org's rebuttal? Some research suggests that liberals are more aware of, and more likely to use, fact-checking sites than conservatives, and that while Republicans may support the idea of fact-checking, they don't want it applied to Donald Trump.
Of course, the bottom-line question is whether those who read fact-checking stories actually change their minds when the facts don't fit what they already think. In a provocative paper, the University of Queensland's Luke Munn challenges a core assumption of fact-checking efforts: that we are rational and ethical automatons. He argues instead that human thinking is powerfully governed by emotions, factional social groups, and prejudices. How we feel, what in-group we identify with, and what biases we hold all shape how we approach information and our willingness and ability to deal with disinformation. For example, a study by Marius Boeltzig and colleagues found that we are much more likely to accept biased information from people we like.
Many other cognitive distortions can lead people to reject fact-check information. One is the illusory truth effect, in which we come to believe false information because it has been repeated so often that we assume it must be true. Cognitive dissonance is another: when confronted with information that contradicts our beliefs, such as the 1950s finding that smoking causes cancer, some people double down, refusing to change their thinking because they don't want to change their behavior. They may even resort to a further thinking mistake, confirmation bias, in which they deliberately search for information to discredit the fact-check, a process aided for decades by the tobacco industry as it fed disinformation to those eager for it.
"In a democracy," Eleanor Roosevelt said, "no one else does our thinking for us." Disinformation seeks to convince us that we can ignore that obligation of citizenship. Fact-checking resources are one tool for meeting it, but they will likely never be enough.