Terry Newell

Terry Newell is currently director of his own firm, Leadership for a Responsible Society.  His work focuses on values-based leadership, ethics, and decision making.  A former Air Force officer, Terry also previously served as Director of the Horace Mann Learning Center, the training arm of the U.S. Department of Education, and as Dean of Faculty at the Federal Executive Institute.  Terry is co-editor and author of The Trusted Leader: Building the Relationships That Make Government Work (CQ Press, 2011).  He also wrote Statesmanship, Character and Leadership in America (Palgrave Macmillan, 2013) and To Serve with Honor: Doing the Right Thing in Government (Loftlands Press 2015).

Think Anew


Question 12: How Do I Avoid Disinformation and Fake News?

This Facebook post circulated after the presidential debate of October 22, 2020, and was just one of many claiming that President Trump had said "good" when Joe Biden noted that 525 migrant children separated from their parents could not be located. One tweet alone got 50,000 "likes" and was shared 11,000 times, but President Trump never said "good." The exchange between Biden and Trump with moderator Kristen Welker actually went like this:

Welker: Let’s move on to the next section.

Biden: That’s right. And you — 525 kids not knowing where in God’s name they’re going to be and lost their parents.

Trump: Go ahead.

Trump was agreeing to move on to the next part of the debate. He never said "good" about children being separated from their parents. The spread of false information is a problem in any society, and the problem is magnified by the ubiquity of communication vehicles: cable, talk radio, websites and social media. The challenge for thoughtful citizens is finding the truth on public issues, and that requires being able to spot falsehood.

The Multiple Faces of False Information

Does 5G technology cause COVID-19? That claim is part of a popular conspiracy theory that many Americans have believed and shared widely. While there is no scientific evidence for it, many people passed it along believing it was true, with no intent to deceive others. That is spreading misinformation.

During the campaign for the 2016 presidential election, more than 3,000 Russian-sponsored ads flooded Facebook aimed at dividing Americans and sometimes supporting Donald Trump.  One online group, called “AI” for “All Invaders,” posted false material designed to stoke Anti-Muslim hatred.  It had more than 183,000 followers.  The intentional spread of information known to be false is disinformation.

During Barack Obama's 2008 campaign for president, some influential conservatives pushed the claim that he was born in Kenya and that his American birth certificate could not be found. This charge, which if true would have made him ineligible to be president, became a widely reported news story, often spread by those who knew it was bogus. When misinformation or disinformation is deliberately and widely spread on various platforms, it's called fake news.

Fact Finder

What’s another “fake news” story circulating – and how do you know it’s false?


In recent years, “deep fakes” have emerged.  They use sophisticated software to create false images, videos and speech.  They were used, for example, to “show” Joe Biden endorsing Donald Trump for president in 2020. 

Artificial intelligence (A.I.) software can now create fake images from text prompts alone, such as this fake picture of Donald Trump being arrested in New York City. This faked photo, created by Eliot Higgins of Bellingcat, got 6.4 million views when posted online, and many people had a hard time recognizing it as bogus.


Faked Photo Created by A.I. Technology

Deep fake software is increasingly available to everyone. A mother in Bucks County, PA, was charged with sending three girls in her daughter's cheerleading program text messages containing doctored images that grafted their faces onto photos of women who were nude or drinking.

 “If we’re no longer operating from the same foundation of facts, then it’s going to be a lot harder to have conversations as a country.” - Dustin Carnahan, Michigan State University

Disinformation Spikes Cigarette Sales

(Credit: pixabay.com)

The Danger to Democracy

In December 1952, Reader's Digest published an article, "Cancer by the Carton," with evidence linking smoking and lung cancer. Soon, U.S. tobacco companies saw a decline in cigarette sales. In response, they created the Tobacco Industry Research Committee (TIRC), which pledged to use eminent scientists to study the issue. For decades, the TIRC misled the public about the dangers of smoking, sometimes by sharing out-of-context quotes from scientific papers and by trying to discredit research confirming those dangers. The TIRC's disinformation campaign sparked a sales rebound. Its approach has recently been compared to efforts by energy companies to combat measures to curtail the use of fossil fuels.

Let’s Talk

How might disinformation about crime harm democracy?

Disinformation and fake news distort decision making. They make it hard to distinguish fact from fiction, leading many to believe falsehoods.  They lead some to refrain from forming any opinion because they don’t know what to believe.  A 2017 study found that 10-20 percent of Americans believed fake news stories they had seen about the 2016 presidential election and roughly double that number were not sure what to believe.  As the Washington Post said about Russian disinformation efforts: “They fling up swarms of falsehoods…. not so much to persuade people as to bewilder them.” It’s not surprising then that a 2019 Pew Research poll found 50 percent of respondents listed made-up news as one of the nation’s biggest problems.

“The ideal subject of totalitarian rule is not the convinced Nazi or the convinced Communist, but people for whom the distinction between fact and fiction and the distinction between true and false no longer exist.” - Hannah Arendt, The Origins of Totalitarianism 

How and Why Does False Information Spread?

(Credit: wikipedia.com)

On May 4, 2020, the video Plandemic was posted on Facebook, YouTube and Vimeo. Featuring Dr. Judy Mikovits, it claimed a cabal was harming people and using COVID-19 to gain money and power. On the morning of May 5th, a Facebook group dedicated to QAnon posted it to its almost 25,000 members. By that afternoon, Dr. Christiane Northrup, a vaccine skeptic, had shared it with her half-million followers, and by that evening Plandemic was on Reopen Alabama, a Facebook page with over 36,000 followers. The video continued its viral spread on May 6th. On May 7th, a BuzzFeed article highlighted falsehoods in the video, and that same day both YouTube and Facebook removed it for violating their misinformation policies. But the damage had been done.

This is not atypical. A study by Soroush Vosoughi and colleagues of 126,000 stories tweeted by about three million people between 2006 and 2017 found that the top one percent of false stories reached up to 100,000 people, whereas true ones rarely diffused to more than a thousand. False stories, they found, spread not only wider but faster.

“Falsehood flies and the Truth comes limping after it.” - Jonathan Swift

As thinking citizens, we can avoid being taken in if we understand how false information spreads.

  • The Distorting Power of "Authority" and Social Media: As Plandemic showed, once falsehoods are shared by media figures with a wide following, they spread virally, especially if these personalities are considered authorities. Dr. Christiane Northrup's 500,000 followers no doubt included many who had hundreds of followers of their own. Yet an "authority" on social media is not necessarily an expert on the topics they post about (see Question #6 on how to judge expertise). That's the authority trap.

Fact Finder

What is an example of another falsehood that spread virally on social media?

The authority trap matters because of social contagion, the viral spread of behavior and emotions through a group. Social contagion can be helpful when what spreads is truthful, as with the "Ice Bucket Challenge," which encouraged people to pour ice water over their heads and pledge money to fight ALS. Yet when social contagion spreads disinformation, as it did in the lead-up to the storming of the Capitol on January 6, 2021, it's dangerous.

The Ice Bucket Challenge

(Credit: Major Tom Agency - unsplash.com)

  • Cascade Effects: Whatever the topic - climate change, racial discrimination, Russia or others - most of us are not experts. We'd prefer to form political opinions by listening to true experts, but it takes work to find them and digest what they say. Thus, many people follow the posts and views of those in their regular network - friends, family, co-workers and celebrities in the news and on social media. We adopt and spread what they believe. This leads to an information cascade: we decide what information to share based on what others are sharing, and so that view gains traction and spreads.

Even if people are skeptical of what their personal network says, it's often psychologically easier to go along. For example, if friends' posts insist that an allegation of sexual harassment against a local politician is true, it takes courage to say, "wait, I'm not sure I agree." So the allegation gains force because no one wants to be criticized or ostracized for disagreeing. That's a reputational cascade: we share information, even if we suspect it's wrong, just to keep our reputation intact.

  • Partisan Polarization: Americans with strong party identities often approach political information with a partisan bias. When it comes to fake news, Mathias Osmundsen and colleagues studied how 2,300 American Twitter users handled fake news in a collection of 500,000 news headlines. Their conclusion: "individuals who report hating their political opponents are those most likely to share political fake news and selectively share content that is useful for derogating their opponents."

In another study, researchers found that both liberals and conservatives are more likely to accept misinformation if it fits with their political ideology, especially if it comes from a source they consider credible.

Let’s Talk

Some 60 court cases rejected claims that the 2020 presidential election was "stolen" through voter fraud. How did authority figures, cascade effects and political polarization feed disinformation about that election?

  • The Illusory Truth Effect: Sometimes things seem true just because they get repeated. In a study by Gordon Pennycook and colleagues, participants were presented with fake news headlines disguised as Facebook posts. With repeated exposure to the headlines, including a week later, participants became more confident the headlines were accurate, even when they carried the common social media label "Disputed by 3rd Party Fact-Checkers." The researchers explain this using the psychological concept of "fluency": frequent repetition makes information easier for the brain to process and accept.

Pennycook also described the implied truth effect. His research showed that when some headlines in a set are tagged with warnings about their accuracy, people assume the untagged ones are accurate even though they may not be.

  • Emotional Suppression of Rational Thought: In their study of 126,000 contested tweets, MIT researchers found that fake tweets generated emotions associated with anger and disgust, while accurate tweets were more likely to be associated with sadness and trust. Emotions (see Question #4) excite the brain's amygdala, which can fix falsehoods as false memories in the hippocampus, making them hard to dislodge. Emotional reactions also suppress logical thinking.

So What? Now What?

 To avoid the dangers of disinformation and fake news, thinking citizens stay alert to these thinking traps:

  • Social contagion: the spread of emotions and/or behavior through a group

  • The authority trap: deference to authority figures even if they lack expertise

  • Information cascades: sharing information simply because others are sharing it

  • Reputational cascades: sharing information, even if we suspect it's wrong, to keep our reputation intact

  • The illusory truth effect: accepting something as true because we hear it often

(Photo Credit: roxanne-desgagnes-unsplash)

Assume you saw this post on your Facebook news feed (a headline about global warming, attributed to a "Dr. DuPres" and appearing to come from an NBC News site), posted by a friend who asks you to share it. Within the hour, you get the same post four times in other social media. Two of those who posted it are active in Democratic Party politics. One is especially active in efforts to reduce global warming. You've networked with them before, and they ask you to pass the article on to your network and sign a petition on www.change.org.

Based on what we've covered thus far, you'd already have some questions in mind: (1) Is Dr. DuPres a respected authority? (2) Am I being ushered onto an information cascade? (3) Does my reputation among these friends depend on whether I do what they ask? (4) I'm seeing this headline a lot, but is it really true?

Such questions are a first step, but what do you do next? A regard for truth argues for taking three precautions:

  • Don't Fall Prey to "Reflexive Open-Mindedness": That's the tendency to be too accepting of possibly weak claims. A study at University College London suggests this is particularly prevalent among those at the political extremes, who most want to believe information that supports their partisan views.

  • Don't Stop Your Search for Evidence: Research suggests that once people hold a certain belief, they're less interested in seeking more information, including information that might change their belief. Yet whether a photo is a deep fake, for example, can often be determined with a small investment of time. Steps can include: (1) try to find and question the source, and (2) look at the photo in the highest resolution possible, then zoom in to see if parts of the image (e.g., hands and the background) are distorted. (A short script sketching this zoom-in step appears after this list.)

  • Don't Share Too Quickly: Ben Sasse, in his book Them: Why We Hate Each Other and How to Heal, highlights some of what drives hasty sharing. "Only 55 percent of Americans spend more than 15 seconds reading an [online] article," he notes, which is why the "share" button sits at the top of the article rather than at the end. A Pew Research poll found that 23 percent of Americans admit to having shared fake news.

"When it comes to consuming news, we're miles wide and an inch deep." - Ben Sasse, Them

Once you commit to more careful analysis of information you get, a lot of steps are possible.  

  • Delay to Give Yourself Time to Deliberate: The 2021 Edelman Trust Barometer found that 52 percent of its survey respondents share or forward news items they find interesting, but only 25 percent of them practice what Edelman calls "good information hygiene": avoiding information echo chambers, verifying information, and refraining from sharing unvetted information.

Careful analysis of information takes time, and delaying quick acceptance of potential disinformation pays dividends. Gordon Pennycook found that even a 30-second video asking people to think about the accuracy of what they might share cut the sharing of fake news in half. He also found that taking even a moment to contemplate information made people three times more careful before sharing. The headline about global warming above should encourage such a delay, as well as careful thought about its truthfulness.

  • Use Fact-Checking Websites: In a January 21, 2022 interview, comedian Bill Maher claimed that COVID booster shots were "useless" and could cause the immune system to become fatigued and weakened. Being alert to the authority trap should make one immediately wonder what scientific expertise Maher (or "Dr. DuPres" in the fake news post above) has, but you could also check such claims on objective and respected sites that devote considerable resources to verifying the accuracy of information and news. The table below lists some of these.

Fact-Check Site                                Focus of Site
www.wikipedia.org                              carefully researched encyclopedia
www.factcheck.org                              accuracy of political news
www.snopes.com                                 debunking urban legends
www.factcheck.org/scicheck                     science claims by partisans
https://mediabiasfactcheck.com                 media bias/deceptive news practices
www.poynter.org/teen-fact-checking-network     daily fact-checking for teens by teens
www.usafacts.org                               U.S. government spending, revenue, demographics and performance data

  • Apply Journalistic Fact-Checking Steps: Stanford professor Sam Wineburg's research shows that professional fact-checkers adopt an important strategy when evaluating digital content. Unlike most of us, who read an online article vertically from the top down to evaluate its truthfulness, trained fact-checkers read laterally. That is, they take off-ramps to additional websites as they read, to verify the information they're getting.

The Verified initiative of the United Nations, part of the UN Department of Global Communications, offers a number of suggestions for how to do this:

o   Make Sure the Website is Legitimate: It should have an "about" page and a "contact" page, as well as bios and perhaps photos of its staff, all of which you can verify with another lateral web check. Also check the URL. Doing so, you'd find that the site listed in the DuPres article above, nbcnews.com.co, is fake. (A short script after this list sketches one way to test a suspicious URL.)

o   Look for Obvious Bias:  A focus on a specific political agenda or the use of stereotypes are warning signs to be careful.

o   Compare the Date of the Article with the Date of its Sources:  If the sources are very old, the article may mislead by using outdated or irrelevant information.

o   Check Out the Author(s):  Search online to find out whether they even exist and, if so, their expertise.  You can also do this with references cited in the article.

o   Cross-Check the Article Against Others: If other reputable sources are saying something quite different, be very suspicious. You wouldn't find any reputable sources reporting the claim by the fictitious Dr. DuPres.

o   Watch Out for Click-Bait Titles: Titles with highly emotional content, exclamation points, and words such as "blockbuster report" are "bait" designed to get you to click on and accept them.
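To make the URL check above concrete, here is a minimal sketch in Python (standard library only) of how a script might flag a look-alike address such as nbcnews.com.co. The allowlist of trusted domains is illustrative only, an assumption for the example, not an authoritative list.

    # Minimal sketch: flag hostnames that contain a trusted brand name but are not the trusted domain.
    # The allowlist below is illustrative, not authoritative.
    from urllib.parse import urlparse

    KNOWN_DOMAINS = {"nbcnews.com", "apnews.com", "reuters.com"}

    def looks_like_impostor(url: str) -> bool:
        host = (urlparse(url).hostname or "").lower()
        for trusted in KNOWN_DOMAINS:
            brand = trusted.split(".")[0]                 # e.g. "nbcnews"
            if brand in host and host != trusted and not host.endswith("." + trusted):
                return True                               # looks like the brand, but the wrong domain
        return False

    print(looks_like_impostor("https://nbcnews.com.co/some-story"))   # True: an extra ".co" is tacked on
    print(looks_like_impostor("https://www.nbcnews.com/some-story"))  # False: the real domain

A script like this is no substitute for the lateral checks above; it only catches the crudest impostors, such as domains that tack an extra suffix onto a familiar name.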

There are some other obvious clues that you may be looking at disinformation: Does the title/article promise something catastrophic or too good to be true?  Does it show obvious bias?  Does it play to your biases?  Does it scapegoat individuals or groups?  Does it attack a person rather than the person’s position? Does it have grammatical or spelling errors? Is it designed to ramp up your emotions using wild claims or loaded language?  Our brains evolved to spot deviations from normality or we would not have survived as a species.  So we are prone to pay more attention to emotionally striking information.  

  • Learn Media Literacy: Lateral reading tips are just one part of the wider topic of media literacy, which applies not only to websites but to all information sources. A growing number of organizations (with websites and programs) teach media literacy skills. Some free sites worth exploring are listed below.

 o   News Literacy Project (www.newslit.org)

Improves the ability to consume news and information by spotting misinformation, using News Lit self-tests (e.g. "Should You Share It?"; "Vetting News Sources for Credibility")

o   Civic Online Reasoning (cor.stanford.edu)

 Lessons, videos and assessments to teach students how to evaluate online information (e.g. “Sort Fact from Fiction Online with Lateral Reading”)

 o   Digital Literacy Project (https://www.learningforjustice.org/frameworks/digital-literacy)

K-12 student lessons and videos (e.g. evaluating online sources, digital tools for active citizenship)

  • Inoculate Yourself Against Disinformation and Fake News Through Prebunking: Facebook and Twitter use algorithms and fact-checkers to spot and remove disinformation and fake news, and some sites add warning labels to unverified content. The Verified initiative highlighted above recruited 70,000 volunteers in 130 countries to spot disinformation about COVID-19 and alert readers to it. Such debunking approaches aim to spot and correct false information, but it's a herculean task to apply them to all of the sites and sources of misinformation, disinformation and fake news. Pre-emptive, or prebunking, approaches are an alternative. Their goal is to inoculate information consumers so they become resistant to false information.

University of Cambridge professor of social psychology Sander van der Linden and colleagues developed Bad News, an online game to help people spot misinformation in tweets and headlines. Tested on 14,000 participants, it improved their skill, and that improvement seemed to last for some months. A comparable version, Go Viral!, was created to inoculate people against COVID-19 falsehoods.

Many of the thinking traps covered earlier, and the techniques for addressing them, also apply to defending against disinformation and fake news. These include avoiding emotional hijacking (Question #4), bogus claims of expertise (Question #6), false, implanted memories (Question #8), biased framing of issues (Question #10) and, of course, conspiracy theories (Question #11). As we've seen here, a lot of social pressure is often applied to get us to agree with what turns out to be false information. In Question #13, we'll focus on how to avoid just going along with the group. In Question #14 we'll look at how to analyze political ads, and in Question #15 we'll focus on how to detect political lies.