Terry Newell

Terry Newell is currently director of his own firm, Leadership for a Responsible Society.  His work focuses on values-based leadership, ethics, and decision making.  A former Air Force officer, Terry also previously served as Director of the Horace Mann Learning Center, the training arm of the U.S. Department of Education, and as Dean of Faculty at the Federal Executive Institute.  Terry is co-editor and author of The Trusted Leader: Building the Relationships That Make Government Work (CQ Press, 2011).  He also wrote Statesmanship, Character and Leadership in America (Palgrave Macmillan, 2013) and To Serve with Honor: Doing the Right Thing in Government (Loftlands Press 2015).

Think Anew

Recent Blog Posts

Why Does False Information Spread So Easily?


Assume this appeared on your Facebook feed, posted by a friend who asks you to share it.  Soon, you get the same post four times in your Twitter feed.  Two of those who posted it are active in Democratic Party politics.  One volunteers for the Environmental Defense Fund.  You’ve networked with these people before, and they’ve asked you to retweet the article to your followers and sign a petition on www.change.org.

Unfortunately, a lot of people would share such a post with others, assuming it’s true.  But there is no Dr. DuPres, the claim is bogus, and the website listed doesn’t even exist. 

Why do people accept as true, and then share, information that is false?  While some post and “push” false information deliberately to advance a political or issue agenda, many share false information unwittingly.  Senator Ben Sasse, in his book Them, notes one reason why: “Only 55 percent of Americans spend more than 15 seconds reading an [online] article,” and that’s why the “share” button sits at the top of the article rather than at the end.  Psychological research offers many additional explanations.

·       Information Cascades: Sometimes we pass on information simply because everyone else seems to be doing so.  This gets especially seductive when we’re online.  As Cass Sunstein notes in #Republic: Divided Democracy in the Age of Social Media, it’s particularly easy to encounter unchallenged false information when people are in social media “echo chambers” fostered by algorithms designed to feed them only what they want to hear.

·       Reputational Cascades: As Sunstein also notes, sometimes we pass on information because we’re concerned about our reputation if we don’t.  Failing to share may invite criticism.  Think about how the four people in the false news story above might react if their friend said, “No, I won’t do what you ask.”  That reaction can be more pronounced with political information; just ask people who have been “unfriended” by former social media friends.

·       Illusory Truth Effect:  Sometimes things seem true just because we encounter the information so often.  When you hear the same news multiple times over several hours or days, the psychological principle of fluency kicks in.  False news becomes familiar and easier to accept the more often you hear it.  A related variant of this problem is what psychologist Gordon Pennycook calls “reflexive open-mindedness” – the tendency to want to accept whatever we come across without applying critical thinking.  Pennycook’s research suggests we can hold off sharing false information if we stop ourselves for as little as 30 seconds to ask about its truthfulness, but that’s 30 seconds too long for many people.

·       Implied Truth Effect: Another problem is actually created by a good practice some social media sites have adopted: labeling certain news and posts as “not verified” or “disputed by third-party fact-checkers.”  When those labels are absent, Pennycook finds, some people assume the news must be true.  They don’t realize that site fact-checkers can’t scrutinize everything on their platforms.

·       Motivated Reasoning: While most people don’t purposely share information they know is false, sometimes they believe and share information because it fits with their politics or is taken as “proof” of their beliefs.  The latter is called confirmation bias – the tendency to search for information that fits with what we believe (and to discount information that doesn’t).

·       Emotional Suppression of Rational Thought: In their study of some 126,000 contested news stories spread on Twitter, MIT researchers found that fake tweets generated emotions associated with anger and disgust, while accurate tweets were more likely to be associated with sadness and trust.  Notice how the false story about Dr. DuPres contains a bold headline with an exclamation point, both meant to stir anger.  Emotions excite the brain’s amygdala, which then fixes false information as memories in the hippocampus.  Other research shows that some people hold onto false beliefs even after being told they’re false.  In this way, strong emotions suppress logical thought.

“Falsehood flies and the Truth comes limping after it,” Jonathan Swift said, “so that when men come to be undeceived, it is too late; the jest is over, and the tale hath had its effect.”  We’re more likely to win the race for truth if we apply critical thinking before giving in to our penchant to believe.

 (Note: The next ThinkAnew post will offer ways to spot fake news.)

Photo Credit: pixabay.com

Spotting Fake News Online

Will Laws Just Become Suggestions?

Will Laws Just Become Suggestions?