Terry Newell

Terry Newell is currently director of his own firm, Leadership for a Responsible Society.  His work focuses on values-based leadership, ethics, and decision making.  A former Air Force officer, Terry previously served as Director of the Horace Mann Learning Center, the training arm of the U.S. Department of Education, and as Dean of Faculty at the Federal Executive Institute.  Terry is co-editor and author of The Trusted Leader: Building the Relationships That Make Government Work (CQ Press, 2011).  He also wrote Statesmanship, Character and Leadership in America (Palgrave Macmillan, 2013) and To Serve with Honor: Doing the Right Thing in Government (Loftlands Press, 2015).

Think Anew


Question 9: Am I Willing to Change My Mind?

(Credit: commons.wikimedia.com – Andrew Dunn)

In early 1965, the United States had about 50,000 troops in South Vietnam acting primarily as advisors to the South Vietnamese army.  By the end of 1966, 385,000 U.S. combat troops were engaged in fighting what had quickly become an Americanized war.  This striking buildup, which reached a peak of 543,000 in April 1969, came as a result of decisions by President Lyndon Johnson.  Yet Johnson had been advised multiple times that adding massive American forces would not lead to victory.  He acknowledged to himself that he was being sucked into a war that “Asian boys” should fight but was also determined not to be the first president to lose a war.

In 1964, Johnson had asked Bill Moyers, his informal Chief of Staff, to gather advice on how he should proceed in Vietnam, but as recounted in historian Barbara Tuchman’s The March of Folly, Johnson would resist Moyers’ counsel and escalate the war:

 "The Moyers network, initially created at Johnson's request for contrary views, proved too uncomfortable for the President, who did not like "dissonance" or having to face multiple options.  He shared the problem if not the flash of insight of Pope Alexander VI in his one moment of remorse when he acknowledged that a ruler never hears the truth and "ends by not wanting to hear it."  Johnson wanted his policies to be ratified, not questioned, and as the issues hardened, he avoided listening to Moyers' reports."

Johnson’s resistance to changing his mind created the quicksand that would, in the end, cost over 58,000 American lives and lead to his decision not to seek re-election as president.

“Yes,” we’d all say, “I’d change my mind on a public issue or candidate given good reasons.”  Yet many never find those reasons, as if stuck in their own political quicksand.  Between 2018 and 2020, surveys by the Pew Research Center found that nearly 90 percent of Republican and Democratic voters kept their party loyalty.  A 2004 study by Yale’s Alan Gerber found that efforts to change voters’ choice of a candidate through mailed literature produced just a 0.2 percent switch.

Certainly, we resist changing our minds when our views reflect strong values and thoughtful analysis.  Yet even then, thinking citizens must be open to different views and to deliberation and change.  That’s central to civility and democracy.  

 “The only person that likes change is a wet baby.” – Anonymous

Fact Finder

Is there evidence that our genetic makeup affects our willingness to change our political views?

The Power of the Endowment Effect

A Bernie Sanders Campaign Rally

(Credit: Vidar Nordli-Mathisen @ unsplash.com)

“Bernie!  Bernie!  Bernie!”  Many people have a favorite politician, contributing money, volunteering and cheering at rallies.  No matter what negative facts emerge about that politician, their devotion remains steadfast.  Brain science helps explain why this may be so.

Participants in an experiment were given a coffee mug they were told cost $6 and asked what they would accept to sell it.  The average: $5.25.  Other participants were asked what they’d pay to buy the same mug.  The average: $2.75.  Behavioral economist Richard Thaler, who conducted this research, named this the endowment effect.  We endow what we possess with value, making it worth more in our minds than it is for those who don’t have it.  

The endowment effect shows up in everyday life.  When we go to sell the house we’ve lived in for years, we get upset at lowball offers.  This is our home!  It holds so many memories.  We’ve put so much into it.  Yet to potential buyers, it’s just a house.

As with mugs and houses, so it is with politicians and policy views.  Once we “own” them, their value to us increases.  It’s hard to give them up.  Just the act of choosing endows our choices with value.  In one study, participants were shown vacation spots and asked to rate each. Their brains were scanned to observe the level of activation of the dopamine reward system (associated with pleasurable activity). Then, researchers took vacation spots rated exactly the same and created pairs from which participants were asked to pick which vacation they’d take.  Scanning their brains again, researchers found activation of the reward system increased from its earlier level for vacations participants chose and decreased for those they did not, even though both had been rated the same earlier.

We endow what we create with value too.  In the 1950s, Pillsbury sold a cake mix that required only water, assuming busy people would appreciate it.  Sales were disappointing.  So they changed the mix to require the maker to add eggs, oil and milk.  Sales took off.  When making the cake required no personal effort, it was hard to feel good about it.  Dan Ariely, who recounts this story in The Upside of Irrationality, used it as an example of the “IKEA effect” - we love our IKEA furniture because we assembled it.

Ideas are creations, especially when they’re our ideas.  As Ariely recounts, in a psychology experiment participants were asked to look at a problem and propose a solution.  Others were asked to look at the same problem but to evaluate a solution given to them.  In every case, people rated their own solution as more practical, more likely to succeed, and worth more of their time and money than a solution someone else gave them.

So those devotees of Bernie Sanders (or any other candidate) are steadfast in part because of the endowment effect.  They chose their candidate, made his views their own and worked on his behalf.  They feel no need to call on System 2 - careful, deliberative thinking (see Question #4).

Let’s Talk

Describe a time you changed your mind about something you were convinced was correct.  How did this happen and why?  What can you learn from this experience? 

The Status Quo Bias and Sunk Costs

Rongjun Yu and colleagues at Cambridge and University College London involved participants in a gambling task where they could choose to “stay” or “gamble” and then find out whether they won or lost.  Using functional Magnetic Resonance Imaging (fMRI) of the brain, they found that when people chose to “stay,” a pleasure center in the brain was activated, but when they took a chance, a region associated with anxiety, fear and disgust was activated instead.  They also found that subjects who gambled and lost felt worse than those who stayed put and lost.  Such brain reactions are at the heart of the status quo bias.

The study’s authors speculated that staying put may be more comforting because we feel more responsible for a bad outcome when it follows an active choice.  If switching turns out badly, we might also castigate ourselves over “what might have been.”

Another possibility was suggested in research by Cornell’s James Cutting.  In an initial study, he found that students who looked at pairs of paintings preferred famous impressionist paintings to nonfamous ones.  That could have been because the famous paintings were simply better, so in a second study he showed students a nonfamous impressionist painting four times for every famous one.  Later, when asked to choose their favorite in fifty-one pairs of a famous and a nonfamous painting, they chose the nonfamous one in eighty percent of the pairs.  He concluded that we prefer what’s familiar, which may be one reason we’re reluctant to change our minds in politics.  Others suggest we stay with the status quo because it satisfies our needs for permanence and closure (see Question #5).

A cousin to the endowment effect and status quo bias is the sunk costs bias.  People who’ve made an investment may stick with it so they don’t feel it’s been wasted.  In a study of how to help smokers quit, participants were divided into two groups.  One group (A) was promised an $800 reward if they were still not smoking after the six-month smoking cessation program.  The other group (B) had to deposit $150 to enter the program.  They would get that deposit back plus another $650 if they were not smoking after six months.  If they started smoking again, they lost their deposit and, of course, got no reward.  Note that those in Group B could only earn $650, since the other $150 was their own money.  Yet the success rate for Group B was double that of Group A.  Those who coughed up $150 didn’t want to lose their money.  The sunk costs bias is also at the heart of loyalty programs: those who pay membership fees to Amazon or Costco do a lot of business there because “I’m already a member.”

 “[The] longer we hold a belief, the more we have invested in it; the more publicly committed we are to it, the more we endow it with value and the less likely we are to give it up.” - Michael Shermer, The Believing Brain

 

(Credit: portlandcopwatch.org)

The sunk costs bias was a contributing factor in the FBI’s Brandon Mayfield fiasco.  On March 11, 2004, coordinated bombings of Madrid’s commuter train system killed 191 people and wounded 1,800.  Partial fingerprints on a bag containing detonating devices were shared with the FBI, which identified them as belonging to Mayfield (“100% verified”).  On April 13, Spanish authorities told the FBI that their examination of Mayfield’s fingerprints did not yield a match to the partial print from the bombing.  Insisting on its correctness, the FBI sent an examiner to Madrid on April 21st to explain its conclusion.  Having sunk so much effort and reputation into its investigation, the FBI arrested Mayfield on May 6.  On May 19, the Spanish National Police informed the FBI that it had positively identified the fingerprint as belonging to an Algerian national named Ouhnane Daoud.  The court released Mayfield to home detention the next day, and on May 24 the FBI withdrew its identification of Mayfield.  The ensuing lawsuits resulted in a formal apology from the government and a $2 million settlement.

The sunk costs bias also appears in political attitudes.  In a study by Jonas Kaplan and colleagues, political liberals were asked if they agreed with certain statements on political (e.g. abortion) and non-political topics (e.g. vitamin use).  They were then given contradictory arguments to their professed beliefs.  Brain scans showed that they reacted with emotion and disgust when asked to reconsider their political views.  They were more willing to reconsider non-political views, and the brain did not register the same reaction when doing so.  

The sunk costs bias, as we saw with President Johnson, is also evident in the American presidents who escalated our commitment in the Vietnam War despite years of evidence that it was unwinnable, and in the twenty-year failed effort in Afghanistan.

Let’s Talk

Why does it take a very long time for public opinion on such issues as global warming to change? 

Cognitive Dissonance

In 1954, social psychologist Leon Festinger and colleagues joined a doomsday group convinced it had a message from “The Guardians” on planet Clarion that a catastrophic flood would destroy the world on December 21st of that year.  Group members quit their jobs, sold their possessions and awaited the fateful day.  When it came – and went – without the predicted disaster, their leader claimed they’d been spared by “the force of God and light.”  Group members decided that the failure of the prophecy could be explained since December 21st was just a practice session.  They continued to spread their beliefs with greater fervor.  Festinger later labeled this phenomenon cognitive dissonance.  When confronted with inconsistency between facts and our ideas, thoughts or beliefs, we feel compelled to close the gap.  Admitting we’re wrong is one solution.  But often we come up with explanations that allow us to continue holding the false belief.  We may even proselytize for it, as did the Doomsday group.   

Smokers Face Cognitive Dissonance When Presented with Evidence of Smoking’s Dangers

(Credit: Lika Malic @ unsplash.com)

Fact Finder

What are other examples of cognitive dissonance?

A belief is a complex set of neural connections that form a module in the brain.  The more we reaffirm that belief (those connections), the stronger it gets.  This is helped by dopamine, which is released when a belief is pleasurable.  To get that dopamine rush again, we may seek confirming evidence of our belief.  A challenge to the belief triggers the insula, which is associated with fear and disgust.  To deal with that, we may seek a way to cling to our belief.  The doomsday group did both.

 “A man with a conviction is a hard man to change. Tell him you disagree and he turns away. Show him facts or figures and he questions your sources. Appeal to logic and he fails to see your point.”

- Leon Festinger, A Theory of Cognitive Dissonance

In The Political Brain, psychologist Drew Westen reports on a study he conducted during the 2004 campaign between George W. Bush and John Kerry.  Participants were given a statement that a candidate made and then a second candidate statement directly contradicting the first.  After reflection, participants were asked to rate whether they thought the candidate contradicted himself, from a low of 1 (strongly disagree) to a high of 4 (strongly agree).  Participants had no trouble seeing a contradiction for the candidate they did not like (average rating near 4) but had trouble agreeing their preferred candidate contradicted himself (average close to 2).  Scans of participants’ brains showed that neural circuits involved with logical reasoning were depressed and those involved with positive emotions turned on.  As Westen concluded, the partisan brain “worked overtime to feel good, activating reward circuits that give participants a jolt of positive reinforcement for their biased reasoning.” 

How such rationalization happens was suggested by David Perkins in his book Informal Reasoning and Education.  He gave participants social issues, such as whether spending more on education would improve teaching and learning, and asked them for their initial judgment.  Then, each participant was asked to write all the reasons they could think of on either side of the argument.  Perkins scored each reason as a “my side” or “other side” argument.  People came up with far more “my side” arguments, and the higher their IQ, the more “my side” arguments they generated. 

So What? Now What?

 Thinking citizens stay alert to these thinking traps:

  • Endowment effect: we place extra value on what we own

  • Status quo bias: we prefer things as they are, being reluctant to change

  • Sunk costs: we keep sinking more into a commitment even when we should abandon it

  • Cognitive dissonance: when met with contrary facts, we adjust our thinking or actions in unhelpful ways to close the gap

People do change their minds.  During the Constitutional Convention, some delegates demanded a statement of rights guaranteed by the new form of government.  James Madison resisted, arguing that if some rights were spelled out in the Constitution, the implication would be that rights not named would not be protected.  He carried the day.  Yet as a newly elected Member of the First Congress, Madison proposed and even wrote amendments to the Constitution which we know today as “The Bill of Rights.”  He changed his mind, dealing successfully with the thinking traps we’ve discussed.  Specific action steps can help us do so.

  • Use Logic to Decrease the Fear of Changing.  As we saw in Yu’s “stay” or “gamble” betting experiment, the fear of losing triggers the emotional center of the brain and the status quo bias.  One way to counter the fear of changing our mind on public issues comes from Cognitive Behavioral Therapy.  People who suffer from paralyzing anxiety are asked to “catastrophize” - to list all of the things they worry will happen if they do what they are anxious about.  Then, they estimate the actual likelihood that any of them have happened or will happen.  This helps them realize their fears are greatly magnified and frees them to begin making changes in their lives.  Madison, in the same way, came to see that his fears were magnified and unlikely to come to pass.

  • Envision Positive Outcomes from Changing.  In her book The Influential Mind, neuroscientist Tali Sharot reports that showing parents research that the MMR vaccine does not cause autism doesn’t change their resistance to vaccination.  “In fact,” she writes, “by repeating the myths regarding the MMR vaccine . . .  people sometimes wind up remembering the myths rather than the counterevidence.”  “When an established belief is difficult to weed out,” she notes, “seeding a new one may be the answer.”  Showing parents how the vaccine prevents disease is thus more likely to work.

    Sharing the story of Minnesota resident Mark Korin with others is one way this can work.  He refused vaccination and ended up spending two months hospitalized with COVID.  “You’re talking to a guy who should be dead,” he said after battling the disease.  “If I had taken the vaccine, I believe that I may have gotten sick, I probably wouldn’t be in the hospital.  If I was in the hospital, I probably wouldn’t be on the ventilator.”

Thus, if you find yourself resisting rethinking your views on a public issue, put your fears off to the side and think of possible benefits of seeing it differently. Madison realized a statement of rights would strengthen commitment to the Constitution among its opponents and foster more fidelity to those rights because they’d now be enshrined in law.

 “There’s nothing more dangerous than an idea when it’s the only one you have.”

- Emile Chartier, French philosopher

  • Increase Exposure to Other Ideas.  There’s little likelihood of changing your mind if you can think of nothing else or if you have the illusion of knowledge (see Question #5).  Deliberately exposing yourself to how others think about a public issue is important.  A classic case is Nelson Mandela’s behavior while imprisoned for 18 years on Robben Island.  He could have deepened his hatred of Afrikaners and emerged from prison more embittered and still intent on violent change.  Instead, he learned their language, befriended many of his jailers and gained a deeper understanding of their fears.  While he would continue the fight to abolish apartheid, his newly acquired knowledge changed him from advocating armed resistance to negotiating a political transition.  In time, he earned the respect of Afrikaner President F.W. de Klerk, whom he eventually succeeded.  So:

 o   Get to Know Your Political Opponents:  Meet them on a personal level, over a cup of coffee during which politics is not discussed.  This can help build a relationship in which you – and they – are more open to each other’s views.

 o   Gain More Knowledge: Read books and articles, and surf websites, that offer innovative ideas on issues.  Get out of your comfort zone by listening less to pundits whose views you can predict.

 o   Surf the Internet Anonymously:  Some search engines allow this.  It shields you from algorithms designed to feed you content similar to what you’ve looked at before.

 o   Visit Places: Travel outside your ideological and geographic communities.  If you do, you can encounter different people and ideas. Mark Twain said “travel is fatal to prejudice, bigotry and narrow-mindedness.” 

Fact Finder

What are some ways to have a good, non-defensive conversation on a political topic with someone who has a different view?

  • Encourage Dissent.  Political scientist Phil Tetlock learned a lot from studying “superforecasters” – people whose success at predictions is well above average.  He learned that they invited criticism of their ideas from others.  Most of us don’t like arguments, which is why we avoid hearing criticism.  The state convention debates over ratifying the Constitution were a classic demonstration of the power of airing dissenting views.  In Virginia, Madison debated Patrick Henry, a staunch opponent of the new Constitution.  Their differing views gave that state’s convention a clear picture of the strengths and weaknesses of the document – and helped change Madison’s mind on the need for a bill of rights.

  • Defend a Different View.  In an intriguing experiment by researchers at Lund University, participants filled out a survey about the extent to which they agreed with 12 political views.  After that, researchers deliberately and surreptitiously changed participants’ stated extent of agreement on two of the items and fed the survey results back to them.  Later that day, and again a week later, participants were asked to state their views a second time.  Their views had shifted in the direction of the manipulation, suggesting they were not as solid as participants may have thought.  As the researchers concluded, “people have a pretty high degree of flexibility about their political views once you strip away the things that normally make them defensive.”

One approach to loosening one’s hold on political views is that used in training debaters.  A debater may be assigned a position to defend that contradicts her own view, forcing her to see the other side of an issue.  Afterwards, she may stick with her original opinion, but the mental exercise is useful throughout life.

  • Avoid Being Too Certain.  In a study by researchers at MIT and Emory University, nearly 3,000 respondents were asked, on a scale of 0 to 100 percent, how certain they were of their political beliefs.  Nearly a third (31.4 percent) of “extremely Left-wing” and 40.4 percent of “extremely Right-wing” respondents reported being “absolutely certain,” compared to only 6.8 percent of all others.  Being “absolutely certain” closes the mind to the possibility of change and to the strategies described here.

Democracy’s advance depends on openness to change.  Jefferson noted that as the mind “becomes more developed, more enlightened, as new discoveries are made, new truths discovered and manners and opinions change, with the change of circumstances, institutions must advance also to keep pace with the times.”  One force aimed at changing our minds is “public opinion.”  Question #10 focuses on that force.