Terry Newell

Terry Newell is currently director of his own firm, Leadership for a Responsible Society. His work focuses on values-based leadership, ethics, and decision making. A former Air Force officer, Terry also previously served as Director of the Horace Mann Learning Center, the training arm of the U.S. Department of Education, and as Dean of Faculty at the Federal Executive Institute. Terry is co-editor and author of The Trusted Leader: Building the Relationships That Make Government Work (CQ Press, 2011). He also wrote Statesmanship, Character and Leadership in America (Palgrave Macmillan, 2013) and To Serve with Honor: Doing the Right Thing in Government (Loftlands Press, 2015).

Think Anew


Question 4: How Do We Make Political Decisions?

Black Lives Matter Protest, Denver, Colorado

(Credit: Colin Lloyd, Unsplash)

On April 17, 2021, protesters marched through downtown Denver under the banner of "Black Lives Matter" (BLM), one of many such marches across the nation in the past few years. Most Americans have views on the BLM movement, but how do we come to those views? How do we decide whether we want to protest – on this or any other topic? Why do some resort to violence during these protests while others do not? How does the brain help or hinder us in thinking about these things? Political decisions have consequences for us, our communities, the nation and often the world. So how do we decide?

(Credit: stress.lovetoknow.com)

Let's start with this photo. What thoughts/feelings would enter your mind if this person were facing you? Answer this question before reading further.

Now answer this question: What does 17 x 24 equal?

Most likely, the thoughts/feelings about the angry person came quickly and the math problem took longer. Here’s why.

We Draw Upon Two Systems for Thinking When We Make Political Decisions

In Thinking, Fast and Slow, Nobel Laureate Daniel Kahneman suggests we have two ways of thinking. One is automatic; it produced your quick reaction to the finger-pointing man. The other is slower; we have to use it for questions that require careful thought, like math problems.

We need both systems. What Kahneman calls System 1 takes very little energy, which is important because the brain's energy is limited. It's great for routine decisions, drawing on our experience of how the world works. If you had to meticulously weigh every possible explanation for this angry man, he might have left you flat on your back before you finished. Instead, System 1 supplies a ready-made model in your head that lets you act with little if any conscious thought. System 2 is slow and not automatic. It must be turned "on." It takes time, energy and mental concentration.

One thinking mistake we can make is using System 1 when System 2 is required.  The captain of the Titanic made just this error.

Sinking of the Titanic

(Credit: commons.wikimedia.org)

We know the Titanic hit an iceberg. But most people don't know that the captain had received iceberg warnings yet sailed west at full speed instead of turning south to avoid them. He dismissed the warnings because he had never struck an iceberg before. Since the seas were calm, he also assumed an iceberg would be easy to spot; in fact, the calm water meant there were few ripples around the ice to give it away. The captain used System 1: he relied on gut instincts built from years of sailing experience. Yet those years did not include enough experience with icebergs. He used System 1 when System 2 was needed.

Multi-Car Pileup

(Credit: cp24.com)

Let’s Talk

Test your understanding of Systems 1 and 2. Why do pileups like this happen on a rainy or foggy day?

Even politicians at the highest level can fall into the thinking trap Kahneman warns us about. "I rely on myself very much," Donald Trump once said. "I just think you have an instinct and you go with it." The president was a firm believer in the power of his (System 1) "gut" thinking. Yet it failed him in his initial approach to COVID-19, when System 2 was badly needed. He compared the virus to seasonal flu, saying on February 26, 2020: "When you have 15 people and the 15 within a couple of days is going to be down close to zero, that's a pretty good job we've done."

There are no physical brain regions that map neatly onto Systems 1 and 2, but the distinction reminds us to apply the right decision-making process to the situation we face. For example, you can rely on System 1 to drive to your polling place. (It draws on the procedural memory we discussed in Question #3.) Yet you need System 2 to make a thoughtful decision about your vote. In some cases, as we'll see, you can make use of both systems.

System 1 biases pose barriers to thinking citizenship. Yale's Geoffrey Cohen asked self-described liberals and conservatives to react to two welfare policy proposals. One offered benefits far more generous than existing policy; the other, far less generous. Unsurprisingly, liberals preferred the more generous option and conservatives the less generous one. Then, with new groups of partisans, he presented the same proposals but labeled the more generous one Republican and the less generous one Democratic. This time conservatives preferred the more generous option and liberals the less generous one. All denied that party preference affected their decision. Using System 1, each group reacted to the "Republican" or "Democratic" label instead of evaluating the proposals using System 2.

System 1 Relies on “Mental Models” – Which Can Help But Also Block Political Thinking

System 1 is very good at using our mental models (sets of beliefs) about how the world works. Our experiences and learning create these models, and we draw heavily on them. We have, for example, mental models of how to eat well and how to be a good parent. Our models may not be perfect, but in most life situations they're pretty helpful. Yet some mental models cause problems in our social and political lives. For most of our history, the mental model of marriage was that it must be between a man and a woman; it did not include gay marriage. It took decades of personal stories of gay lives, protests and rational arguments before most Americans rethought that mental model (using System 2).

Vietnam Veterans Memorial, Washington, D.C.

(Credit: Carol Donsky Newell)

Years after the Vietnam War, one of its chief architects, Defense Secretary Robert McNamara, wrote a memoir, In Retrospect, examining his mistakes. He acknowledged a flawed mental model: "We viewed the people and leaders of South Vietnam in terms of our own experience," he said. "We saw in them a thirst for - and a determination to fight for - freedom and democracy." In other words, America's leaders saw a small, weak country struggling for its freedom, just like us in 1776. We made a similar mistake in invading Iraq, convinced we'd be greeted as liberators restoring democracy, only to be treated as colonizers. Mental models embedded in System 1 thinking sometimes need to be challenged by System 2.

Let’s Talk

What mental models guide liberals? Conservatives? When are these models helpful? Harmful?

In Political Decision Making, We Must Integrate Reason and Emotion

Benjamin Franklin

(Credit: Google Art Project)

In 1772 Benjamin Franklin answered a letter from British chemist Joseph Priestley, who had asked for advice on how to make a difficult decision. Franklin shared his approach. He would, he wrote, divide a sheet of paper into two columns, which he labeled "Pro" and "Con." For a proposed decision, over several days he'd list in each column all the reasons he could think of. Then he'd weigh them against each other, crossing out pros and cons of equal weight. What remained was his "logical" conclusion, reached through what he called his "Moral or Prudential Algebra."

We often think of decision making as an entirely rational process. But note what Franklin said about comparing pros and cons: "I endeavor to estimate their respective Weights." That estimate is subjective, which means emotions come into play. Research confirms that good decisions – including political ones – require both reason and emotion. When either is ignored, decision making suffers.
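To make Franklin's procedure concrete, here is a minimal sketch in Python. It assumes each reason gets a subjective weight from 1 to 5; the decision, the reasons, the weights and the function name are all hypothetical, and the cancellation rule is simplified to pairs of equal weight.

```python
# A minimal sketch of Franklin's "Moral or Prudential Algebra."
# Everything here is illustrative: the weights stand in for the
# subjective "estimates" Franklin described.

def prudential_algebra(pros, cons):
    """Cancel pros and cons of equal weight, then compare what remains."""
    remaining_pros, remaining_cons = list(pros), list(cons)

    # Cross out one pro and one con whenever their weights are equal,
    # as Franklin did over several days of reflection.
    for pro in list(remaining_pros):
        for con in list(remaining_cons):
            if pro[1] == con[1]:
                remaining_pros.remove(pro)
                remaining_cons.remove(con)
                break

    pro_weight = sum(weight for _, weight in remaining_pros)
    con_weight = sum(weight for _, weight in remaining_cons)
    verdict = "lean yes" if pro_weight > con_weight else "lean no"
    return verdict, remaining_pros, remaining_cons

# Hypothetical decision: should I volunteer for a local get-out-the-vote drive?
pros = [("I care about turnout in my precinct", 5), ("I would meet neighbors", 2)]
cons = [("Less free time on weekends", 2), ("Awkward conversations at doors", 1)]

# Prints the verdict plus the uncancelled reasons on each side.
print(prudential_algebra(pros, cons))
```

The arithmetic is trivial; the point is the discipline of writing reasons down and weighing them, which gives System 2 a chance to weigh in.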

Neurologist Antonio Damasio wrote about "Elliot," a model worker, husband and father who had an operation to remove a small tumor from his ventromedial prefrontal cortex, a region in the front of the brain, just above and behind the eyes, that integrates reason and emotion. Afterwards, he seemed as intelligent as before, but he could no longer make decisions; he agonized over them. Damasio tested Elliot further and found he lacked emotional reactions even to images as striking as a severed foot. Damasio's research, described in his book Descartes' Error, concluded that emotions are essential to decision making. Because of his surgery, Elliot could not integrate emotions with logic.

How we combine reason and emotion is becoming clearer.  Harvard psychologist and neuroscientist Joshua Greene gave experimental participants this problem: 

A trolley is moving towards a fork in the track.  On one side are five workers.  On the other is one worker. The workers cannot hear or see the trolley coming.  You are standing by a lever.  You realize pulling it will divert the trolley onto the track that has one worker.  It will hit and kill him.  But if you do nothing, the trolley will stay on its current course, hitting and killing five workers. Would you pull the lever?  Answer for yourself before continuing.

Now, consider this problem: 

There is no fork in the track.  The trolley is heading toward five workers who cannot hear or see it coming.  There is no lever to pull, but a 300-pound man is standing on a footbridge over the track.  If you push him off it, he will fall onto the track.  The trolley will hit and kill him, but it will stop before it can harm the five workers.  Would you push the man off the footbridge?

Most participants in Greene’s study pulled the lever.  Very few pushed the man.  Yet in both cases the outcome would be identical: one life sacrificed to save five.  Why the different answers?

Greene monitored participants using functional magnetic resonance imaging (fMRI), which showed which areas of the brain were active during decision making. He found that the more personal the dilemma (pushing the man versus pulling a lever), the more the emotional regions of the brain became engaged. Understandably, putting our hands on someone and pushing him to his death raises very strong emotions. He also found that the emotional areas became engaged slightly before the logical ones, and that people who decided to push the man took longer to decide than people who refused – they had to overcome their emotional resistance. The main conclusions: (1) we feel before we think; (2) overriding those feelings takes extra time, because we have to allow logic to weigh in.

"Reason is and ought to be the slave of the passions, and can never pretend to any other office than to serve and obey them." – British Enlightenment philosopher David Hume

Hume’s observation is supported by the trolley study.  Not surprisingly, most protests, political campaigns and public policy proposals are geared to engage emotions.  Emotions are not inherently bad.  They’re only dangerous when they drive out reason.  Some of our nation’s greatest achievements have been propelled by strong emotions.  We would never have had child labor laws or unemployment insurance, for example, were it not for the emotions generated by personal and press accounts of children suffering and destitute laid-off workers and their families.  In such cases, reason turned emotional power into sensible policies.  Thinking citizens need to blend emotion and reason to make wise decisions. 

Our Subconscious Mind Shapes Political Decision Making

“But what most people are not aware of … is that most of our thought – an estimated 98 percent – is not conscious.” – cognitive linguist George Lakoff

Most people believe they do blend logic and emotion, consciously considering various sides of public issues. But our subconscious also gets into the act – and, by definition, we're not aware of its influence.

One way our subconscious impacts political decisions is through emotional hijacking. 

(Credit: Jonathan Haidt, The Happiness Hypothesis)

Psychologist Jonathan Haidt suggests that our conscious, logical brain is like a small rider atop a huge elephant – our emotional brain. That elephant can swamp conscious thought, leading to poor decisions and, sadly, political violence.

Hennepin County Sheriff’s Patrol After Riots

(Credit: Chad Davis, Creative Commons)

On August 26, 2020, Eddie Sole Jr. of Minneapolis, sought by police for a homicide, committed suicide. Because his death came soon after the police killing of George Floyd in the same city, false rumors spread quickly that police had killed Sole. Despite surveillance video confirming his suicide, rioting followed: four fires were set, 72 buildings were damaged, 132 people were arrested and two police officers were injured.

Emotions like these can spread virally, as we saw frequently in protests against COVID mask wearing and lockdowns. There are many ways such emotional contagion spreads. Social media is one. In 2012, researchers manipulated the news feeds of 689,000 Facebook users to study what would happen. Some saw a large number of positive posts; others saw a large number of negative posts. The study (rightly criticized later) found that those who got more positive posts, such as images of people embracing, posted more positive messages themselves. Those who saw more negative posts created more negative posts.

Tweets can also spark emotions, an increasingly important factor in political and disinformation campaigns. One study found that tweeting and retweeting enhances brain activity indicative of emotional arousal by 75 percent, while just reading tweets increases emotional arousal by 65 percent. In civil society, emotional hijacking propels trolling, bullying and conspiracy thinking as well as political violence.

Fact Finder

What role does emotional contagion play in teen suicides?

Sometimes our subconscious thought processing shapes political decisions in less obvious ways. Researchers from Princeton had subjects look at photographs of pairs of faces and pick the one in each pair they judged more "competent." Unknown to the subjects, each pair showed the leading candidates in an upcoming 2006 race for governor or U.S. senator. After the election, the researchers found that the person judged more competent had won 72.4 percent of the Senate races and 68.6 percent of the gubernatorial races. Yet subjects had been given less than one second to look at the faces.

In a different study, Eric Helzer and David Pizarro asked Cornell University students to complete a survey about their political attitudes. Some stood near a hand-sanitizer dispenser while doing so; others stood against a wall much farther away. Those near the sanitizer responded more conservatively. Such findings reflect the brain's subconscious processing: in these two studies, people subconsciously associated certain facial features with competence, and the urge to avoid uncleanliness with conservatism. That doesn't mean we've made good decisions – just ones not subject to conscious thinking.

Cognitive linguist George Lakoff suggests that political thinking can be guided by deep metaphors we don't consciously recognize. In political disagreements, he illustrates, a subconscious metaphor guiding many people is "argument = war or struggle." We see it in our language: "he won that argument," "she shot down that proposal," "his criticisms were right on target." These metaphors, Lakoff says, are created by clusters of neuronal connections that can be triggered without conscious awareness. As another example, Lakoff explains George W. Bush's political resurgence after his bouts with alcohol and business failure as due to a "redemption narrative" – a deep, subconscious metaphor triggered in voters who warmed to him because they saw him as someone saved by his faith and now worthy of empathy and trust.

Fact Finder

Why was the "Willie Horton" ad in the 1988 presidential election so damaging to the campaign of Democrat Michael Dukakis?

So What? Now What?

Thinking citizens need to stay alert to these four thinking traps:

  • Misusing System 1: using fast, automatic thinking when slower, more careful thought is needed

  • Untested Mental Models: using a model of “how things work” that does not apply

  • Emotional Contagion: allowing the emotional “elephant” to overwhelm the logical “rider”

  • Subconscious Control: not questioning how the subconscious mind impacts decisions

We can be better political decision makers. Here are a few ways to deal with these traps:

  • Use System 1 and System 2: When confronting a question about a public issue, realize you may have a "gut" response (System 1) that needs checking out (System 2). We're rarely experts on a political topic, so be alert to possible mistakes in your thinking. Ask others, including people who may disagree with you, what they think about your reasoning. Check your thinking against what true experts say (see Question #6).

  • Check Mental Models: If you believe "business cares only about profits," that's a mental model. We cannot NOT have mental models, but we can question them. A study at Wharton found that the mental models many people hold about hurricanes lead them to ignore government advice. For example, they assume the chief concern is riding out the day of the storm, not realizing they may lack power and water for days afterward.

To challenge a mental model in your role as a citizen:

  • Make the Model Explicit:  Describe exactly what you believe.

  • Acknowledge It May Be Wrong and Needs Testing.

  • Test by Asking Questions:

o   What facts support it? What facts might contradict it? Where can I look to verify or refute it?

o   What assumptions does it make?  How can I test them?

o   Which of my values and beliefs might blind me to its inaccuracy? For example, are my partisan loyalties making me reluctant to change it?

  • Interview People Who Don’t Share Your Mental Model.  Don’t argue; just listen to their values, beliefs, assumptions and facts.

The late Harvard professor Chris Argyris developed a way to think about how mental models trap us. We act based on our beliefs (mental models), which sit near the top of his "Ladder of Inference." Our beliefs rest on conclusions we draw from assumptions we make, which in turn rest on meanings we derive from data we collect. But the data we pay attention to at the bottom of the Ladder are themselves influenced by our values and beliefs. That's what leads people to use only sources that conform to their political ideology and ignore those that do not. So we get trapped by mental blinders. Argyris suggests we go back down the ladder, look for data we've missed, and consider how it might challenge our assumptions, conclusions and beliefs – and possibly argue for different actions.

The Ladder of Inference

(Credit: design.ncsu.edu)
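As a rough illustration only – this is not Argyris's own formulation – the ladder can be sketched as a short checklist in Python. The rung labels paraphrase the paragraph above; the example belief and the questions attached to each rung are hypothetical prompts.

```python
# A rough sketch of the Ladder of Inference as a checklist.
# Rung labels paraphrase the text above; the questions and the example
# belief are hypothetical prompts, not Argyris's own wording.

LADDER = [  # listed bottom rung first
    ("Data I noticed", "What did I pay attention to, and what did I skip?"),
    ("Meanings I added", "Is there another way to read that data?"),
    ("Assumptions I made", "How could I test these assumptions?"),
    ("Conclusions I drew", "Do the conclusions really follow?"),
    ("Beliefs I hold (my mental model)", "Which sources shaped this belief?"),
    ("Actions I take", "Would different data argue for different actions?"),
]

def walk_back_down(belief: str) -> None:
    """Start from a belief and question each rung beneath it, top rung first."""
    print(f"Belief under review: {belief}")
    for rung, question in reversed(LADDER):
        print(f"- {rung}: {question}")

# Hypothetical mental model from the "Check Mental Models" example above.
walk_back_down("Business cares only about profits")
```

Walking back down the rungs this way is simply a checklist version of Argyris's advice: look for the data you've missed before defending the belief at the top.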

  • Balance Reason and Emotion: It's common to hear people with different political views say, "If they would just listen to reason!" In reality, when we think we're being reasonable and others are driven purely by emotion, we're wrong about them and about ourselves. We all use both emotion and reason. A big problem arises when we act solely on our emotions and then use logic to justify it. Recall the trolley study, which found that our first reaction is emotional. Benjamin Franklin understood this problem of rationalization, writing in his Autobiography: "So convenient a thing it is to be a reasonable creature, since it enables one to find or make a reason for everything one has a mind to do."

As University of Virginia psychologist Timothy Wilson puts it about voting: "There's fairly good evidence that people vote from the heart, but if you ask them why they vote they'll come up with all sorts of logical reasons."

In addition to watching out for rationalizations, we can take other steps to balance reason and emotion as thinking citizens:

  • Name Your Emotions: This makes you aware of them and can help you set them aside, at least for a while.  The insistence that “my emotions are not getting into it” shows they are.

  • Quiet the Emotions:  Take a “time out” to lower your stress and make sure you’re not caught by emotional hijacking.

  • Watch Out for Emotional Spirals: Rising and falling emotions consume energy (see Question #3) needed for rational thought.

  • If You Want to Change Someone’s Mind: You have to understand their emotions.  Campaigns to convince people with facts are rarely enough.  We need to tap into emotions too – as visceral, frightening stop-smoking ads demonstrate.  

  • Make the Subconscious Conscious: In a September 2004 experiment, young voters in the Northeast were divided into two groups. The control group was asked whether they planned to vote for John Kerry or George W. Bush for president. They preferred Kerry, 4 to 1. The experimental group was first "primed" with a questionnaire seeking their thoughts on death and their own burial. They chose Bush by a 2 to 1 margin. This and other studies suggest that when people are afraid, they subconsciously lean conservative. These subjects were not aware of the impact of the "death questionnaire." The only antidote is Lakoff's admonition in his book The Political Mind: "You must make unconscious politics conscious . . . When thought is conscious, you can discuss it, question it, try to counter it. When it is unconscious, it has free rein." Making the subconscious conscious, of course, is not easy. One strategy is to invite others to question your decision. In the Bush-Kerry experiment, one might simply ask the "death questionnaire" participants whether they think it is influencing their preference for president.

When we make a political decision, we have taken a stand on a public issue.  Yet we should not feel rushed to do so. In Question #5, we look at why we sometimes decide before we’ve done enough mental work – and how to hold off until we’re ready, despite pressures to “make up your mind.”