Terry Newell

Terry Newell is currently director of his own firm, Leadership for a Responsible Society.  His work focuses on values-based leadership, ethics, and decision making.  A former Air Force officer, Terry also previously served as Director of the Horace Mann Learning Center, the training arm of the U.S. Department of Education, and as Dean of Faculty at the Federal Executive Institute.  Terry is co-editor and author of The Trusted Leader: Building the Relationships That Make Government Work (CQ Press, 2011).  He also wrote Statesmanship, Character and Leadership in America (Palgrave Macmillan, 2013) and To Serve with Honor: Doing the Right Thing in Government (Loftlands Press, 2015).

Think Anew

Question 6: What "Experts" Should I Trust?

Car in Which Milena Del Valle Died

(Credit: rapperport.com)

Angel and Milena Del Valle were driving to Boston’s Logan International Airport on July 10, 2006, when twelve concrete ceiling panels, each weighing nearly two tons, collapsed in a tunnel, killing Milena.  This was a tragic result of a massive, problem-ridden construction project dubbed the “Big Dig.”  Planning began in 1982 as experts from multiple specialties and government authorities sought to resolve downtown Boston’s traffic congestion.  Construction started in 1991 with an estimated completion date of 1998 at a cost of $2.8 billion.  The “Big Dig” was not completed until December 2007, at a cost of $8.8 billion.  A 2008 Boston Globe report concluded that traffic waiting times actually increased for most travelers, as bottlenecks simply moved from the less-congested city to the suburbs.  Clearly, experts made some big mistakes.

Yet experts also get it right. Scientists who studied COVID and developed vaccines in record time, engineers who gave us the computer chip, the Internet, and cell phones, and meteorologists whose severe weather forecasts save lives, to cite just a few examples, have improved our lives in many ways.

Fact Finder

Why have weather forecasts gotten better since the very first ones made in the 1860s in England?

We Need Experts But Often Distrust Them

At the founding, Americans celebrated experts.  Benjamin Franklin was an esteemed scientist.  Jefferson founded American paleontology and recorded weather observations for 40 years.  Washington insisted on smallpox inoculation, preserving his Continental Army.  But as democracy spread, respect for expertise did not always keep pace.

The belief in equality led many to assume anyone could do almost anything.  Further, the failures of experts get lots of attention.  Asbestos that caused mesothelioma and pesticides that damaged the environment soured many on science.  Economists missed the coming of the 2008 financial crisis.  Intelligence analysts failed to foresee 9/11’s use of planes as flying bombs.  International trade agreements didn’t deliver enough jobs to the Rust Belt.

Many now scorn experts despite our heavy dependence on them.  This was perhaps inevitable.  As historian Richard Hofstadter noted: “Once the intellectual was gently ridiculed because he was not needed; now he is fiercely resented because he is needed too much.”  Skepticism about experts has also been fed by general feelings of distrust in major institutions, spurred on by radio talk show hosts like the late Rush Limbaugh, who called “government, academia, science, and the media” the “four corners of deceit.”

(Credit: Gallup Poll and statista.com)

Perhaps, as professor Tom Nichols said in The Death of Expertise, Americans just expect too much: “Laypeople cannot expect experts never to be wrong; if they were capable of such accuracy, they wouldn’t need to do research and run experiments in the first place.”

Know How to Judge Who’s an Expert

So how can we spot the expertise we can and should trust? Here are three tests.

Neil deGrasse Tyson

(Credit: Houston.culture.map.com)

  •  Test #1: Education, Experience and Recognition by Peers

Dr. Neil deGrasse Tyson, an astrophysicist and author, studied at Harvard, the University of Texas, and Columbia and did postdoctoral research at Princeton.  He is director of the Hayden Planetarium and spends considerable time communicating with the public through his National Geographic radio show StarTalk.  We expect true experts like Tyson to be highly educated in their field, have wide experience, and be recognized by respected peers.  Of course no expert is perfect, but the best learn from mistakes.  As physicist Werner Heisenberg put it, an expert is “someone who knows some of the worst mistakes that can be made in his subject and how to avoid them.”

  • Test #2: The Expert Testimony Standard

Another approach comes from Rule 702 of the Federal Rules of Evidence. Someone is qualified to give expert trial testimony if they meet Test #1 and their testimony is based on scientific facts gathered by reliable and properly applied methods.  Case law adds that their scientific methods must be generally accepted and subjected to peer review, and that their testimony must be limited to matters based on their research.  They must avoid unjustified leaps from their work and must have accounted for alternatives that could explain their conclusions.  By Tests #1-2, a lot of people who post their “expert” views on social media should be ignored.  They may have opinions – even strong ones – but they are not experts.

  • Test #3: The Good Forecast Test

On some questions, even experts who meet Tests #1 and #2 are wary of making predictions.  They realize they can’t be certain, for example, when the next recession will start, when the next pandemic will strike, or whether Russia will soon invade another of its neighbors.  Instead, they may make forecasts: well-informed ideas about what the future holds.  Expert forecasts are usually couched in probabilities, such as “there’s a 70 percent chance of afternoon showers.”  President Obama, as another example, would not give the go-ahead for the raid that found Osama Bin Laden until his advisors told him there was much greater than a 50 percent chance that they had found the terrorist.  Also, keep in mind that just because someone made a prediction that turned out to be correct does not mean all their future predictions will be.

In Superforecasting, Philip Tetlock and Dan Gardner report on how good forecasters are made.  Those in their Good Judgment Project for the government’s intelligence community even outperformed professional intelligence analysts who had access to classified information.  When given a question (e.g., What will the price of oil be next year?), they used several techniques to develop a forecast.  They:

  • Searched for many ideas on the question from people and sources with very different perspectives,

  • Aggregated information from these sources,

  • Synthesized what they learned into a point of view, with a probability attached,

  • Assumed their point of view was wrong and forced themselves to come up with others,

  • Formed a “miniculture” of people who would respectfully critique their work and invited them to do so, often regularly, and

  • Got clear and timely feedback on how they performed, knowing they couldn’t improve next time if they didn’t know how they did (one common scoring method is sketched below).
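
That last step needs a yardstick.  Superforecasting describes grading forecasters with the Brier score, which averages the squared gap between each stated probability and what actually happened.  Here is a minimal sketch in Python; the track record shown is hypothetical, invented for illustration:

```python
# Brier score: average squared gap between stated probability and outcome.
# 0.0 is perfect foresight; always hedging at 50% scores 0.25, so a
# forecaster must beat 0.25 to add real value.
def brier_score(forecasts):
    """forecasts: list of (stated_probability, outcome) pairs, outcome 1 or 0."""
    return sum((p - outcome) ** 2 for p, outcome in forecasts) / len(forecasts)

# Hypothetical track record: said 70% and it happened, said 90% and it
# didn't, and so on.
record = [(0.7, 1), (0.9, 0), (0.6, 1), (0.2, 0)]
print(f"Brier score: {brier_score(record):.3f}")  # prints 0.275
```

Lower is better, and confident misses (the 90 percent call that failed) are punished hardest - exactly the discipline the feedback step is meant to instill.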

Note that experts and expert forecasters apply System 2 thinking (see Question #4).  They’re also more comfortable with ambiguity and less likely to fall prey to information bias and the illusion of knowledge (see Question #5). 

So, when you see a story with a headline like “Jones Concludes the Dow Will Plummet 65% by Next Year!” ask yourself whether this person meets any of these three tests of expertise. When you feel you’re an expert on a public issue, ask yourself how you developed that expertise. The three tests argue for being more humble about whom we listen to – and how sure we are that we’re right.

Let’s Talk

These two people have a large following.  Do they qualify as experts?

a.  MSNBC’s Rachel Maddow

b. FOX News’s Tucker Carlson

Avoid Thinking Mistakes When Relying on “Experts”

Mad Money’s Jim Cramer

(Credit: Tulane University)

CNBC’s popular “stock tip” show, Mad Money, has made Jim Cramer a rock star. A lawyer and former reporter and hedge fund manager, Cramer touts his ability to help viewers invest wisely.  Yet a study of his stock tips is less flattering.  Over a four-year period, stocks he recommended opened 2.4 percent higher the next day (perhaps because people rushed to buy them), but those who held them for 50 days lost on average 10 percent, and those who held them more than 50 days lost on average nearly 30 percent.

Presumed experts often don’t live up to their hype. In another Tetlock study, 284 experts on political and economic trends made more than 82,000 predictions, which he then compared to what actually happened.  His conclusion: “the average expert was roughly as accurate as a dart-throwing chimpanzee.”  Those with Ph.D.s did no better than those without.

Presumed experts, like all of us, may fall prey to thinking traps.  If we know what they are and what causes us to sometimes trust “expertise” too much, we are in a better position to use expertise wisely.

  • Overconfidence

Boston’s Big Dig “experts” were way too confident, and those who approved the project bought into this overconfidence.  One reason: overconfidence “sells.”  Just ask Jim Cramer.  Few people listen to someone who isn’t confident.  Research suggests that overconfident people are rated higher and under-confident people are rated lower, regardless of their actual capabilities.  “Experts” themselves may be unaware of this because overconfidence feeds the ego.  We (and they) may also overestimate their skill and underestimate the importance of luck in their success.  So before accepting a very confident person as an “expert,” do some searching online to find out their track record (see other suggestions in the “So What? Now What?” section below).

  • The Power of Authority Figures

We tend to give extra weight to “authorities.” Cable news and talk shows are rife with “experts” who opine (not always with expertise!) on all kinds of public issues.  The more animated they are - and sometimes the more their advice fits with viewers’ political ideology - the more they get invited back and the higher their audience appeal.

Elected officials have a built-in believability quotient - at least among their followers.  At the start of the COVID-19 pandemic, President Trump called the anti-malarial drug hydroxychloroquine a “game changer,” urging its use. He based this on anecdotal evidence not subjected to rigorous scientific study.  His recommendation spurred a run on the drug, which research eventually concluded is ineffective against the virus.

This danger of succumbing to authority shows up even when the “authority” is an Internet search.  Physicians bemoan patients who think they are expert diagnosticians because “I researched it on the Internet.” 

Be wary as well of purportedly “expert” websites.  Years ago, AskMe.com offered users the chance to ask questions and get free advice.  Marcus Arnold began dispensing legal advice on the site and was eventually rated tenth by users among the 150 people doing so.  When people began asking for his contact information and fee structure, Arnold revealed that he was a 15-year-old high school student with no legal training.

  • The Allure of Stories

Stories often carry more weight than isolated facts.  Stories add context, suggest what facts mean, and evoke emotions.  In an experiment, one group heard epidemiologist Dr. Anthony Fauci share the science showing that the MMR (measles, mumps, rubella) vaccine doesn’t cause autism.  Another group first saw a video of a mother who claimed the vaccine gave her child a severe rash - and then saw Fauci.  Those who saw only Fauci had a stronger positive view of vaccination than those who also saw the video.  Stories can be convincing, but they don’t always rest on expertise.

Let’s Talk

How do you decide whether to believe experts on topics such as the economy or health care? Why do people disagree so much on such topics?

Knowing How Scientists Work Should Guide How We Use Scientific Expertise

Bloodletting

(Credit: The Burns Archive, Wikimedia.com)

In December 1799, George Washington rode his Mt. Vernon fields despite wintry weather.  His throat soon began to close, causing breathing difficulties.  His attending physician drew 16 cups of blood from his veins in a 10-hour period, a treatment that probably hastened his death.  Bloodletting began in ancient times and continued well into the nineteenth century in the belief that it could cure or prevent diseases as varied as cancer, smallpox, diabetes, and even “heart-sickness.” Since the germ theory of disease was not accepted until the last half of the nineteenth century, support for bloodletting relied on stories of its “success.”  The fact that a sick person got better after bloodletting was considered proof.  The evidence that many people got worse or died was attributed to something else.

Bloodletting like Washington’s is quack science.  This story demonstrates a few of the most important things we must understand as consumers of scientific expertise.

  • Anecdotes are Not Science.  In the old American West, traveling pitchmen peddled “snake oil” to cure disease.  A hired “shill” in the audience testified that it worked.  After selling to gullible customers, the snake oil salesman left town before customers found out that advertising, not science, was all they purchased.   When a teacher claims her new method of teaching math is an astounding success, that’s an anecdote.  When others in her school also swear by the new approach, that’s a collection of anecdotes, but neither claim is science.

  • Science Works Through Testable Hypotheses. Science requires a proposal - a hypothesis - that can be tested and found likely true or not via an experiment. The teacher’s claim about the new math approach can be tested.  To be credible, such an experiment must have two groups.  One gets the “treatment” (the new math method) and the other (the “control” group) does not.  Then results for the two groups are compared.  Controlled experiments come in many types, but the core point is that the hypothesis is only confirmed through careful procedures yielding statistically valid and reliable evidence.
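
To make the comparison concrete, here is a minimal sketch in Python of how such a two-group experiment might be analyzed.  The scores are invented for illustration, and the two-sample t-test is just one common choice among several methods:

```python
# A minimal sketch of a two-group comparison using made-up test scores.
# The treatment group used the new math method; the control group did not.
from scipy import stats

treatment = [78, 85, 91, 73, 88, 95, 82, 79, 90, 84]  # hypothetical scores
control = [72, 80, 75, 70, 83, 77, 74, 69, 81, 76]

# The t-test asks: if the method made no difference, how surprising
# would a gap this large between the group averages be?
t_stat, p_value = stats.ttest_ind(treatment, control)
print(f"t = {t_stat:.2f}, p = {p_value:.3f}")

# A small p-value (conventionally below 0.05) counts as evidence for
# the hypothesis - never as final proof of it.
```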

  • Experiments Need Large Sample Sizes.  When “8 out of 10 doctors recommend ToothSaver toothpaste…” it’s possible only 10 were asked!  In a small sample, a few odd results can occur that would carry much less weight with a larger sample. If testing the new math teaching approach with 100 students in one school seems promising, it needs to be expanded to many more schools and students.
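
A quick simulation shows how easily a tiny sample produces that slogan by chance.  Assume, hypothetically, that doctors are actually split 50/50 on the toothpaste:

```python
# Toy simulation: if doctors were truly split 50/50 on the toothpaste,
# how often would a survey of only 10 still report "8 or more of 10"?
import random

random.seed(42)
trials = 100_000
hits = sum(
    sum(random.random() < 0.5 for _ in range(10)) >= 8  # poll 10 "doctors"
    for _ in range(trials)
)
print(f"'8+ of 10 recommend' arises by chance in {hits / trials:.1%} of surveys")
# Roughly 5-6% of ten-doctor surveys hit the slogan purely by luck; poll
# 1,000 doctors and a true 50/50 split essentially never yields 80%.
```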

  • Correlation is Not Causation.  It’s tempting to conclude that improved student test scores were caused by the new teaching approach.  But just because one thing happens along with another (correlation) does not prove that one caused the other.  Roosters crow in the morning, but that doesn’t cause people to eat breakfast. Scientists won’t claim causation until they can rule out all other explanations.  Were the teachers chosen for the experiment already the best in the schools?  Were the students in the treatment and control groups similar, or were there significant differences in their math abilities, sex, race, or home environments?  Could the attention given the students itself have made them try harder?  When scientists are reluctant to say “X causes Y,” we often complain: “Why can’t they just make a decision?”  We should appreciate their carefulness.
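
A toy simulation makes the trap visible.  In the hypothetical data below, a hidden factor (call it home environment) drives both extra study time and test scores, so the two end up strongly correlated even though, by construction, neither causes the other:

```python
# Correlation without causation: a hidden confounder drives both variables.
import random

def correlation(xs, ys):
    """Pearson correlation coefficient, computed from scratch."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    vx = sum((x - mx) ** 2 for x in xs)
    vy = sum((y - my) ** 2 for y in ys)
    return cov / (vx ** 0.5 * vy ** 0.5)

random.seed(7)
home_env = [random.gauss(0, 1) for _ in range(1000)]      # hidden confounder
study_time = [h + random.gauss(0, 1) for h in home_env]   # driven by home_env
scores = [h + random.gauss(0, 1) for h in home_env]       # also driven by home_env

print(f"correlation(study_time, scores) = {correlation(study_time, scores):.2f}")
# Prints a strong positive correlation (about 0.5) even though study_time
# never appears in the formula for scores - the confounder links them.
```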

 “the results of a single experiment should not be taken as proof of anything.  Unfortunately, the proselytes of anti-science ideas are constantly making this error.  Thus, if 50 experiments contradict their beliefs and a single experiment seems consistent with them, certain individuals will seize upon that one experiment and broadcast it.”

- Sara Gorman and Jack Gorman, Denying to the Grave

Fact Finder

Does science support these beliefs?

a.     Flu shots can give you the flu

b.     Sugar makes kids hyperactive

  • Experiments Help Build a Theory but Rarely Find “The Truth.”  The experiment conducted by one researcher must be repeatable, with the same results, by other researchers for its findings to be considered promising. A theory of how things work to produce certain outcomes may result from a number of such experiments, but it too must be tested by even more experiments.  This can take years. Even then, scientists will be reluctant to declare they’ve found “truth” because contrary evidence may later emerge.  This is why you find scientists saying things like “there is a 95 percent probability that this new math teaching approach is associated with improved test scores.” We may want certainty, but science is comfortable with uncertainty. Be wary of “experts” who are too certain.

  • Science Thrives on Criticism.  Politicians and the public often jump on the fact that scientists disagree with each other.  In 1989, two researchers claimed they produced energy through “cold fusion.” Since cold nuclear fusion of atoms could generate a lot more energy at far less cost than the “hot” atomic fission in nuclear reactors, this seemed almost too good to be true.  The popular press spread the story.  But the experiment was greeted skeptically by other researchers, and the claim was later disproved. The general public may see this as proof you can’t trust science, but such critical review is how knowledge advances.  This is why the report of a new math teaching approach will not get into an educational journal without peer review by other experts who first carefully scrutinize the experiment.  

 “The greatest asset of the scientific tradition, after all, is its persistent self-scrutiny and skepticism.  Embracing uncertainty . . . is the defining feature of science.  It is not a defect.”

Jamie Holmes, Nonsense: The Power of Not Knowing

So What? Now What?

Thinking citizens who want to use expertise face several thinking traps:

  • The Illusion of Expertise: the predictions of many so-called “experts” are no better than chance.

  • The Power of Authority: we tend to accept what “authorities” say though they may lack true expertise.

  • The Allure of Stories: good stories generate strong emotions.  If we’re not careful they can be taken as truth even when not based on expert knowledge.

  • Overconfidence: we are swayed by how confident, not necessarily how capable, someone is.

  • The “Science Supports Me” Trap: we may accept seemingly scientific claims even when they have not been subjected to rigorous scientific procedures.

 Some action steps to find and use expert knowledge include: 

  • Be a Critical Thinker. Neurobiologist Jan Engelmann scanned the brains of two groups of subjects making financial decisions.  One group was given expert advice; the other was not. Each subject was then asked to make a decision. Those given advice conformed to what the experts recommended, and activity in the part of their brains associated with critical thinking diminished.  They off-loaded the decision making to the experts.  As citizens, we’re flooded with “expert” advice.  Given our busy lives and inability to know what “experts” know, we’re prone to let others decide for us.  In a democracy, if we too easily let others tell us what to think, we turn our votes over to them.  So:

 o   When you hear “evidence” and stories meant to persuade you, ask: (a) are these people really experts (see Tests #1-3)? (b) do they have a political ax to grind? (c) what do their critics say?

 o   Get an “outside” view.  Those responsible for the “Big Dig” might have benefited from talking to other cities that mounted massive construction projects, learning that most took longer and cost more than projected.  The typical “experts,” especially on partisan political issues, are talking to like-minded people.  Their claims are often based on “inside information.”  So go outside that circle to find other points of view.  Get perspectives from those with different experiences and from sources with no political ideology guiding them.  Find a “devil’s advocate” - someone who takes a contrarian view of popular wisdom - but make sure they have some expertise to do so.

 o   Be careful about the media’s principle of “fair balance.” Many news outlets pride themselves on reflecting all points of view on an issue. In some cases, this can warp the reality of what true experts think.  If a TV station interviews a climate scientist on global warming, it may feel the need to interview a global warming denier.  Such “fair balance” may give the impression that both sides have an equal claim to scientific expertise, even if they do not.

  •  Find Out if an “Expert” Really Is One - or Find a True Expert.

 o   Put the “expert’s” name in a search engine and/or Wikipedia.org and review their education, experience, publications and track record.

 o   Put the topic on which you need expertise in a search engine, but be aware that the names/organizations that appear first in search results are often those who’ve paid to be listed first.  You can also go to Google Scholar, which focuses on academic research, to see who is publishing on the topic.

o   Search the professional association(s) in a field to find an expert and/or see if a purported expert is a member, has published in their journal(s) or spoken at their conferences.

 o   Find out who funds the expert’s salary/research. Is it a for-profit, non-profit, government or politically aligned organization?  Does that organization have a commercial or ideological “ax” to grind? 

 o   Expertise is not necessarily transferable, so if someone is an expert on one issue, you still need to assess their level of expertise on a different topic.  Neil deGrasse Tyson is an expert on cosmology, but that doesn’t make him one on climate change.

 o   Check out an “expert’s” recent posts/claims using fact-check websites, such as www.factcheck.org and www.snopes.com.

 “Americans find scientific information easily if they want it.  What they can’t get so easily is the desire to want it.  What they – and we – need to remember is that we can deal with a changing world more effectively when we learn how to think like the best scientists.”

- Eric Liu, Become America

  • Think Like a Scientist.  In a Pew Research poll on understanding science, only 40 percent could identify the need for a control group to test a new drug.  Without greater scientific literacy, we are prey to poor science and falsified or misleading information.  Before concluding that a purported scientific fact is one you should accept, ask:

 o   Is this just an interesting anecdote and nothing more?

o   Was there a scientifically testable hypothesis?

o   Is there evidence from experiments with control groups?

o   Were there multiple, confirming experiments with sufficiently large samples?

o   Were assertions/experiments subjected to peer review/criticism?

 While the “expertise” you’re being offered may not meet all these tests, the more of them it fails, the more skeptical you should be.

 There are some organizations, in addition to fact-checking sites, that can help, such as SciMoms - a non-profit group of mothers who are scientists organized to help people “make evidence-based decisions and to follow facts, not fear” on environmental and health issues, especially ones that concern parents.

  • Practice Intellectual Curiosity.  Gen. David Petraeus, when Commander of U.S. Central Command, liked sending officers to top universities not just to learn but for the experience of being around other very smart people who thought differently than they did.  Nobel laureate Daniel Kahneman warns about the WYSIATI trap - “What You See Is All There Is” - the tendency to assume the information you have is sufficient.  Given a “headline grabber” report of how raising the minimum wage did not lead to job loss in “City A,” don’t assume that’s all you need and stop your search for information.  There’s more to look at.

 Intellectual curiosity also means restraining your ideological bias.  A study at Ohio State University found that conservatives distrusted scientific information more than liberals did when it questioned their political views about climate change and evolution.  Liberals distrusted science more when it disagreed with their views on fracking and nuclear power.  In both groups, views on neutral topics like geology or astronomy did not lead to questioning science.  When citizens rely on ideology more than expert knowledge, the political majority may easily overrule facts.  That’s not what America’s founders intended, nor should it be the goal of thinking citizens.

Expertise comes in handy when exploring public issues, especially as many of them are complex.  Question #7 focuses on things to consider when facing a public issue and some ways to make sure your thinking is open to all the considerations such issues require.