Don’t trust the results of a new scientific study? There’s new evidence that a little humility from the scientist can go a long way.
A two-year study, conducted by researchers at the University of Pittsburgh, Vanderbilt University and the University of Vienna in Austria, looked at whether intellectual humility — the practice of admitting the limits of one’s knowledge — influenced whether scientists were deemed trustworthy and how likely people were to believe their findings. The results were published last month in Nature Human Behaviour.
The series of five studies used examples from medical, psychological and climate sciences and surveyed more than 2,000 people across the U.S., spanning a range of ages, genders and political affiliations. Researchers presented results communicated by made-up scientists — “Susan Moore,” “Shanice Banks” and “Robert Wilson” — to gauge how the public would respond.
High intellectual humility sounded like: “Susan Moore is not afraid to admit when she doesn’t know something,” whereas low intellectual humility sounded like “Susan Moore is not afraid to assert what she knows.”
People were more likely to trust scientists who expressed high intellectual humility, and to believe the findings they communicated, than scientists who expressed low intellectual humility.
The scientists repeated a handful of their studies to see if gender or race played a part. They found that neither did.
“This study helps normalize being humble and not knowing everything,” said Matt Motta, assistant professor of health law, policy and management at Boston University's School of Public Health. “It’s a hard position for a scientist to be in. There is a lot of pressure for you to act like you know everything all the time.”
Trust in science has improved since last year but is still lower than before the COVID-19 pandemic, per a separate October report of 9,600 U.S. adults from the Pew Research Center. That report found that 76% of those surveyed trust scientists “a great deal” or “a fair amount,” while 23% say they have little or no confidence in the group. This reflects a decrease of 10 percentage points since January 2019.
The pandemic tested the limits of science communication when knowledge about the virus was changing daily — leaving some citizens feeling whiplashed and confused. That period taught researchers that trust and belief in scientists isn’t just hypothetically important. Whether people view scientists as trustworthy can affect whether they follow public health guidelines — and, in turn, how well something like the spread of a virus is contained.
“[COVID] was a difficult time because there was so much rapidly changing science,” said Jonah Koetke, a graduate student in Pitt’s psychology department and first author on the paper. “And science changes rapidly all the time. But I think in that case, it was so relevant to people’s lives that the rapid changing was especially jarring to people.”
Koetke said that when scientists were transparent about the changing nature of COVID and the limitations of their knowledge, that strategy worked to express intellectual humility and increase trust.
But when he and fellow researchers tested tangible communication approaches on survey participants, they found some aspects of intellectual humility backfired and led to decreased trust.
Trust in the scientist remained when the fictional scientist communicated with personal intellectual humility: being transparent about the limits of their knowledge. But when the scientist communicated limitations of the methods and results of a given study, it backfired for trust and belief in the results.
“We know admitting limitations can be helpful, but we are still figuring out the best ways for scientists to communicate that,” said Koetke.
Those results surprised the researchers, who expected that expressing limitations might garner confidence and trust.
This could be because not everyone is taught the ins and outs of the scientific process, which is iterative and self-correcting. Limitations are a common feature of scientific experiments, which are never 100% perfect, said Koetke.
If students aren’t taught about the scientific method — data from 2019 show that a third of adults don’t see science as iterative, a share that rises among those with less education — they might mistake study limitations for faults.
“People hear ‘limitation’ and think ‘the study must be bad,’” said Motta.
To mitigate misconceptions about the scientific method, Motta suggested pairing discussions about intellectual humility with education about science.
“We want to be focusing on ways scientists can present themselves in intellectually humble ways without doing harm to research,” he said.
Koetke and his colleagues are still learning the best ways to go about that.
“We really don’t know immediately how to communicate, ‘I am an intellectually humble scientist, and this is why you should trust me,’” Koetke said. “Saying that you’re so humble is not a very humble thing to do.”
The team is already working on follow-up research to learn more about intellectual humility and when it can be successful. He said they’re now focusing on data-driven approaches to parse out how real scientists communicate humility in their daily lives, and on identifying which aspects of that communication work to bolster trust.
“Definitely the goal is not to keep this hypothetical in the lab,” said Koetke. “We do want these to be strategies that people and scientists can use out in the real world.”
Motta is interested in seeing the degree to which intellectual humility can change the minds of science skeptics, or what he called “the persuasive middle.”
“More research is necessary to figure out who is more receptive to intellectual humility as a tool of science communication,” he said. “Attitude change is really hard. … It makes it a little bit easier for people to change their mind when a person says, ‘I don’t get things right all the time.’ Maybe this could be the thing that saves us.”
First Published: December 1, 2024, 10:30 a.m.
Updated: December 1, 2024, 9:44 p.m.