The Erosion of Trust in Science
Understanding and Addressing the Growing Divide Between Experts and the Public
Humans are amazing. We can land robots on Mars, edit genes to cure diseases and carry supercomputers in our pockets, yet we increasingly doubt the people who made these marvels possible. Welcome to a true paradox of our times.
In this issue of Mindful Sceptic, we explore why public trust in science and scientists is declining in an era of unprecedented scientific achievement.
We'll examine the multifaceted nature of this issue, from researchers' cautious communication to the cognitive dissonance we all experience. Along the way, we'll challenge assumptions and biases.
This exploration isn't just academic – it's crucial. The relationship between science and society has never been more critical in a world grappling with climate change, pandemics, and rapid technological advancement. How we navigate this relationship will shape our collective future.
Get ready to question, reflect, and perhaps discover new perspectives on science's vital role in our complex world.
Trust in science
Some politicians and the media pushed the impression that science is not to be trusted, and it gained traction until COVID-19 came along. Then, politicians clamoured to share the stage with health scientists, who could reassure the public and bear the brunt of public wrath if recommendations went wrong. Briefly, there was confidence in medical scientists, but the broader trend is that science and scientists are increasingly distrusted.
A recent report by the Pew Research Center in the US looked at data from surveys of around 1,500 people representative of the general public in each of 20 different countries with mature or emerging economies across Europe, the Americas, and the Asia-Pacific, including Australia.
In all the surveyed countries, trust in business leaders (9% believed they act in the public interest), the news media (12%), and the national government (13%) was horrid. Few of us trust the people who run our pivotal social institutions.
More people are confident in scientists. At least a third of those surveyed trusted scientists to act in the public interest, about the same proportion that said they trusted the military. But that is still only one in three; a majority of people do not trust them.
Interestingly, the same survey data suggests that two-thirds of people said it was better to rely on individuals with practical experience to solve pressing problems, while only one-third considered experts the best solution.
In public opinion, the practical is favoured over the intellectual.
The logic of following advice from the ‘been there, done that’ person makes sense if you are, for example, trekking through the jungle. However, if the decision is about whether or not to cut the jungle trees down for timber and use the land for agriculture, then jungle survival is not the only expertise needed.
Forestry, agriculture, soil science, policy and planning, finance, cultural heritage, social development, biodiversity values, conservation… Many experts would have valuable input to a considered, evidence-based decision about the values and relative merits of converting a rainforest to a human enterprise.
I am not sure what is more disturbing, that only one in ten people trust their government to do the right thing or that two out of three prefer the tradesman to the academic.
There were some contradictory opinions, too.
Confidence in science is good
Most people in all 20 countries (82% on average) thought government investments in scientific research were important to the country. A little over half agreed that their country needs to be a world leader in scientific achievements.
So science is good!
But recall that across all the countries, only a third thought scientists ‘do what is right for the public’. Trust in science, perhaps, but not a great deal of confidence in scientists.
This is all very interesting as a precursor to how modern societies view, and are likely to use, the evidence that science generates and how they view the messengers, the members of the scientific community. It suggests that while most people think science is important and worth investing in, they are unsure where that science should come from and perceive that scientists do not solve problems.
Love the process; don’t believe the messengers.
This is a whole rabbit warren of holes to go down in search of the erosion of trust in science and institutions and what to do about it. For a mindful sceptic, science is critical to generate evidence, so it helps to know why.
Why don’t we put trust in science or trust in scientists?
The common reasons people give for their lack of trust in science are:
Scientists are fence-sitters and can’t make a decision.
Science takes way too long.
Normative (opinionated) science is common.
Scientists deliver too many inconvenient truths.
Cognitive dissonance on the part of the listener.
Fence sitters
Scientists often appear to sit on the fence, or at least hedge, because they are sceptics. They value evidence-based conclusions and avoid making absolute statements without substantial empirical support. This cautious approach ensures scientific integrity by embracing uncertainty and encourages continual questioning and testing of hypotheses. You can’t be a good scientist without these core elements of the scientific method.
The natural world is complex, and there are limits to human knowledge, so when a problem presents itself without evidence, the scientist has to generate that evidence. Sweeping statements made before the evidence is in can make a mess of inference. And that complexity means it might not be possible to obtain definitive proof. For example, scientists can’t experiment on climate change because there are no replicates; we only have counterfactuals.
Add to this an ever-evolving knowledge base with new data, advancements in technology, and refined methodologies that consistently challenge established beliefs, and any smart scientist adopts a nuanced perspective rather than staunchly defends a singular view.
Scientists are not "fence-sitters," obfuscating their way out of making definitive statements. They are reticent because they have a commitment to accuracy, continual exploration, and the embrace of uncertainty inherent in the pursuit of scientific truth. It is no surprise that they pause before they make a decision and, when the evidence is lacking, choose not to make a decision at all.
Less trust in scientists because science takes too long
Science follows the scientific method: multiple stages of hypothesis formulation, experimentation, data analysis, interpretation, peer review, and validation. Each phase demands precision, repetition, and verification to ensure the reliability and validity of findings.
The peer review process, where research undergoes rigorous scrutiny by experts in the field before it is accepted for publication or wider acceptance, takes many months. However, it is a crucial step to ensure scientific integrity and minimise errors or biases.
Research in fields such as medicine, astrophysics, ecology and climate science, for instance, involves intricate experiments, large-scale data analysis, and long-term observations, all of which extend the timeline for significant discoveries.
Many of the smart questions for science are interdisciplinary and require time-intensive collaborations and specialized expertise. Building the networks and trust across the science tribes takes more than a couple of tweets or a pint in the pub. Even in my discipline of ecology, the systems guys don’t always get on with the organism-centred researchers.
Scientific progress relies on building knowledge that gains consensus and validation among the scientific community. Achieving consensus on groundbreaking ideas or theories requires extended debates, additional experiments, and multiple rounds of review.
The whole process has to get along on the smell of an oily rag because there is always a shortage of funds for all but the most commercial research questions.
In short, science does take too long because it has to.
Normative science
In some scientific fields, there's a strong emphasis on providing information in a neutral manner, detached from policy implications. An ecologist describing an ecosystem or nutrient exchange between soil and plant roots presents the information without directly advocating for particular courses of action. Instead, they lay the groundwork for decisions by presenting objective data and analyses.
Objectivity is hard when scientific findings directly intersect with societal, ethical, or political issues. Disciplines like public health, environmental policy, or economics often grapple with normative elements. Normative science refers to applying scientific knowledge or findings in a way that suggests a particular course of action, usually driven by societal values, policies, or ethical considerations. Saving the koala, let’s say.
Scientists changing from messengers and interpreters to advocates can quickly erode trust.
Inconvenient truths promote distrust
An "inconvenient truth" typically refers to a fact or reality that challenges established beliefs, societal norms, or commonly held perceptions, especially concerning contentious or sensitive issues. Humanity cannot reverse climate change, the koala does not need saving, cholesterol is essential for brain function, and other facts contradicting conventional wisdom are how science progresses.
This is why this newsletter has Reality Check issues.
Scientists are trained to be curious and answer questions, and they don’t get far in their profession if all their questions already have accepted answers. Most are motivated to find something new and maybe win a Nobel prize. Inevitably, this pursuit makes waves when it goes against conventional wisdom or spotlights tightly held values.
Cognitive dissonance
It’s a thing.
We all hold inconsistencies between our thoughts, beliefs, and actions, and psychologists call the resulting discomfort or inner conflict cognitive dissonance. Everyone suffers from it, mildly or severely, when holding conflicting beliefs, attitudes, or behaviours simultaneously.
We try to reconcile the conflict by bending our beliefs, justifying our actions, or seeking out information that aligns with what we already think. After I painted ‘Just stop oil’ on my placard for Saturday’s march, I jumped online to book a flight for a holiday in Bali. However, that is okay because the Balinese need the tourists to recover from the economic shock of the COVID-19 pandemic.
When scientific evidence goes against a belief, it can trigger dissonance. We might choose not to believe it and justify that choice by saying we don’t trust the evidence or the evidence maker.
A decline in public trust in scientists?
There are plenty of reasons why a significant majority of people trust neither the scientific message nor the messenger, reasons that explain both the survey results and the impression pushed by the media. But none of them makes the distrust right.
If we value our well-being, we must trust the evidence, however inconvenient. Then, if we choose to smoke cigarettes, eat the extra slice of cake and skip the daily exercise, we can put it down to cognitive dissonance and suffer the consequences.
More challenging is collective well-being.
Dissonance, in the aggregate, is a high-risk strategy because 8 billion humans are using up resources in pursuit of well-being on a finite planet with limits—another inconvenient truth.
Scientists dedicate their careers to rigorously investigating, analysing data and seeking empirical evidence to expand our understanding of the world. Most of them are well-trained and committed to the scientific method, with its foundations of transparency and evidence-based conclusions. And, sure, they are humans susceptible to biases, value judgements, or occasional errors. Nonetheless, the scientific community has robust mechanisms like peer review, replication studies, and openness to criticism that help mitigate human shortcomings.
We should learn to build trust.
And if we find that hard, then at least trust their evidence.
A mindful sceptic knows how to do both. The application of critical thought and curiosity mutes dissonance. The skills of scientific literacy, evaluation and intellectual humility provide the practical tools to understand and use evidence to question well, think smart and thrive.
While many value scientific research and achievements, there is a notable lack of trust in scientists. We cannot let this paradox get in the way, because our most pressing problems rely heavily on scientific insight and innovation.
How do we, as a society, bridge this trust gap?
A mindful sceptic has a unique opportunity, dare I say a responsibility, to lead the way. By embracing intellectual humility, cultivating curiosity, and honing our critical thinking skills, we can model a balanced approach to engaging with scientific information. We can demonstrate how to question respectfully, evaluate evidence objectively, and adjust our views in light of new information.
Remember, trust in science isn't about blind faith. It's about understanding the process, acknowledging the uncertainties, and recognising the immense value that scientific inquiry brings to our lives.
A Mindful Sceptic's Perspective
What does a mindful sceptic think about all of this?
The scientific process is slow, often uncertain, and sometimes delivers uncomfortable truths. Scientists can appear indecisive or fence-sitting when qualifying their statements or expressing uncertainty. However, these are hallmarks of a robust scientific process. Science's careful, measured approach is precisely what makes its findings reliable and trustworthy.
A mindful sceptic is vigilant against their own biases and cognitive dissonance. It's easy to dismiss scientific findings that challenge our worldviews or make us uncomfortable. Mindfulness allows us to examine our beliefs critically and be open to changing them in light of new evidence.
A mindful sceptic recognises the importance of distinguishing between scientific findings and their interpretation or application. Scientists' expertise in their fields is critical, yet the application of scientific knowledge often involves value judgments and policy decisions that go beyond pure science. Here, critical thinking skills are crucial in evaluating proposed actions and their potential consequences.
A mindful sceptic will see an opportunity to bridge the gap between scientific expertise and practical experience. Rather than opposing forces, they complement each other. How might the insights of those with practical experience inform scientific research, and how can scientific findings enhance practical problem-solving?
Key points
Public trust in scientists is a complex issue that reveals some intriguing paradoxes. According to the Pew Research Center survey, about one-third of people trust scientists to act in the public interest. While this is significantly higher than trust in business leaders, news media, and national governments, it's still concerningly low, with scientists trusted by only a minority of the population. This leads us to question: Why, in an age of unprecedented scientific advancement, is trust in scientists not higher?
One surprising finding is the preference for practical experience over expert knowledge. The survey suggests that two-thirds of people believe it's better to rely on individuals with practical experience to solve pressing problems, rather than experts. This preference for practicality over intellectual expertise may stem from a disconnect between academic knowledge and real-world application in the public perception. How can we bridge this gap between academic expertise and practical problem-solving in the public mind?
Several factors contribute to the distrust in scientists. These include the perception of scientists as "fence-sitters" who can't make decisive statements, the lengthy time often required for scientific processes, the presence of normative (opinionated) science, scientists delivering inconvenient truths that challenge existing beliefs, and the cognitive dissonance experienced by individuals when scientific findings contradict their worldviews. Each of these factors presents a unique challenge in building public trust. How can scientists address these perceptions while maintaining the integrity of the scientific process?
The concept of normative science adds another layer of complexity to this issue. Normative science refers to the application of scientific knowledge in a way that suggests a particular course of action, often driven by societal values or ethical considerations. While this can make science more relevant to policy and decision-making, it can also be problematic because it blurs the line between objective scientific findings and value-based recommendations, potentially eroding trust in scientists as neutral observers. How can we navigate the delicate balance between scientific objectivity and the need for actionable, value-informed recommendations?
Addressing the issue of declining trust in science and scientists requires a multifaceted approach. This could include improving science communication to make findings more accessible and relatable, encouraging transparency in the scientific process, promoting scientific literacy and critical thinking skills in education, addressing cognitive dissonance by helping people reconcile new information with existing beliefs, and fostering a culture of intellectual humility and openness to new evidence. As mindful sceptics, how can we contribute to this effort and help bridge the gap between scientific knowledge and public understanding?
So, here is a challenge.
In the coming week, identify one scientific topic you've been sceptical about. Approach it with fresh eyes and an open mind. Seek out primary sources, consider multiple perspectives, and engage in respectful dialogue with others with different views.
Reflect on how this exercise changed your understanding. If adopted more widely, how could this approach transform our collective relationship with science and scientists?
Tell us about it in the comments.
In the next issue
Is the foundation of modern environmentalism built on an impossible promise?
Next week, we'll confront the uncomfortable truth about 'sustainable development', a phrase so embedded in global policy that we've stopped questioning its inherent contradiction.
Join us for an evidence-based exploration of why continuous growth on a finite planet might be the most dangerous oxymoron of our time.