The Uncomfortable Truth About Scientific Authority
Why We Trust Science Until It Challenges Our Beliefs
We trust science. Well, most of us do, most of the time.
Somehow, it has trickled down to us that the scientific method's foundational principles of systematic observation, experimentation, and analysis will produce objective evidence… even facts.
And we accept that facts make sense.
We have learned to expect that science, grounded in objectivity, provides a structured way to understand the world through repeatable experiments and empirical data.
We see this objectivity as transparent, not least because findings can be scrutinised by other experts, reducing the likelihood of bias or error.
And we trust this process because it creates a sense of reliability, ensuring that conclusions are based on verified evidence rather than assumptions or beliefs.
Well, for the most part, we accept the logic of facts over opinion.
Moreover, we are told that scientists subject the facts generated by other scientists to further scrutiny. Through the peer-review process, research is examined by other experts in the field who assess its validity, methodology, and adherence to ethical standards.
Experts check each other's work.
Then we see that most of this fact-finding through research gets done in institutions like universities, research centres, and government agencies, which, again, most of the time, adds another layer of credibility. Their longstanding reputations, strict ethical guidelines, and funding oversight reinforce the perception of integrity and trustworthiness in scientific work.
So, we absorb science results with a high degree of confidence.
Yet something curious happens when scientific evidence challenges our deeper beliefs or economic interests.
A dramatic disjunct suddenly appears.
A parent who readily takes their child to the doctor for a broken bone may refuse routine vaccinations, arguing they don’t trust “big pharma” despite relying on the same scientific medical framework for emergency care.
A software developer who uses advanced algorithms and data encryption daily might reject the findings of cybersecurity research on surveillance risks, labelling them as exaggerated fears, even though they work within the same field of science and technology.
A farmer who relies on precision agricultural technology, such as GPS-guided tractors and weather modelling, might reject climate change data that could impact long-term sustainability, even though both technologies are based on scientific predictions and modelling.
An individual who marvels at the live feed from the International Space Station and celebrates space exploration might also endorse flat Earth theories, rejecting the physics and astronomical knowledge that makes space travel possible.
If the evidence jars, we look for ways to ignore it. And there are plenty of ways to do that.
How to question the science
The easiest way to question objectivity is to claim that the scientific method was corrupted by external factors such as corporate funding, political agendas, or institutional pressures.
As soon as we feel that science is being manipulated to serve the interests of powerful entities rather than the public good, it undermines trust in the legitimacy of scientific conclusions. High-profile cases, such as big tobacco downplaying the risks of smoking or ExxonMobil publicly denying the scientific evidence on climate change while its own internal research confirmed the reality of human-caused global warming, further erode this trust, leaving us wary of the motivations behind scientific research.
Then, we know it is easy enough to find conflicting scientific opinions amid the constantly evolving information generated by modern science.
In 1961, when George Clooney and I were born, there were approximately 500 universities worldwide. By 2025, there had been a 3,900% increase to over 20,000 to accommodate the growth in kids needing an education.
Most of these institutions carry out research. The sheer volume of research output makes it harder to keep up with the latest evidence that might contradict what you learned even a few years ago.
Then there are the stuff-ups.
People make mistakes, and scientists are people, so there are plenty of historical cases where scientific authorities have made significant errors or caused harm.
And, of course, not all trained scientists are ethical. There are deliberate obfuscations, too.
Throw in the complexity of scientific topics, which are difficult for the public to understand fully, and something more than an excuse to disbelieve specific evidence emerges: a broader sense of alienation and mistrust towards scientific authority in general.
So, even though we trust scientific authority, if we need to reject it or part of it for any reason, there are plenty of ready excuses.
Pick your adventure.
Why this disconnect?
Well, we like our existing beliefs a lot. We don’t like challenges to them, and we do what we can to justify them and make them real.
Psychologists label our tendency to seek out information that supports our pre-existing beliefs while disregarding evidence that contradicts them as confirmation bias. The flipside is that we quickly reject scientific findings that challenge deeply held views, particularly in areas like politics, identity, or ideology.
There are other biases, too. Motivated reasoning is a neat way of looking at information based on a desired outcome rather than on the merits of the evidence itself. Can’t you see the trend is positive?
And then, we have the various forms of peer pressure and cultural influences where particular scientific stances are held, not based on the quality of the evidence but on the norms and values of the group.
When beliefs become tied to cultural or political identity, it becomes harder to separate science from ideology.
The uncomfortable truth
The uncomfortable truth is that we don’t have to believe what science tells us when push comes to shove.
We can readily accept and use the engineering solutions that science helps produce, but if it also generates objective evidence against a value we hold, it is easily dismissed.
That is what happens.
And we all do it, even the trained scientists.
So, where does this leave us?
How do we navigate a world where scientific authority is simultaneously more crucial and more contested than ever?
The answer lies in embracing both the power and the limitations of scientific evidence.
It means…
Recognising when we're unconsciously accepting or rejecting scientific authority
Understanding the difference between questioning evidence and denying it
Acknowledging the role values play in how we interpret evidence
Staying curious about both the what and the why of scientific claims
Because here's the real uncomfortable truth… scientific authority isn't just about being right.
It's about being thoughtful, curious, and willing to change our minds when the evidence demands it, not when we feel like it.
There is no app for that, but there is a concept.
And a book on how to do it.