Core Idea
Somewhere between the science laboratory and the evening news, precision becomes approximation. The sharp edges of uncertainty get filed down into smooth talking points.
What begins as tentative findings in academic papers transforms into unqualified certainties in press releases, then into simplified slogans in advocacy campaigns, and finally into received wisdom that no one questions anymore.
We can call this system drift.
Ideas travel through institutions, media, and public discourse like Chinese whispers, losing nuance at each step.
The original conditions, caveats, and context that gave a claim its validity disappear, leaving behind a hollow shell of authority that sounds convincing but stands on nothing solid.
Counterpoint
The comfortable assumption is that mainstream institutions act as quality filters. Universities vet research, journalists fact-check claims, government agencies base policy on evidence.
So, when something becomes widely accepted, it must be because the system worked. The checks and balances held and the cream rose to the top.
The reality is messier.
Institutions have incentives that bend toward consensus rather than accuracy. Academic careers depend on publishing, not on being right. Media outlets need stories that grab attention, not stories that capture complexity. Advocacy groups require simple narratives that motivate action, not complicated truths that inspire paralysis.
Each step in the information chain adds its own distortions.
The peer reviewer who waves through familiar conclusions. The science journalist who cuts the hedging to meet word count. The policy advisor who cherry-picks studies that support predetermined positions. The public intellectual who packages uncertainty into confident predictions.
System drift happens because humans prefer simple clarity to complexity we struggle to understand. None of us likes confusion.
We want to know what to think, not what to think about. So we sand away the rough edges of doubt until we’re left with polished certainties that feel reassuring but may bear little resemblance to what the evidence actually supports.
The most dangerous drifted ideas are those that feel obviously true. They carry the weight of institutional authority without the burden of institutional rigour.
Thought Challenge
Trace the decay... Pick a claim you’ve heard repeated recently that sounds authoritative. Follow it backwards through the citation chain. Find the original research. What qualifications did the authors include? What limitations did they acknowledge? How many degrees of separation exist between their careful conclusions and the confident assertions you encounter in popular discourse?
Test the foundations... Choose a policy position you support or oppose. Map out the evidence base that supposedly justifies it. How much of that evidence consists of other people’s interpretations rather than primary sources? How many of the key claims trace back to the same small set of studies? What happens to your confidence when you strip away the accumulated authority and look at the raw materials?
Both exercises reveal how much of what we consider settled knowledge rests on surprisingly thin foundations. They train the sceptical instinct to look beneath the surface of consensus, to follow the breadcrumb trail back to where it actually leads.
Closing Reflection
Humans need simple stories to navigate complex realities.
Institutions need clear messages to justify their existence.
Markets reward certainty over accuracy.
A mindful sceptic’s task is not to rage against this machinery but to account for it. When everyone agrees about something, that agreement itself becomes a fact worth investigating. Not because consensus is always wrong, but because the process that creates consensus often has little to do with truth.
The most dangerous sentence in any field is “Everyone knows that.”
It usually means someone stopped checking.
Evidence Support
Ioannidis, J. P. A. (2005). Why most published research findings are false. PLoS Medicine, 2(8), e124.
TL;DR… most published research claims are likely to be false due to biases, flexibility in study design, publication pressures, and selective reporting. It demonstrates that consensus and “received wisdom” often rely on findings that may not replicate or stand up to re-analysis.
Relevance to insight… foundational for the concept of system drift: popular narratives in science become detached from original logic and evidence as unreliable findings are repeated and cemented into “truth” by institutional and media endorsement, regardless of their validity.
Greenhalgh, T., Snow, G. L., Ryan, S., Rees, S., & Salisbury, H. (2015). Six biases against patients and carers in evidence-based medicine. BMC Medicine, 13, 200.
TL;DR… systemic biases that emerge as evidence-based medicine moves from theory to practice, showing how simplified concepts become institutional dogma that overlooks diverse voices and contexts.
Relevance to insight… how institutionalisation and drift occur as foundational logic is lost amid popularisation and advocacy, reinforcing the importance of continual scepticism and reassessment.
Lazer, D., et al. (2018). The science of fake news. Science, 359(6380), 1094–1096.
TL;DR… misinformation amplifies and mutates as it moves through social networks, often starting from minor misinterpretations that ultimately become dominant, widely believed narratives.
Relevance to insight… how logic decays and illusion solidifies as narratives gain popularity and repeat exposure, untethered from original empirical reality.
Lewandowsky, S., Ecker, U. K. H., & Cook, J. (2017). Beyond misinformation: Understanding and coping with the “post-truth” era. Journal of Applied Research in Memory and Cognition, 6(4), 353–369.
TL;DR… psychological and social mechanisms by which originally careful, nuanced statements are reworked and simplified, especially via social media, culminating in widespread post-truth narratives.
Relevance to insight… system drift is systemic and continual—not a single error, but the outcome of persistent oversimplification as complex evidence is molded into comforting stories for mass consumption.
Oreskes, N. (2004). The scientific consensus on climate change. Science, 306(5702), 1686.
TL;DR… how scientific consensus forms and sometimes calcifies, analysing how repeated claims and policy messaging can become decoupled from foundational evidence and ongoing new research.
Relevance to insight… captures both the legitimate value and the risk of consensus narratives—demonstrating the need to interrogate surface-level beliefs and return to the primary sources beneath the public storyline.
Together, these papers show that drift is real, dangerous, and pervasive. Mainstream beliefs and popular narratives can detach from their logical and evidential foundations, which justifies ongoing scepticism in important decision-making.