The conference room is small. It smells of morning coffee and the occasional whiff of aftershave. I am there to present evidence, as all science advisors should.
I show charts and make the data points gleam, as I try to speak with the clarity of mathematics and the ambiguity of statistics. Another evidence-based policy is being born.
In the room are the policy people from the Department, and they are stuck on how many hectares of native vegetation can be cleared before the review trigger is activated. It’s a new and contentious policy proposal that allows land clearing in return for management of native vegetation offsets.
My presentation concludes with my familiar confidence. If I show them the facts, rational minds will prevail.
The problem is that human brains were not built for spreadsheets. They were built for survival.
And in this room, jobs are on the line.
Core Idea
Evidence-based policy rests on a seductive myth that humans are fundamentally rational creatures who, when presented with compelling data, will adjust their behaviour accordingly.
The uncomfortable truth is that our brains remain wired for immediacy, proximity, and tribal belonging. So, a distant glacier melting registers less urgently than a neighbour’s disapproval. Statistical projections about future pandemics cannot compete with the immediate discomfort of wearing a mask.
Evidence-based policy assumes we have evolved past these instincts. We have not.
Instead, we run with the delusion that presenting better data will overcome biology that took millions of years to embed. I click the pointer onto the next slide.
Counterpoint
The standard policy framework, familiar to everyone in the shabby conference room on the seventh floor, follows a predictable sequence… identify problems, gather evidence, present findings, expect behaviour change.
It works beautifully in theory and consistently fails in practice.
The problem my evidence was supposed to help fix was the delicate politics between farmers, who always want to control and improve the land they farm for production, and conservationists, most of them city-dwellers, who are desperate to prevent any more trees from being ripped out by a chain.
But no evidence can resolve what is, at heart, an argument about values. And in this case, values that are politically aligned. Should the trigger be 40,000 ha, 120,000 ha, or no trigger value at all?
The policy establishment continues to believe that better evidence will eventually win the day. More precise models, clearer presentations, more accessible language. Yet human psychology remains stubbornly unchanged. We respond to stories, not statistics. We react to immediate threats, not distant projections. And we trust tribal messengers over neutral experts.
Evidence-based policy assumes rational actors in a rational system. In reality, it operates within emotional, tribal, biologically constrained decision-making systems that evolved to handle immediate, local challenges. The mismatch is profound and largely ignored.
The trap is complete when policymakers conclude that failure means they need better evidence, rather than better understanding of how evidence actually influences human behaviour.
I said that the trigger value was not critical to the overall policy, which was supposed to get farmers to improve the land in return for some vegetation clearing. Ensuring they made appropriate efforts to manage native vegetation remnants was the key to habitat improvement.
But no.
All that crucial detail was lost in the noise of the review trigger value.
Thought Challenge
Analyse a recent evidence-based campaign that failed to generate widespread behaviour change… Map what it showed versus what people actually felt. Was the evidence compelling? Was it emotionally inert? Did it require people to care about abstract future consequences rather than immediate present realities? Write down the gap between the evidence and the emotional experience it generated.
Experiment with reframing... Take a systemic threat you care about and translate it into terms that would register as immediate and local. Instead of global temperature rise, try neighbourhood flood risk. Instead of species extinction rates, try local food security. Instead of economic projections, try family budget impacts. Test whether the reframed version feels more urgent than the abstract version.
Both exercises reveal how rarely policy design accounts for the emotional and cognitive filters through which evidence must pass before it influences behaviour.
Closing Reflection
Being a mindful sceptic about evidence-based policy means acknowledging that the evidence is often right, but its use is consistently wrong.
Humans do not make decisions like computers processing data. We make decisions like animals assessing threats, opportunities, and social signals. We recognise values, especially those that impact our worldview.
The solution is not to abandon evidence, but to embed it within frameworks that account for how minds actually work. This means designing policies that make distant threats feel immediate, abstract consequences feel personal, and rational choices feel emotionally satisfying.
And it means always recognising that policy is, at bottom, about values.
Evidence-based policy will remain evidence-ignored policy until it learns to speak the language of human psychology, rather than demanding that psychology learn the language of evidence.
Evidence Support
Kahan, D. M., Peters, E., Wittlin, M., Slovic, P., Ouellette, L. L., Braman, D., & Mandel, G. (2012). The polarizing impact of science literacy and numeracy on perceived climate change risks. Nature Climate Change, 2(10), 732-735.
TL;DR… increased scientific literacy and numeracy did not lead to more accurate perceptions of climate change risks; instead, individuals used their quantitative skills to reinforce views that aligned with their cultural group. Science communication that simply presents more data does not overcome the cognitive and tribal filters through which people interpret information, demonstrating the futility of rational data dumps when group identity or emotion is engaged.
Relevance to insight… the “rational actor” model underlying evidence-based policy is systematically undermined by the deep-rooted psychological wiring that guides real-world decision-making.
Weber, E. U. (2006). Experience-based and description-based perceptions of long-term risk: Why global warming does not scare us (yet). Climatic Change, 77(1), 103-120.
TL;DR… warnings about global, abstract risks like climate change fail to evoke meaningful action because threats distant in time or space are cognitively discounted and do not trigger immediate emotional responses. The paper elaborates how evolutionary pressures have shaped human attention and concern to be short-term and local, rendering evidence-based messaging about systemic, distant threats broadly ineffective.
Relevance to insight… call for policy frameworks that “hack” cognitive bias, rather than relying on rational appeal to future or distant risks.
Kahneman, D., & Tversky, A. (1979). Prospect theory: An analysis of decision under risk. Econometrica, 47(2), 263-291.
TL;DR… how people’s choices deviate systematically from rational expectations, especially under risk, due to cognitive biases like loss aversion, present bias, and affect heuristics. Emotional factors and immediacy powerfully distort the way evidence is processed and acted upon, especially compared to rational cost–benefit calculation.
Relevance to insight… underpins much of modern behavioural policy critique: campaigns that ignore these psychological realities invariably fail to produce the intended change.
van der Linden, S., Maibach, E., & Leiserowitz, A. (2015). Improving public engagement with climate change: Five “best practice” insights from psychological science. Perspectives on Psychological Science, 10(6), 758-763.
TL;DR… synthesise psychological research to recommend actionable strategies for creating climate policies that resonate emotionally, stress local impacts, and leverage social identity, rather than relying on abstract statistics. Their findings support the view that emotional salience and personal connection drive engagement, not presentation of data per se.
Relevance to insight… how reframing systemic risks as immediate and identity-relevant is more effective than the evidence paradigm alone.
Slovic, P., Finucane, M., Peters, E., & MacGregor, D. G. (2007). The affect heuristic. European Journal of Operational Research, 177(3), 1333-1352.
TL;DR… how affect (emotion) unconsciously steers judgment and risk assessment, showing how people’s perceptions of evidence are continually filtered through emotional and intuitive processes. When evidence lacks an emotional hook, it fails to motivate action, regardless of its objective merit.
Relevance to insight… evidence-based approaches are structurally limited without mechanisms for triggering emotion or a sense of immediate relevance.
Each paper demonstrates that human cognition is not calibrated for cool, rational absorption of evidence from policy presentations. Instead, our brains weigh threats by vividness, proximity, and tribal urgency. These works provide the empirical, theoretical, and practical grounding for rethinking how policy must hack innate tendencies to achieve real-world change.