TL;DR
The systems meant to protect us from crisis are the crisis. Institutions built for episodic shocks now face permanent disintegration but continue to behave as if the emergency will end. Our psychology, wired for snakes, not systems, filters out the slow and abstract. The result is a civilisation accelerating into collapse with its eyes technically open but cognitively blind. This isn’t a failure of knowledge. It’s a failure of calibration. To survive The Long Emergency, we don’t need better plans. We need different minds.
Picture humanity as passengers on a chartered bus, everyone having paid their fare and settled in for what the brochure promised would be a pleasant day at the beach. The driver follows his navigation app with unwavering confidence, maintaining a cheerful conversation with the passengers in the front seats. These are the corporate executives, policymakers, and academic leaders who've paid for premium seating. The driver pretends to be interested in the discussions of quarterly projections, upcoming conferences, and the promising new technologies that will make the beaches even more enjoyable next season.
Meanwhile, the bus hurtles toward a cliff that some passengers thought they saw through the side windows, though most are absorbed in their devices, planning their beach activities, or debating the merits of different sunscreen brands.
From nowhere, a young woman in colourful clothes and braided hair stands up in the middle of the bus and yells at everyone that there is a cliff, right there. She screams at the driver to swerve or slam the brakes, or we are doomed. She waves her arms at the front window, where, sure enough, the rolling fields of corn suddenly give way to a fast-approaching precipice.
However, the driver has programmed his route according to the most advanced GPS technology available. His institutional training instructs him to trust the system, maintain passenger confidence, and adhere to established protocols. Disturbed by the commotion behind them, one of the front-seat passengers glances up from her strategic planning documents, sees the cliff arriving at pace, then turns to the passengers behind to reassure everyone that the navigation technology has never failed them before.
This is humanity's predicament.
We are on the air-conditioned bus travelling fast along a road built for speed and efficiency with an excellent safety record, heading for the beach. And why not? Everyone needs a break, and it was a good choice for a quick getaway, a bargain we came across in a Facebook ad that received five-star reviews from everyone. The kids are super excited. Nobody has ever said anything about a cliff.
But there it is, close enough that even the latest air-brake technology would struggle to stop the bus before it careers off the edge. Swerving might be an option, but no coach driver has ever been trained to initiate a powerslide to rotate a bus, even if physics allowed it to be done. So, through no fault of your own, here you are on a vehicle full of 8 billion humans heading toward a cliff of resource constraints, environmental degradation and an out-of-date social contract.
This predicament, which most passengers still haven’t noticed because they are absorbed in cat videos and their favourite TikTok influencers, is not because we lack intelligence or good intentions. We don’t see the hazard because we are calibrated for an entirely different kind of journey. Our institutions evolved to navigate familiar routes to predictable destinations. And before we were aware of holidays, our psychology developed to handle the immediate, the visible, and the controllable. Neither the driver nor the passengers are equipped to process the cliff ahead, which represents not a temporary detour, but evidence that the very concept of "getting to the beach" may no longer be possible.
The navigation system isn't broken; it's simply guiding us toward a destination that no longer exists. The map may function, but the terrain has eroded beneath our wheels. Entropy has done its work, dissipating energy, degrading structure, and leaving behind only a simulation of coherence.
Here is the thing.
We are going to have to confront the unsettlingly plausible reality that the ideas and the energy physics that got us into this mess won't get us out. We face unprecedented global challenges while clinging to obsolete frameworks, comfortable myths, and intellectual taboos that prevent the clear thinking our survival demands.

Consider the scale of our delusion.
We worship "sustainability" while burning through planetary resources at exponential rates. We treat technological innovation as salvation, while our most basic problems of feeding everyone, maintaining soil fertility, and preventing ecosystem collapse are resisted or made worse by high-tech solutions. We refuse to discuss population growth despite adding 8,000 humans every hour to an already overstretched planet, and instead claim that we need more babies in the West. We focus conservation efforts on cute pandas while overlooking the soil microorganisms that sustain us.
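The hourly figure is easy to sanity-check. A minimal sketch, assuming a world population of roughly 8.1 billion and a net growth rate of about 0.9% per year (approximations from recent demographic estimates, not figures from this essay):

```python
# Rough sanity check on "8,000 humans added every hour".
# Assumed inputs (approximate, hypothetical for illustration):
population = 8.1e9          # current world population, people
annual_growth_rate = 0.009  # ~0.9% net growth per year

added_per_year = population * annual_growth_rate  # ~73 million people
added_per_hour = added_per_year / (365 * 24)      # spread over a year's hours

print(round(added_per_hour))  # on the order of 8,000 per hour
```

Even with conservative inputs, the net addition lands close to 8,000 people per hour, which is the point: the number is not rhetoric, it is arithmetic.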
This systematic self-deception isn't accidental. It soothes our anxious minds as it serves powerful interests. But it's also profoundly dangerous when we are accelerating toward a reality of energy and resource constraints, whether we acknowledge it or not.
It is past time we performed intellectual surgery on these sacred cows. Not for the dopamine hit of contrarianism, but because questioning fundamental assumptions has become a civilisational necessity. When mainstream environmental thinking promotes oxymorons like "sustainable development" and "green growth," someone needs to point out the mathematical impossibilities. When economic theory ignores thermodynamics and ecological limits, intellectual honesty demands we call it out for the fantasy that it is. The Second Law isn't a metaphor. Every transaction leaks energy, every efficiency gain has a cost, and the dream of perpetual growth drips slowly into heat loss.
My decades as an academic ecologist and environmental consultant taught me that the gap between public discourse and biophysical reality grows wider each year. I've watched intelligent professionals nod along with sustainability rhetoric while privately acknowledging its impossibility. I've seen conservation campaigns exploit emotional manipulation rather than address root causes. I've observed how expert communities develop elaborate ways to avoid mentioning inconvenient truths about overpopulation, degrowth, and ecological overshoot.
So I am going to borrow from James Howard Kunstler and call the human experiment of a bus full of distracted people heading for a cliff The Long Emergency: the precipice that everyone knows must be there, but whose closeness no one can bring themselves to acknowledge. In his 2005 book, The Long Emergency: Surviving the Converging Catastrophes of the Twenty-first Century, Kunstler argues that "peak oil", which he defines as the depletion of cheap, abundant petroleum, will fundamentally disrupt industrial society as alternative energy sources prove insufficient to replace it. It is what we see happening today. Kunstler contends that declining oil availability will converge with climate change, economic instability, and other global challenges to create sustained crises. As transportation costs skyrocket, communities will be forced to develop localised, self-sufficient systems, particularly for food production, while large cities may face severe disruption. It's a creeping emergency.
At its core, an emergency is defined by urgency, unpredictability, and the potential for significant harm if not addressed quickly. We use the term and its meaning in various contexts, including public health (e.g., a medical emergency), natural disasters (e.g., a flood or wildfire), and infrastructure failure (e.g., a power outage or structural collapse). Emergencies disrupt normal functioning and demand immediate decision-making, often with incomplete information and under conditions of stress. The primary aim in an emergency is containment, mitigation, or rapid resolution.
Emergencies differ from chronic crises or long-term risks in that they are framed as temporally bounded events. This justifies the use of emergency powers, rapid funding, or suspension of routine protocols by pretending they are temporary. Our current global predicament is an emergency in this sense. It is with us now; it is a massive threat to normal functioning, and it demands collective action, potentially draconian.
However, emergency also implies a return to a pre-crisis baseline once the emergency is resolved, reinforcing a "response and recovery" paradigm. This conventional understanding becomes problematic in the context of The Long Emergency, where threats unfold gradually, but with compounding effects, and defy short-term resolution. They blur the line between crisis and condition, demanding not just reaction but deep adaptation. In this sense, The Long Emergency is an oxymoron because it uses the language of immediate disruption to describe a structural transformation.
The bus taking folk to the beach for the day is an analogy for the inadequacy of our conventional emergency frameworks in grappling with persistent, non-linear collapse. Its sensors and the driver at the wheel don't see the cliff appearing on the highway. After all, it has never happened before.
This essay, an introduction to a series on Uncomfortable Intelligence, examines how The Long Emergency idea operates simultaneously across environmental, institutional, and psychological dimensions, creating a triple bind that explains why conventional responses consistently fail. While institutions remain trapped in crisis management paradigms designed for temporary disruptions, human psychology compounds these failures through cognitive biases that make us systematically underestimate slow-moving threats and overestimate our control over complex systems.
It’s why evidence-based policy recommendations are ignored, why rational arguments fail to motivate action, and why even well-designed institutions struggle to address polycrisis challenges. These things exceed human cognitive architecture. We simply don’t have the wiring for them.
The problem is that when multiple interconnected systems are simultaneously failing, we can't afford intellectual comfort food. We need frameworks that work in harmony with reality, rather than against it. We need the courage to examine whether our political systems can handle resource constraints, whether our economic models make thermodynamic sense, and whether our conservation priorities conserve anything meaningful. We have to find the wires and reroute them.
The sacred cows are blocking clear thinking and actively preventing the breakthroughs we desperately need. However, pointing this out is optimistic because real optimism requires an honest assessment of our predicament followed by an adaptive response. Intellectual contrarianism becomes a survival skill when conventional wisdom leads toward collapse.
The Long Emergency will be navigated successfully or not based largely on our collective ability to think clearly about what's happening. That requires intellectual courage to question everything, especially the ideas we most want to believe. Sometimes, tearing down sacred cows is the most constructive thing you can do.
So let’s begin with the current approach to crisis management and those passengers still scrolling their phones with this first premise…
Governmental, corporate, and academic institutions structure their responses around discrete crisis management when The Long Emergency represents a system transition, but this institutional blindness is compounded by human psychology that evolved to handle immediate, visible threats rather than abstract, slow-moving planetary boundaries.
Most institutions were built for shocks, not drifts. Their crisis protocols of budget buffers, emergency plans, task forces and the like work best when a flood hits, a market crashes, or a border ignites. These mechanisms presume recovery is possible because the system itself remains intact. This presumption made sense in an era defined by industrial expansion and governance by control. Threats were considered occasional, if unpredictable, outside anomalies, not signs of more profound instability. But today's disruptions no longer behave episodically. They accumulate and entangle. They resist resolution by systems still calibrated for bounce-back rather than breakdown.
The presumption of return to normalcy blinds institutions to the reality of The Long Emergency, which is not a transient event but a permanent state of overlapping, escalating disruptions. Climate change, for example, is a structural reordering of planetary and social systems because the climate is not going back to pre-1990 conditions no matter how hard we wish it. Yet institutions still behave as if sandbagging after a flood or issuing a one-off emissions report constitutes an adequate response. In this way, institutional logic itself becomes part of the problem.
Most environmental planning still hinges on a return to normal. But climate volatility, ecological decline, and resource erosion don’t unfold on policy timelines or within jurisdictional boundaries. Yet the institutional mindset remains fixated on containment and recovery. It is as if we are in a pause, not a transformation.
When intensive agriculture drains soil carbon and destroys microbial life, the aftermath isn’t a return to baseline. Degraded soils don’t self-repair unless you have hundreds of years to sit by and wait. What’s left in degraded soils is not absence readily replaced, but a new condition.
Most worrying is that this myopia is not just bureaucratic. It’s cognitive. Human perception evolved to prioritise near-term, emotionally vivid threats like snakes, fire, or betrayal. We had no idea about atmospheric carbon or aquifer depletion. Our decision-making machinery still runs on Stone Age firmware that is primed for immediacy, inattentive to abstraction. Systemic threats are diffuse, cumulative, and lack the sensory triggers that mobilise action. Even when people intellectually grasp slow crises, they rarely feel them. Cognitive shortcuts like normalcy bias and availability heuristics mute the signal. We see enough to worry, not enough to act.
I bet you recognise the feeling.

Research consistently shows that government, corporate, and academic institutions default to discrete crisis protocols rather than systemic planning. A McKinsey history of U.S. crisis management traces decades of post‑9/11 institutional reform focused on episodic shocks, not long‑term systemic transitions. A RAND review critiques how overlapping, fragmented emergency frameworks reduce effectiveness even for conventional crises, let alone paradigm‑level structural shifts. The COVID‑19 UK Inquiry highlighted planning systems that resembled “a bowl of spaghetti,” underscoring how institutional complexity hindered a coordinated systemic response.
Psychological scholarship reveals that a tendency to downplay risks and cling to the status quo, known as normalcy bias, operates even in disasters, and over 80% of people exhibit this bias when warnings emerge. McRaney’s normalcy bias framework emphasises how people linger in denial or deliberation instead of acting decisively. Commentary on climate inertia applies this bias to phenomena like gradual temperature rise and groundwater decline, showing why abstract systemic crises struggle to mobilise behavioural change. The available signal is intellectually grasped, but lacks the sensory or emotional trigger that evolved minds require to convert knowledge into institutional or personal action.
We face a layered blind spot of institutions unable to adapt beyond crisis-response logics, and individuals whose minds resist perceiving the true nature of the crisis. This dual failure explains why so many rational policy prescriptions and scientific warnings go unheeded. The messenger (institution) and the recipient (public or decision-maker) are working within frameworks mismatched to the scale and character of the threat.
The premise is intellectually robust and explains observed patterns better than alternatives. The mismatch between our evolutionary and institutional architecture and Long Emergency challenges prevents us from seeing, let alone acting on, what’s truly unfolding.
So let’s explore a little further with the following premise…
The Long Emergency triggers powerful psychological defence mechanisms of denial, displacement, and magical thinking that allow individuals and institutions to acknowledge environmental data intellectually while maintaining business-as-usual emotionally.
When individuals are confronted with overwhelming, complex, or existential threats, they often respond with defence. An easy one is denial, which can manifest in various forms, from outright rejection of evidence to subtler avoidance of emotionally distressing implications. If that fails, displacement will redirect anxiety into less threatening or more manageable issues, such as obsessing over individual recycling habits while ignoring broader structural drivers. Then there is magical thinking, which allows people to believe that technology, market forces, or political leaders will somehow solve the problem without requiring disruptive changes to their own lives or systems.
And if these don’t work, then blame the neighbour.
None of these defences is irrational in an evolutionary sense. They are helpful because they maintain psychological stability in the face of cognitive dissonance and existential dread. However, in the context of The Long Emergency, they work less well. The splitting between intellectual awareness and emotional engagement leads to what sociologist Kari Norgaard termed "socially organised denial". In her book, Living in Denial: Climate Change, Emotions, and Everyday Life (2011), she observed how people in a Norwegian community, despite intellectually acknowledging the reality and severity of climate change, simultaneously engaged in social practices and emotional strategies to avoid fully confronting or acting upon that knowledge. The result is a state where people acknowledge climate change or ecological collapse in the abstract, yet continue to act in ways that reinforce the status quo. This pattern is not limited to individuals. Institutions also exhibit defensive inertia, clinging to habitual practices and symbolic gestures even while officially recognising the scale of the crisis.
This systematic disconnect helps explain the chronic underperformance of environmental policy, corporate sustainability efforts, and public engagement campaigns. Even when evidence is accepted, the underlying emotional and cultural defences remain intact, leading to shallow or performative responses.
Understanding this psychological architecture is critical for designing interventions that do more than inform. They must also disarm the emotional mechanisms that enable inaction or continue to fall short, no matter how compelling the data or urgent the rhetoric.
This premise accurately describes one of the most robust findings in environmental psychology. There is a systematic disconnect between intellectual understanding and behavioural response, mediated by predictable psychological defence mechanisms that operate at both individual and institutional levels.
It brings on the following premise about why we are so keen on the short-term fix…
Institutional decision-makers remain trapped in linear problem-solving approaches because of structural incentives and because human brains are neurologically wired to seek controllable, short-term solutions rather than accept uncertainty and complexity.
Institutions, by design, favour predictability, accountability, and incremental progress. They need to be around for the long haul and provide stability that people crave. The anchoring traits designed into them are reinforced by performance metrics, electoral cycles, quarterly reporting, and risk-averse cultures, all of which reward short-term success and penalise ambiguity or failure. However, structural incentives alone do not fully explain the persistence of linear thinking in the face of increasingly complex, interdependent challenges.
Neuroscience and cognitive psychology research suggests that the human brain evolved to prioritise immediate, solvable problems with transparent causal chains. This tendency toward cognitive closure, pattern simplification, and certainty-seeking reflects a deep-seated need to reduce psychological discomfort, avoid decision paralysis, and maintain a coherent sense of agency in the face of threat.
When a boomslang (Dispholidus typus), a venomous tree snake, fell out of a Kigelia tree I was sitting under on the banks of the Chobe River in Botswana, I was grateful for the instinct to get out of its way so fast that I fell out of my chair. We all did; it was comical.
But we don’t live in a linear “avoid all snakes” cause-and-effect world. Humans reside in complex systems that are constantly in transition. Decarbonising economies, reconfiguring food systems, or adapting to ecological collapse are characterised by nonlinear feedbacks, time lags, emergent properties, and irreducible uncertainty. These dynamics are not only complex to model but also psychologically painful, as they violate our preference for causality, control, and closure. Consequently, decision-makers often default to reductive problem framings (e.g., "more innovation," "better regulation," "market incentives") that provide an illusion of control while bypassing the uncomfortable truth that many aspects of the polycrisis are unpredictable, uncontrollable, and without precedent.
The result is a kind of institutionalised cognitive bias. This is not simply a failure of leadership or will, but a neuropsychological mismatch between the challenges we face and the minds tasked with addressing them.
Even when individuals within institutions are aware of systemic risks, the collective logic of action pushes toward the familiar, the manageable, and the measurable. I have often advocated for this myself as a science advisor, when, more than once, I insisted that monitoring and evaluation of environmental actions is essential to understand what has happened. “Thank you, Dr Dangerfield, we’ll take that on notice”.
So here is the key insight. Overcoming institutional inertia will require a redesign of how they are governed and strategies that allow human cognition to better engage with complexity, ambiguity, and long-term horizons.
This is not easy.
Not least because other psychological barriers lead to the following premise…
The social psychology of institutional belonging creates additional barriers to adaptation, including threats to professional identity, organisational legitimacy, and group cohesion, making cognitive dissonance preferable to institutional transformation even when evidence overwhelmingly supports change.
Institutions are composed of people whose professional identities and sense of competence are tied to the stability and efficacy of the systems they serve. A paycheck can’t be the only reason for getting up at dawn to travel on crowded public transport or congested roads for hours to reach a city office. People feel a sense of purpose even in menial work. Any admission of institutional inadequacy is experienced as a personal threat.
When professionals confront evidence that their work, frameworks, or institutions are failing to meet the scale of systemic challenges such as climate change or ecological collapse, they are caught in a conflict between intellectual acknowledgement and emotional, social, or reputational risk. Imagine the emotional disconnect as you board a long-haul flight to another COP meeting, where scientists will present evidence that carbon emissions are still rising. Not even a Stoic can avoid that one.
I frequently observed this when advising government agencies. Being seen to be performing was almost as important as the performance itself, and significantly more important than the outcome, especially if it occurred in paddocks or nature reserves hundreds of kilometres away from the city offices. Sometimes, bureaucracies were purposely dispersed into rural areas, with staff spread far and wide, only to have to come together again for meetings, typically in the city, to regain that performance high.
Social psychology identifies this conflict between logic and emotion as a classic case of cognitive dissonance. Recognising a system is failing while also believing in its integrity or your role within it brings on mental discomfort caused by holding contradictory beliefs or values.
Group dynamics further amplify this reluctance to change. Within institutions, group cohesion and internal credibility often depend on shared beliefs about legitimacy, purpose, and progress. As a result, deviating from the dominant narrative, or even raising uncomfortable questions, can lead to marginalisation or reputational damage. Studies in organisational behaviour show that institutions frequently engage in defensive routines to avoid confronting foundational contradictions. This includes dismissing critical evidence, reframing systemic problems as isolated issues, or doubling down on legacy strategies despite declining effectiveness. In such environments, institutional loyalty and the preservation of internal harmony take precedence over adaptive learning.
Moreover, professional identity is often rooted in the mastery of specific frameworks, tools, or protocols. When systemic change necessitates a paradigm shift, it implicitly renders existing skill sets and assumptions obsolete. This can provoke identity threat among experts, managers, and administrators, who may unconsciously resist transformation not out of ill will, but because it destabilises the foundations of their credibility and purpose. This is an acute consequence of AI, especially the rise of agents and specialised LLMs that can repeat tasks faster, more efficiently and with far greater resolution than a human who is easily distracted by office gossip around the cooler. If you are on the lower rungs of the legal profession, you will need to climb the ladder fast. In this context, maintaining cognitive dissonance, where we know something is broken but act as if it isn't, is a psychologically safer path than acknowledging the need for radical institutional change.

The premise is so well-supported that we don’t even need to test it. Institutional actors are embedded in social, psychological, and professional contexts that make adaptation costly. These dynamics help explain why, even when faced with overwhelming evidence, many institutions choose symbolic gestures, incremental tweaks, or strategic ambiguity over the more profound transformations that The Long Emergency demands.
Rather than tackle the assumptions from first principles and plan for change, doubling down is preferable, which brings up the following premise…
This psychological-institutional feedback loop generates elaborate rationalisation systems (for example, sustainable development goals, net-zero pledges, or technological optimism) that serve the dual function of preserving organisational comfort while providing the illusion of adequate response to existential challenges.
The combination of human psychological defences and institutional incentives encourages justification and symbolic action that sociologists refer to as organised irresponsibility. The Sustainable Development Goals (SDGs), net-zero pledges, and techno-futurist narratives often serve this function. While not inherently flawed, these frameworks frequently become instruments of deferral and displacement. They enable organisations to recognise crisis conditions stemming from climate change, biodiversity loss, or social collapse without significantly altering their trajectory. In other words, they preserve institutional legitimacy and psychological coherence while postponing transformative action.
This phenomenon is evident in the widespread adoption of net-zero targets, which rely heavily on speculative technologies such as carbon capture or sequestration, many of which are not yet scalable or proven. Here, have an offset for those emissions. These pledges allow governments and corporations to appear proactive, but they often delay real emissions cuts and externalise responsibility to future administrations or technological breakthroughs.
Likewise, the sustainable development goals offer a broad moral consensus, and their intent is laudable, but their vagueness and non-binding nature make them easy to endorse without implementing meaningful structural change. The psychological comfort comes from framing the action as happening, the crisis as under control, and progress as measurable, while all the time the measurements are selectively defined or disconnected from planetary limits.
From a behavioural perspective, these rationalisation systems work because they satisfy both cognitive dissonance and institutional inertia. Individuals can retain a sense of ethical consistency ("my organisation is doing its part"), and institutions can protect their operational logic ("we’re aligned with global frameworks") without confronting the reality that meeting ecological thresholds may require profound structural change, not incremental reform.
As a result, symbolic action becomes a form of collective self-soothing, reinforcing the very dynamics that maintain business-as-usual while giving the impression of transformative effort.
So here is a challenge.
Go across to LinkedIn, the social platform where professionals hang out, and do a quick scroll. See how many of the posts, and especially the comments, help promote either cognitive dissonance or institutional inertia. The conversation will sound good, even fuel a little dopamine, but the chatter is a caricature of elaborate rationalisation systems.
This premise of rationalisation is accurate and well-grounded in critical environmental sociology, climate psychology, and organisational theory. These rationalisation systems exist because they serve both psychological and institutional needs. They function less as pathways to transformation and more as mechanisms of avoidance, cloaked in the language of progress. Understanding and naming this dynamic is essential if we are to achieve genuine systemic change.
And before we move on, let me name it more precisely and rub it in with the following premise…
Professional communities develop sophisticated forms of collective self-deception where technical competence in narrow domains coexists with inability to integrate information that threatens foundational assumptions about progress, control, and institutional adequacy.
Professional communities are built on shared norms and assumptions that define what counts as legitimate knowledge and practice within the field. You probably do not know about density-dependence, Lotka-Volterra equations, life tables or mark-release-recapture unless you are a population ecologist. And if you are a population ecologist, then you probably don’t know about a Giffen good, Pigouvian tax or deadweight loss that an economist bangs on about. These paradigms and their associated jargon create cognitive and institutional boundaries that foster technical depth but often at the expense of systemic integration. As philosopher of science Thomas Kuhn argued in The Structure of Scientific Revolutions, normal science operates within paradigms until accumulating anomalies force a paradigm shift. Until then, dissonant information tends to be ignored, reframed, or discredited.
In contemporary contexts such as climate science, economics, public health, and engineering, this dynamic manifests as epistemic silos. Professionals gain access to and then live in a bubble. So, economists model growth independently of ecological limits; engineers optimise infrastructure without accounting for long-term energy constraints; climate scientists produce risk models while deferring questions of political feasibility. These blind spots are not the result of incompetence or bad faith, but instead of institutionalised cognitive partitioning, where each profession stays within its zone of control. By staying in the silo, they don’t even know what is going on outside, and so must overlook the foundational assumptions of the system they are in.
The streaming drama Silo, based on Hugh Howey's Wool series, turns on control, information, and the human response to an oppressive, isolated environment. These are the central themes of a narrative in which all is well until it isn’t. The drama becomes a compelling exploration of how individuals and communities react when their entire reality is built upon a foundation of lies and control, and of the decisive psychological struggle to uncover and accept a potentially devastating truth. But institutionalised cognitive partitioning is enough to create emotional silos equivalent to the physical one in the sci-fi.
Essentially, we surround ourselves with a subtle form of collective self-deception. Technical performance ensures legitimacy, while individuals and their collective effort remain blind to the system-level implications of their work. Then another layer of deception is added, because most professional identities are built around narratives of progress, control, and institutional reliability. Entertaining the possibility that growth is ecologically constrained, that control is illusory in complex systems, or that institutions may be structurally incapable of addressing the polycrisis risks professional marginalisation.
For a time, I acted as a science advisor to a government agency in Australia responsible for supporting farmers who managed approximately 40 million hectares of agricultural holdings, over half the land area of the state. I could discuss soil, the importance of soil carbon, the practicalities of reducing fertiliser dependencies, the value of monitoring and evaluation, and even the benefits of predictive modelling with staff and senior managers. However, I would be given blank stares and then asked if 70,000 ha was an appropriate trigger value to review a regulation. Once, out of frustration, I presented the hockey-stick graph of human population growth for some shock and awe at a meeting of all the staff. Silence ensued, and my point about the importance of sustained food production was lost to the silo.

What we have created is a form of motivated ignorance, where inconvenient truths are not denied outright but are excluded from the domain of "relevant expertise." Even when such insights are acknowledged by the interdisciplinary panels or advisory reports that are occasionally commissioned, they are often compartmentalised and fail to reshape core professional practices. I have a small library of reviews and reports I prepared myself that gather digital dust.
The reality is that professional communities can and often do maintain high competence within specialised domains while preserving systemic blind spots that protect institutional legitimacy and personal identity. The result is a sophisticated form of collective self-deception that delays or dilutes the integration of knowledge necessary for responding to any emergency, let alone a long one.
And rather than propose a revolution or a tearing down of the silo so that occupants can emerge into a toxic atmosphere and certain death, the final premise of this introductory essay is as follows…
The triple bind of environmental pressure, institutional inertia, and cognitive limits demands design that works with human wiring. Expecting superhuman rationality from evolved primates is a failure by design.
Traditional policy approaches often assume that individuals and institutions will act rationally when presented with clear evidence and logical incentives. However, decades of research show that, as we have seen, bounded rationality, emotional reasoning, and various cognitive biases continually shape real-world decision-making. As Herbert Simon and later Daniel Kahneman have shown, humans are not perfectly rational agents but satisficers. We are cognitive organisms seeking good-enough outcomes under conditions of uncertainty and limited attention.
Policy tools have adapted. Nudge theory, modular governance, and default options work not because they enlighten, but because they reduce friction. Adaptive systems succeed when they demand less from cognition, not more.
But the triple bind has a fourth thread.
These same vulnerabilities—distraction, inertia, and bias—are fertile ground for manipulation. Power doesn’t need to censor when it can confuse. Strategic ambiguity, delayed reform, and techno-optimism keep extractive systems intact without appearing hostile. The more institutions stumble, the more plausible it becomes to blame the voter or celebrate the innovator.
Living with the triple bind of environmental pressure, institutional inertia, and psychological constraints is both conceptually sound and practically actionable. It offers a more honest foundation for designing adaptive, resilient systems based on what people and institutions are capable of under stress. Of course, this triple bind is also easily leveraged for advantage; it is highly desirable to those who gain from harmony in the silos and from their persistence.
There is substantial evidence that the very vulnerabilities created by environmental crisis, institutional inertia, and cognitive limitations are actively exploited by powerful actors to maintain control, delay transformation, and manipulate public perception. Adding a fourth bind—the opportunistic manipulation of these vulnerabilities by powerful actors—captures a critical, often under-acknowledged dynamic in the persistence of the Long Emergency.
When populations are overwhelmed, institutions are sluggish, and cognition is biased toward short-term comfort, conditions are ripe for misinformation, distraction, and manufactured consent. In this context, actors with concentrated political, financial, or media power can actively shape narratives, suppress dissent, and sustain extractive systems that benefit the few at the expense of the many.
This manipulation for power and profit is not always overt. It often appears as strategic ambiguity, policy delay, technological utopianism, or the framing of systemic issues as matters of personal responsibility. Fossil fuel companies funded denial and sold personal change. Leaders repackaged climate action as cultural threat. These tactics persist not because they’re brilliant, but because they exploit minds already exhausted and institutions already mistrusted.
It is not enough to design systems that are psychologically intuitive and institutionally flexible. They must also be resistant to capture by actors whose interests run counter to planetary and collective well-being. This means embedding transparency, accountability, and democratic participation into any adaptive architecture. It also means recognising that working with human limitations is also a defence against bad-faith manipulation.
Adding this fourth bind—manipulation—makes the model politically adult. If systems aren’t designed to resist capture, they become tools for those who benefit from delay. Psychological realism must be matched with political realism. Designing for human minds is not just ergonomic. It is defensive architecture against bad-faith actors.
Any serious approach to the Long Emergency must work with what we are, account for who has power, and expect resistance from those who profit from collapse.
No true mindful sceptic is naive, at least not on purpose. The world is a complex place where the biophysical realities imposed by 8 billion people and their livestock, pulsed by fossil energy, are changing everything. There was the pre-agriculture ecology of the planet, then the post-agriculture one that was different but held on to a few key attributes it had honed for billions of years, and now there is the post-industrial revolution ecology that is operating under a brand new set of conditions.
Just one statistic is enough to show how different. Here it is…
96% of the mammal biomass on Earth today is humans and their livestock.
Paleoecological reconstructions, fossil data, and population modelling suggest that around 15,000 years ago, before the Neolithic Revolution, human populations were small, mobile hunter-gatherer groups that likely numbered fewer than 10 million globally. In contrast, the Earth still supported abundant populations of large wild mammals across all continents (except Antarctica), including mammoths, mastodons, giant ground sloths, aurochs, and other now-extinct megafauna, alongside still-extant species such as deer, wolves, and elephants.
Researchers have suggested that wild mammals may have represented close to 100% of mammalian biomass in this period, with humans making up around 0.01% to 0.1%. The domestication of animals had not yet occurred, so livestock biomass was zero. In effect, the ratio has inverted over the Holocene: wild mammal biomass has shrunk by over 85–90%, while human and livestock biomass has exploded through exponential human population growth and the expansion of agriculture.
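The headline statistic is easy to sanity-check. A minimal sketch, assuming the approximate biomass estimates published by Bar-On, Phillips and Milo (2018), in gigatonnes of carbon; treat the figures as rounded illustrations rather than authoritative values:

```python
# Approximate mammal biomass, in gigatonnes of carbon (Gt C),
# after Bar-On, Phillips & Milo (2018). Rounded, illustrative values.
humans = 0.06      # Gt C
livestock = 0.10   # Gt C
wild = 0.007       # Gt C, wild land and marine mammals combined

total = humans + livestock + wild
share = (humans + livestock) / total

# The domesticated share of mammal biomass, formatted as a percentage.
print(f"Humans + livestock: {share:.0%} of mammal biomass")  # → 96%
```

On these figures the domesticated share comes out at roughly 96%, which is where the statistic above comes from.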
You cannot unsee this number.
And I am not presenting it because I am fond of the koala, or sad that, in 1988, I stood next to one of the last black rhinoceroses in Zimbabwe as it was being translocated to a game ranch in South Africa… for its protection. I am telling you because you don’t know it. And I want you to understand why you don't: the narratives you are told, and the institutions that peddle them, forget to mention the fact. They hold on to many a sacred cow. We could say that…
The very frameworks we use to understand and address challenges are themselves part of the problem.
Built on assumptions of linear causality, control, growth, and human exceptionalism, these frameworks often obscure more than they reveal. Rather than offering neutral tools for problem-solving, they shape what counts as a “problem,” who gets to define it, and which solutions are considered viable, frequently reinforcing the status quo they purport to critique.
Over time, these frameworks become self-reinforcing. People who spend time in institutions are often trained to prioritise risk management and efficiency over systemic transformation and ecological embeddedness. Academic disciplines segment knowledge into siloed domains, rendering cross-scale and cross-sectoral dynamics invisible. Economic models routinely externalise environmental degradation, treating planetary boundaries as abstract “externalities” rather than foundational constraints. Even the language of sustainability is often framed in terms of mitigation, adaptation, or resilience, which imply the preservation of existing structures, rather than questioning whether those structures are themselves the drivers of unsustainability.
This means that many of our most earnest responses to crisis remain trapped within the logic of the systems that created the crisis in the first place. For genuine transformation to occur, we must be willing to interrogate and revise the cognitive, institutional, and epistemological frameworks we take for granted.
We have to apply some sacred cow surgery.
So much of the conventional wisdom needs a question, some contrary thinking or even an excision.
"We just need better policies, more funding, and stronger leadership to address environmental challenges" is fundamentally incompatible with Long Emergency challenges. Reform efforts are psychological comfort food that distract from the need for entirely different approaches.
"Evidence-based policy recommendations will eventually prevail as the data becomes undeniable" is never going to get very far because human cognitive architecture makes us systematically incapable of responding rationally to slow-moving, complex threats. Even brilliant analysts are evolved primates with Stone Age brains.
"Experts understand these challenges and provide solutions; the problem is implementation" is a huge blind spot. Professional communities develop sophisticated self-deceptions, and expertise itself can become a barrier to seeing system-level inadequacy.
"We need to scale up our crisis response capabilities to handle bigger challenges" but the Long Emergency isn't a bigger version of previous crises. It's a fundamentally different phenomenon that makes traditional crisis management counterproductive.
"Innovation and technology will solve environmental challenges as they have solved previous problems" is perhaps the strongest psychological defence mechanism that allows institutions to avoid confronting the need for fundamental transformation.
"If people were just more aware/educated/motivated, we could solve these problems”, only this isn't about individual moral failure but about systematic misalignment between human psychology, institutional design, and environmental reality. And even if there were some personal responsibility to be regained, it is unlikely to operate at the scale of 8 billion psychologies.
So here is the thing.
We live in an age of unprecedented professional expertise, yet our most pressing challenges seem to worsen despite decades of sophisticated analysis, well-funded initiatives, and earnest reform efforts. Across domains from environmental sustainability to public health, from economic development to conservation biology, highly trained professionals armed with advanced degrees and impressive institutional affiliations continue to apply frameworks that consistently fail to deliver transformative results.
This isn't a failure of intelligence or commitment by genuinely capable people working within systems that reward expertise and evidence-based thinking. Yet something fundamental appears to be missing from our collective professional toolkit.
The problem may lie not in the quality of our analysis, but in the assumptions embedded within our analytical frameworks themselves. When sustainability professionals promote "green growth" while planetary boundaries collapse, when conservation biologists focus on charismatic megafauna whilst ecosystem functions deteriorate, when policy analysts recommend scaling up approaches that have already proven inadequate at smaller scales, we're witnessing the intellectual equivalent of performing ever more sophisticated surgery on a patient whose underlying physiology we've misunderstood.
We might wield tools of risk assessment, stakeholder engagement, evidence-based policy, technological innovation and the like as precisely calibrated instruments, but they're often applied to problems they weren't designed to solve, within systems they weren't meant to transform.
It is time for some uncomfortable intelligence.
It can be hard to get one's mind around why we are letting the world go to hell in a handbasket when we know the damage we're causing. But once one realises a few fundamental things about humans, it becomes clear (at least for now; I might have it all wrong and someone will put me right).
As you've implied, without perhaps saying it explicitly, humans are a species. Like all species, they evolved through mutations that were beneficial in the moment, not beneficial over the long term. This is how evolution works. Humans are no different from other species in that they maximise energy throughput. Humans are superb at accessing resources and channelling energy, which is why we've done so much damage. True, for hundreds of thousands, even millions, of years, we seemed to fit in as part of climax ecologies, but I think this was an illusion, because the innovations and discoveries that allowed us to succeed in the moment occurred very slowly. Humans had to make the discoveries first and then work to perfect them. And there were so few humans. Some of the innovations allowed humans to spread around the globe and wreak even more havoc in ecosystems where we didn't evolve. Humans can't stop being a species just because some of us realise the effects we're having.
Also, it's clear that there is no such thing as free will (for example, see Robert Sapolsky's book, Determined: A Life Without Free Will). There are only the neural nets that have evolved as part of us (the same for all species with a brain). Our decisions are the results of the firing of set sequences of neurons that we have no conscious control over. So the only way to get humans to act differently is to alter the way neurons connect and what prompts them to fire. Providing information to our brains can cause some alteration, but most of how a brain develops is already baked in: our genes, the environment and epigenetics, our upbringing, our culture. Then there is the weather, how badly we slept, the toxins we breathe in or consume, and so on. Discovering and communicating reality is just one part of what goes into our neural make-up. So it can't change quickly.
But even if our brains could be rewired in a way that gets us to act, what would we try to achieve? A sustainable way of life is not one that includes the use of non-renewable resources (since they are a one-off, limited resource). The only sustainable way of life I can think of is a hunter-gatherer existence using only simple wooden or woven tools and equipment. But that isn't the type of existence that 8.2 billion people can undertake on the planet as it stands. So what do we want people to do? This is something I'm wrestling with. It's not going to be possible to rewire the brains of enough people to willingly regress to a hunter-gatherer existence. Perhaps we just have to be happy with a return to some way of life that was much, much simpler, perhaps from a few centuries ago in western nations. But for all peoples of the world?
I just don't see a path forward but maybe with enough minds thinking about it, we can come up with just about acceptable ideas? One thing I know, though, is that we have to realise what is actually possible and sustainable, even if only for a few millennia.