We are told that social media exists to bring people together, to facilitate dialogue, to democratise the exchange of ideas. And because people like connection, the platforms promise it. They even sell it in their pitches to investors and include it in their terms of service.
The problem is that connection does not generate revenue. Engagement does, and the most reliable route to engagement is not curiosity or collaborative problem-solving but conflict.
Social media algorithms are designed to maximise time on platform. They learn what keeps eyeballs locked, and that, like it or not, tends to be outrage, division, and the emotional hit of seeing your tribe vindicated and the other side humiliated.
Every feed is curated for retention, not for truth, so it fuels tension, with every recommendation nudging you towards the extreme, the inflammatory, or the binary.
Platforms structurally privilege conflict over cooperation… by design.
A critical consequence of this retention requirement is that they divert collective energy from solution-building to division. The more time we spend algorithmically enraged, the less capacity we have for the slow, difficult work of solving anything.
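The retention logic described above can be caricatured in a toy sketch. Everything here is hypothetical: the feature names, the weights, and the scoring function are invented for illustration only, to show how optimising purely for predicted engagement mechanically pushes high-arousal content to the top of a feed.

```python
# Toy illustration with entirely hypothetical weights: a feed ranker
# that optimises only for predicted engagement. Because arousal is
# weighted far more heavily than nuance, inflammatory posts outrank
# cooperative ones by construction, not by accident.

from dataclasses import dataclass


@dataclass
class Post:
    text: str
    outrage: float  # predicted emotional arousal, 0..1 (hypothetical feature)
    nuance: float   # predicted deliberative quality, 0..1 (hypothetical feature)


def engagement_score(post: Post) -> float:
    """Stand-in for predicted time-on-platform: arousal dominates."""
    return 0.9 * post.outrage + 0.1 * post.nuance


feed = [
    Post("They are destroying everything you value!", outrage=0.95, nuance=0.05),
    Post("A careful proposal for shared water management.", outrage=0.05, nuance=0.95),
]

# Rank the feed by predicted engagement, highest first.
ranked = sorted(feed, key=engagement_score, reverse=True)
for post in ranked:
    print(f"{engagement_score(post):.2f}  {post.text}")
```

The point of the sketch is that no one needs to hand-pick divisive posts: a single objective function, trained on what holds attention, produces the ordering on its own.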
Counterpoint
The standard narrative, naturally, is that social media connects us and democratises voice. It allows marginalised perspectives to be heard. It enables movements, organises protest, and spreads information faster than any medium in history.
All of this is partially true. But it is not the whole truth.
What social media actually does, structurally, is sort us into silos and set those silos against each other. The algorithm does not reward nuance. It does not reward bridge-building. It rewards the most extreme version of your position because that is what generates clicks, shares, and the emotional arousal that keeps you scrolling.
The platforms are not neutral town squares. They are attention merchants. Their business model depends on keeping you engaged, and engagement correlates most reliably with anger, fear, and tribal identification.
Every design choice serves that end.
The endless scroll.
The like counter.
The algorithmic feed that hides the boring and surfaces the incendiary.
Efforts to foster cooperative dialogue are not just ignored. They are actively undermined because collaborative problem-solving is slow, unglamorous, and requires sustained attention.
Outrage is fast, visceral, and spreads like wildfire. The algorithm knows which one pays the bills.
The consequence is a structural lowering of society’s collective problem-solving capacity. Innovative solutions become less likely as attention becomes siloed and combative. We spend our cognitive resources on symbolic battles instead of material progress.
And we perform our politics for an audience rather than negotiate solutions with adversaries.
That last one is especially insidious.
Social media is not connecting us. It is succeeding at dividing us, because division is profitable.
Thought Challenge
Track your nudges… For one week, notice every time social media pulls your attention towards outrage rather than curiosity. When does the algorithm show you content designed to make you angry? How often does your feed present you with opportunities for collaboration versus conflict? Write it down.
Design the alternative… Imagine an online platform deliberately optimised for long-term collective solution-building. What would it look like? What metrics would it track? What behaviours would it reward? Now compare those incentives with the mechanics of existing social media. Ask yourself which system is more likely to survive market competition.
Both actions sharpen the sceptical instinct. Instead of being captured by algorithmic manipulation, you learn to see the structure beneath the content.
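One way to make the design exercise concrete is to score the same posts under two different objectives. Both metrics below are invented stand-ins, not real platform formulas; they exist only to show that what a platform chooses to measure determines what it surfaces.

```python
# Hypothetical comparison of two optimisation targets for the same feed.
# Neither metric is real; each is a stand-in for the design question in
# the exercise above. Inputs are (outrage, nuance) scores in 0..1.

def engagement_score(outrage: float, nuance: float) -> float:
    """Proxy for today's platforms: reward arousal above all."""
    return 0.9 * outrage + 0.1 * nuance


def cooperation_score(outrage: float, nuance: float) -> float:
    """Proxy for a solution-building platform: reward deliberation,
    and apply a small penalty to arousal-bait."""
    return 0.9 * nuance - 0.2 * outrage


posts = {
    "inflammatory take": (0.95, 0.05),
    "collaborative proposal": (0.05, 0.95),
}

# The same two posts swap places depending on the objective.
for name, (outrage, nuance) in posts.items():
    print(name,
          round(engagement_score(outrage, nuance), 3),
          round(cooperation_score(outrage, nuance), 3))
```

Under the first metric the inflammatory take wins; under the second the collaborative proposal does. The essay's closing question still stands: which objective survives market competition?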
Closing reflection
Being a mindful sceptic means recognising that the platforms are not neutral tools. There is a cavernous gap between what they promise and what they structurally deliver.
The algorithmic war on cooperation is real.
Evidence Support
Brady, W. J., Wills, J. A., Jost, J. T., Tucker, J. A., & Van Bavel, J. J. (2017). Emotion shapes the diffusion of moralized content in social networks. Proceedings of the National Academy of Sciences, 114(28), 7313–7318.
TL;DR… Moral-emotional language increases the spread of political messages on social media, especially when the emotion is outrage. Content that provokes anger and moral judgment is systematically amplified by network dynamics.
Relevance to insight… Outraged, emotionally charged messages spread more successfully than content that encourages collaboration or deliberation, pointing to an algorithmic preference for divisive material. This directly supports the insight that structural incentives on social media foster division over solution-building.
Rathje, S., Van Bavel, J. J., & van der Linden, S. (2021). Out-group animosity drives engagement on social media. Proceedings of the National Academy of Sciences, 118(26), e2024292118.
TL;DR… Out-group animosity drives higher engagement on social media, with algorithms tending to elevate content that pits groups against each other. Content promoting cooperation or bridge-building receives far less amplification.
Relevance to insight… Social media platforms actively reward antagonistic interaction and under-reward cooperative dialogue. The findings offer direct evidence for why social media is structurally hostile to collective problem-solving and collaboration.
Qiu, X., Oliveira, D. F. M., Shirazi, A. S., Flammini, A., & Menczer, F. (2017). Limited individual attention and online virality of low-quality information. Nature Human Behaviour, 1, 0132.
TL;DR… Information overload and finite attention make low-quality, emotionally charged content just as likely to go viral as high-quality, informative content. The paper quantifies how popularity on social networks is weakly correlated with quality under realistic attention constraints. (Note: this article was retracted in 2019 after the authors found an error in the analysis, so it should be cited with caution.)
Relevance to insight… The study explains why viral success is decoupled from substantive value and empathy-building, and instead rests on attention-grabbing conflict or sensationalism. It helps reveal how social media's algorithmic structures suppress genuine cooperation and elevate outrage.
Vosoughi, S., Roy, D., & Aral, S. (2018). The spread of true and false news online. Science, 359(6380), 1146–1151.
TL;DR… False news spreads significantly farther, faster, and more broadly than truthful news, especially stories that provoke surprise or disgust. Bots play a role, but human attention patterns drive the viral advantage of false, inflammatory content.
Relevance to insight… The study shows how platform incentives reward divisive, misleading stories and disadvantage the slower spread of cooperative or fact-based information. It further reinforces the insight that society's collective problem-solving capacity is diminished by the algorithmic misallocation of attention.
Nieborg, D. B. (2015). Crushing Candy: The Free-to-Play Game in Its Connective Commodity Form. Social Media + Society, 1(2), 1–10.
TL;DR… The commodification logic of platform-based games shows how social and playful interaction is refashioned into monetisable exchange, incentivising addictive design patterns that maximise engagement.
Relevance to insight… The article offers a case study in the political economy of platform design, mapping how connective features are leveraged for monetisation rather than authentic social benefit. It provides a translatable model for how similar dynamics operate on social media, undermining cooperation and rewarding division.