What was once a universal tool for expression has quietly become a system built for manipulation. Social scientists are noticing a disturbing trend: Platforms that once prioritized connection and communication are increasingly taking on the structure of video games.
Most of us have experienced “gamification” in everyday life, whether we’re “playing” Hinge or climbing leaderboards on fitness apps. As a result, many sociocultural values are being reshaped, or even sacrificed, by this virtual architecture.
Philosopher C. Thi Nguyen offers valuable insight into this fast-spreading trend in his 2021 article, “How Twitter Gamifies Communication.” Nguyen argues that Twitter warped discourse by incentivizing its users with gamified scoring systems: likes, retweets and follower counts.

Two dynamics emerged from this gamification: echo chambers, which reward conformity and punish dissent, and “moral outrage porn,” the hedonistic pleasure of moralizing for likes.
Gamification extends far beyond Twitter. Many platforms have flattened users’ original goals into quantifiable points. LinkedIn turned professional growth into badges and points, and Fitbit transformed health into a contest of streaks and leaderboards.
In both cases, the central purpose of the application — networking and wellbeing — is distorted into an addictive performance metric.
Extremist networks are no strangers to this distortion. Over the past few decades, extremist content has proliferated online, latching onto internet subcultures including health, fitness, productivity and philosophy with little pushback.
James Hawdon, director of Virginia Tech’s Center for Peace Studies and Violence Prevention, finds that among 15- to 21-year-olds in the United States, reported exposure to online extremist messaging rose from 58.3% in 2013 to 70.2% in 2016.
Why is this the case? A gamified internet presents itself as the perfect host for parasitic thinking. Oversimplified content that fosters moral outrage consistently drives engagement, and extremists exploit this dynamic to spread ideologies disguised as normalcy.
YouTube is the perfect example. Its recommendation algorithm has been shown to push viewers toward increasingly extreme content. A 2022 Brookings study analyzing thousands of video recommendations found that a small minority of users ended up in ideologically skewed “rabbit holes,” aided by YouTube’s prioritization of emotionally intense, polarizing videos.
It doesn’t help that Neal Mohan, then YouTube’s chief product officer, said 70% of what people watch on YouTube is driven by recommendations.
Algorithms aren’t inherently malicious — they merely optimize for user engagement. Of course I’d rather watch “6 Vegans vs 1 Secret Meat Eater” than a BBC video on vegan diets.
However, this system doesn’t translate well to the world of extremism. When the most engaging content doubles as the most radical, radicalization stops being an outlier and becomes a predictable outcome of the platform’s design.
A 2025 survey by the Safeguarding Network found that nearly 60% of 5,800 teachers in the United Kingdom believed social media had negatively affected student behavior, citing inflammatory influencers like Andrew Tate as a key example.
Educators reported boys as young as 10 adopting manosphere rhetoric and behavior: refusing to listen to female teachers, idolizing manosphere figures and using insults like “simp” and “cuck.”
Students also recognize alt-right symbols: “red-pilling,” a reference to “The Matrix” in which a person “wakes up,” recast as waking up from progressive culture; “Pepe the Frog,” a meme co-opted by white supremacists; manosphere slang such as Chad, Stacy and Becky; and the “80-20 rule,” the false belief that 80% of women are attracted to 20% of men. That familiarity shows just how deeply these polarizing ideas pervade mainstream culture.
Extremist celebrities and platforms also make deliberate use of engagement-optimization techniques. An article by researcher Linda Schlegel contends that these subcultures motivate users through competition, achievement and socialization, dynamics that have helped this content gain an ever-larger platform today.
An article by the Institute for Strategic Dialogue outlines how figureheads mix “red pill” leveling with “hustle culture,” filtered through and reinforced by platform dynamics. Within this warped social economy, moral transgression is both rewarded and enforced.
A 2022 study published in Frontiers in Psychology found that exposure to hate content can reduce resistance to radical ideas and normalize violence. Likewise, a 2024 study indexed on PubMed reported that both incidental algorithmic exposure and active selection of radicalizing content correlate with stronger violent-extremist attitudes.
In this sense, extremism isn’t just thriving in a short-lived digital era riddled with gamification; it’s a structural inevitability within our digital ecosystems. Platforms that champion outrage, quantify identity and prioritize engagement over truth will always foster the most provocative, vehement voices.
The contradiction is that these systems feel participatory: every click, post and like feels like agency. In reality, it’s the platforms’ underlying design and algorithms that are truly at the helm.
And as long as platforms retain their gamified structures built on status, points and inflamed content, we will all fall victim to this system at least once – whether we came to play or not.