The Radicalisation Pipeline No One Wants to Talk About

By Farah Benis

Over one million VAWG-related crimes were recorded in the UK during 2022/23, accounting for 20% of all police-recorded crime. At least 1 in every 12 women in the UK will be a victim of violence this year alone - and the true figure is likely far higher due to underreporting.*

This is not a fringe issue. This is systemic violence, and it is being actively fueled by a growing online ecosystem that radicalises men against women. The internet was meant to be a space for connection, learning, and progress. Instead, it has become a recruitment ground for extremism, radicalisation, and targeted hatred - and women are paying the price.

What was once dismissed as “just online” behaviour has escalated into something far more dangerous. Misogyny isn’t just present in extremist movements - it is a gateway into them. The radicalisation pipeline that moves men from casual sexism to fully fledged violent extremism is well documented, yet rarely treated with the same urgency as other forms of radicalisation.

For years, we have tracked and countered terrorist recruitment, ideological extremism, and disinformation campaigns. But one of the fastest-growing radicalisation pipelines, the industrialisation of misogyny, is still largely ignored in security conversations.

Online, a structured and deliberate process is radicalising men into hostility, dehumanisation, and, in some cases, direct violence against women. What starts as casual sexism - misogynistic memes, social media influencers promoting “alpha male” culture - escalates into ideological extremism, where men are taught that women’s rights are an existential threat to them.

The final stage? Some take action, moving from online rhetoric to real-world violence, inspired by the communities and figures that groomed them.

This is not a hypothetical risk. We’ve already seen attacks - from incel mass murders to targeted violence - perpetrated by men radicalised in online misogynist spaces. Yet we do not treat this with the same urgency as other extremist movements.

From Jokes to Justifications - The Digital Pipeline of Misogyny

Radicalisation is rarely instant - it’s a process, one that leverages psychological, social, and algorithmic factors to push individuals toward extremism. The digital pipeline that leads young men into misogynistic radicalisation follows a clear and measurable pattern:

1. Normalisation of Misogyny (Entry point - the hook)

  • Casual sexism is disguised as humour or self-improvement.

  • Young men encounter memes, TikTok clips, and YouTube videos mocking feminism, portraying women as manipulative or untrustworthy.

  • They engage with influencers who package misogyny as part of a “self-improvement” journey - teaching men to be “dominant” and emotionally detached.

  • The language shifts subtly: from “men and women are different” → “women are the problem.”

At this stage, misogyny is framed as normal, entertaining, and even beneficial. This makes it easy for young men to engage without realising they’re entering a radicalisation pipeline.

2. Algorithmic Reinforcement & Echo Chambers (The Push)

  • Social media platforms and video algorithms push extreme content.

    • A man who watches a fitness or dating advice video is fed progressively more radical content.

  • Engagement thrives on controversy and outrage - platforms reward extreme takes, amplifying misogynistic content.

  • Young men are introduced to Red Pill, Men’s Rights Activists (MRA), and anti-feminist influencers.

  • Online communities become echo chambers, reinforcing the belief that women are enemies, and that men must “fight back” against feminism.

At this stage, many are fully absorbed into these spaces, consuming content daily and engaging in hostile discourse. The “us vs. them” mentality begins to take hold.

3. The Mainstreaming of Extremist Ideologies (The Justification)

  • What used to be fringe beliefs become mainstream through repetition and community reinforcement.

  • Influencers like Andrew Tate, Jordan Peterson, Joe Rogan, Stefan Molyneux, Elliot Hulse, and Ben Shapiro provide ideological justification for misogyny.

    • Some are blatant in their message: women should be controlled, dominated, and stripped of rights.

    • Others disguise their views in intellectual or self-help language, positioning themselves as “rational” while reinforcing the idea that feminism has “gone too far.”

  • Mainstream platforms invite these figures on, legitimising their views under the guise of ‘debate’ or ‘free speech.’

  • The idea that women’s rights come at the expense of men is solidified.

By this stage, misogyny is no longer just an opinion - it’s an identity. Many who reach this point defend these beliefs aggressively and begin to actively spread them.

4. Radicalisation & Real-World Violence (The Action)

Some men transition from talk to action - inspired by figures and online groups that reinforce their rage.

  • Attacks inspired by misogynistic extremism have already occurred:

    • Elliot Rodger (Isla Vista shooting, 2014) - idolised by incel communities.

    • Alek Minassian (Toronto van attack, 2018) - explicitly cited incel ideology.

    • Jake Davison (Plymouth shooting, 2021) - active in misogynistic online communities.

  • Others participate in stalking, harassment, doxxing, and coordinated online abuse of women.

This is where the security industry must step in. This isn’t just “internet culture” - it’s an extremist pipeline that produces real-world threats.

The Men Driving This Pipeline

Online radicalisation doesn’t happen in a vacuum. It thrives because high-profile figures are given free rein to spread their views to millions with little to no accountability.

Figures like Andrew Tate, Jordan Peterson, Joe Rogan, Ben Shapiro, Stefan Molyneux, and Elliot Hulse have played a major role in mainstreaming misogynistic rhetoric. Some, like Tate, are blatant in their advocacy for male dominance and the subjugation of women, openly glorifying control and coercion. Others, like Peterson and Shapiro, cloak their misogyny in intellectualism, positioning themselves as rational critics while dismissing feminism, gender equality, and systemic misogyny - giving their followers plausible deniability when defending their extremist views.

Joe Rogan, with his massive platform and reach, consistently amplifies voices that push dangerous narratives about women, all while positioning himself as a neutral observer who is “just asking questions.” This has made him one of the most effective gateways for introducing misogynistic and reactionary ideologies to mainstream audiences.

Meanwhile, figures like Stefan Molyneux and Elliot Hulse portray gender equality as an attack on men, feeding resentment and reinforcing the dangerous belief that women’s rights come at the expense of male power. These narratives aren’t just rhetoric - they fuel the radicalisation pipeline that leads to real-world violence.

Free Speech vs. The Freedom to Incite Harm

Let’s be clear: free speech does not mean freedom to incite violence.

And yet, every time this conversation comes up, the same tired arguments resurface:

  • “It’s just words.” No, it’s an ecosystem that grooms young men into believing that women are subhuman.

  • “You can just ignore it.” Women shouldn’t have to endure relentless harassment just to exist online.

  • “Censorship is dangerous.” So is allowing radical misogyny to flourish unchecked.

We’ve set the bar so low for what is considered acceptable online behaviour that some of the most extreme and dangerous ideas now operate in plain sight.

What used to be fringe ideology has been normalised through repetition and reach. And we don’t need to look far to see the consequences.

Addressing This Threat

Social media companies have allowed extremist views to spread unchecked. Governments have failed to take meaningful action. And the security industry has, for the most part, ignored this pipeline entirely.

That needs to change.

This is not just an online culture war - this is a security issue.

Just as we track and counter other forms of extremism, we need to start treating the online hate that fuels violence against women as a real and urgent security threat.

The tools exist to fix this problem. What’s missing is the willingness.

What Should Be Done - and By Whom?

1. Government & Security Services (Home Office, Police, Intelligence Agencies, Counterterrorism Units)

These are the entities responsible for preventing radicalisation, tracking extremist threats, and intervening before violence occurs. They should:

  • Recognise misogynistic extremism as a genuine radicalisation pipeline - just like Islamist or far-right extremism.

  • Track individuals moving through this pipeline - monitor known incel and Red Pill forums in the same way white supremacist and jihadist spaces are monitored.

  • Strengthen legal frameworks to prosecute those who incite violence against women online.

  • Develop deradicalisation programs for young men who are vulnerable to online misogynistic ideologies.

  • Train officers and intelligence analysts to identify and assess misogynistic extremism as a security threat.

2. Corporate Security & Threat Intelligence (Private Sector Security, Risk Management Firms, Online Safety Experts)

Many businesses, institutions, and online platforms play a role in countering online radicalisation and protecting employees or customers from its effects. They should:

  • Improve online threat monitoring. Companies already track threats from terrorists and activists; misogynistic extremism should be included.

  • Implement workplace threat assessments. HR and security teams should be trained to recognise warning signs of radicalisation in employees.

  • Support online safety initiatives. Tech firms should work with threat intelligence teams to prevent misogynistic radicalisation from spreading on their platforms.

  • Secure events and public spaces. Mass shooters and violent misogynists often target specific locations (e.g., universities, public gatherings, female-dominated workplaces). Security planning must account for this risk.

3. Universities & Educational Institutions

Since many men are radicalised into misogynistic extremism while in their late teens/early 20s, schools and universities play a critical role in early intervention. They should:

  • Train staff to identify signs of radicalisation - professors, campus security, and student services should be briefed on how online misogyny escalates into extremism.

  • Run digital literacy and critical thinking programs to counteract online disinformation and extremist recruitment tactics.

  • Partner with online safety and security experts to develop intervention strategies for students engaging with extremist misogynistic content.

4. Social Media Platforms & Tech Companies

Since algorithms drive radicalisation, social media giants must be held accountable for what their platforms promote. They should:

  • Make misogynistic extremism a bannable offence, just as they do for other forms of hate speech and radicalisation.

  • Increase algorithm transparency, so we can see how extreme misogynistic content is pushed to users.

  • Partner with counter-extremism researchers to remove recruitment pipelines from their platforms.

What About Security Professionals in the Private Sector?

Security consultants, risk managers, and corporate security teams shouldn’t dismiss this as just “an online issue.” Misogynistic extremism has already led to real-world attacks, so the industry must:

  • Incorporate misogynistic extremism into risk and threat assessments - not just in terrorism risk profiles but in workplace violence prevention.

  • Educate corporate security teams on emerging threats - women in high-profile positions are being actively targeted by radicalised men.

  • Advocate for legislative change - security organisations should be pushing for stronger legal tools to tackle online extremism and harassment.

Final Thought

This isn’t just about stopping online hate - it’s about preventing radicalisation before it escalates into real-world violence. We need a multi-layered approach, with:

  • Government agencies recognising misogynistic radicalisation as extremism

  • Security services (law enforcement, intelligence) actively tracking these groups

  • Universities and schools preventing early-stage recruitment

  • Tech companies stopping their platforms from being used as recruitment pipelines

  • Corporate security teams integrating this into threat assessments

This is not just a feminist issue - it’s a security and counter-extremism priority. The more we ignore it, the more dangerous it becomes.

The internet didn’t create misogyny. But it has amplified, industrialised, and profited from it.

What we are seeing now is not just casual sexism - it’s a well-organised radicalisation system that is pushing more and more men toward hateful, dehumanising, and violent beliefs.

This cannot be dismissed as “free speech” any longer.

If a new extremist ideology was leading to mass shootings and terror attacks, we would see global action. But because the victims are women, society has been content to dismiss it as “online discourse.”

Enough.

We cannot afford to keep moving the goalposts, making excuses for why this isn’t a crisis. Because it is.

*National Police Chiefs’ Council and College of Policing Report
