Why Are We Chasing Shadows Online Instead of Confronting the Monsters Who Make Them?

May 6, 2025

Every week, new funding is announced to tackle online harm. Millions of pounds, thousands of hours, entire departments built around keeping us safe online. We build filters, develop AI tools, draft child safety bills, and train teachers and parents. We warn, we report, we post, we flag. And still, the harm floods in.

I can’t help but wonder: why are we so consumed with controlling the fire, yet so rarely ask who lit the match?

There is a glaring silence at the heart of online safety efforts, a refusal to look directly at the individuals who create harmful content. Who are they? Why do they do it? And why are so few ever stopped?

In the real world, if someone distributed graphic abuse material of children, police would knock down doors. If someone followed children to a playground whispering threats, they would be arrested. If someone filmed women without consent, uploaded footage, encouraged suicide, or set up secret clubs that glorify violence, we would call them criminals. Possibly psychopaths. We would want names. We would ask what made them that way.

So why does the online world get treated differently?

We have spent years talking about how to protect children from seeing violent, pornographic, or disturbing content. Yet we do not ask who is making it. Who is uploading it at 2 in the morning? Who is obsessively editing, tagging, and distributing it? Who makes the accounts encouraging thirteen-year-olds to starve themselves or self-harm? These things do not appear by accident. They are created and then shared.

We are quick to ask how content slips through moderation filters. To me, it feels like we are far too slow to ask who created it in the first place.

If we treated real-world crime this way, we would be a laughing stock. Imagine discovering a town plagued by arson attacks, and the entire government response was to improve smoke alarms and teach children how to stop, drop, and roll. No one would dream of letting the arsonist go unhunted.

When serial killers like Dennis Rader, Ian Brady, or Ted Bundy were active, society did not just shield the public. We wanted to understand them. Profile them. Stop them. Even in the horror, there was analysis, attention, and eventually, justice. But in the online realm, the predators often remain shadows, hiding behind usernames and profile pictures, free to roam, adapt, evolve, and continue, even “inspiring” others.

The UK Online Safety Act aims to fine platforms that host illegal content. That is a start. But that alone is like fining a street for having a mugger on it.

We need to ask this government, and tech companies, far harder questions.

Why are we not identifying the people behind the most harmful material?
Why are there so few prosecutions for persistent online abuse?
Why do predators on forums, encrypted apps, and gaming servers face less accountability than burglars or street harassers?

The problem is not just the internet. It is the hands that weaponise it. It is the minds that enjoy causing harm, hidden behind usernames. And the systems that let them.

Let us not accept a world where we endlessly clean up the mess, but never catch the person lighting the match.

Let us stop chasing shadows and start asking: who is making the matches, and who is selling them?

Photo by Hayley Murray
