

The hidden harm: How modern pornography is warping young minds and fuelling criminal pathways

Mick Moran of the Irish Internet Hotline reporting service tackles the difficult topic of the growing use of violent and abusive online pornography.

IN A RECENT statement, the Garda Commissioner, Drew Harris, raised a chilling concern: modern pornography is not just reshaping sexual expectations; it is actively pushing young men toward riskier, more violent behaviour in the bedroom. Work by the SERP institute in UCD shows that these concerns are valid and evidence-based.

Pornography may also play a more disturbing role in society, with growing evidence linking it to the possession and distribution of child sexual abuse material (CSAM). This issue also requires urgent attention and further research.

The accessibility of pornography today is unprecedented. With just a few clicks, anyone can access an endless stream of content, from the most anodyne to the most extreme. But this convenience comes with a dark side. A study published in Child Abuse & Neglect in 2024 found that many CSAM offenders began their journey not by seeking illegal material, but by consuming legal pornography that gradually escalated in extremity. The transition was often unintentional, driven by curiosity, emotional distress, or the lure of taboo content like “teen” or “incest”, along with the perception of anonymity and ease of access.

Descent into abusive content

This progression is not just anecdotal. Research from the University of Edinburgh revealed three distinct pathways to CSAM offending: (1) escalation from legal pornography, (2) online interactions that normalise illegal content, and (3) direct access without prior exposure.

The first pathway, escalation from legal content, is particularly troubling because it implicates mainstream platforms and algorithms. Offenders often reason along the lines of: “It’s just there, right? That must make it okay.” They often don’t think beyond that, until suddenly they’re criminals.

Modern pornography platforms are powered by recommendation engines designed to maximise engagement. These algorithms don’t just reflect user preferences; they shape them. As users consume more content, the system nudges them toward increasingly extreme material. This phenomenon, known as “algorithmic escalation”, has been documented across many digital platforms, including commercial pornography sites.

In the context of pornography, it can mean a descent from consensual adult content into violent, degrading or pseudo-illegal themes. It can often lead, via darknets and other libertarian platforms, down a dark rabbit hole to illegal abuse material and a date with the criminal justice system.

This leads to catastrophic results, not only for the victim who is abused to produce this material but also for the offender and their family. The families of CSAM offenders are hidden victims. We all know that children are sexually abused to produce CSAM, and we can imagine that offenders’ lives are often changed radically by the loss of job and reputation on arrest. What people don’t see is the toll on the families of offenders.

They are the ones who must face their neighbours, friends and people down at the GAA club after one of their own has been on the front page of the newspapers, jailed for possession of CSAM.

Historically, individuals found in possession of CSAM were typically assumed to have a sexual interest in children. However, in recent times, more difficult questions must be posed. Is it possible that the sexual interest extends only to the act of masturbation, the dopamine hit, and not into the real world? Could commercial pornography sites be creating a pipeline to CSAM offending? A growing body of research indicates that some viewers of CSAM arrived at this point without ever searching for abuse images in the first place.

Change is possible

The good news is that many offenders are receptive to change. Situational crime prevention, such as warning messages, blocked sites or social support, works and can often disrupt offending patterns. This suggests that prevention is possible, but it requires a multipronged approach, including the assistance of the Very Large Online Platforms and Search Engines.

First, though, we need to regulate algorithms that promote extreme content. Platforms must be held accountable for the material they recommend and the pathways they create. Second, we must invest in education that helps young people critically engage with sexual media. Adolescents are not passive consumers; they can be taught to recognise and resist harmful content. They can be taught to be pornography-critical but sex-positive.

Finally, we must expand access to support services for those struggling with pornography addiction, especially when it borders on illegality. Organisations like One in Four in Ireland and Stop It Now! in the UK offer confidential help to individuals who fear they may offend. These services are not just compassionate; they’re essential to public safety.

As CEO of the Irish Internet Hotline, I see firsthand the devastating impact of illegal online content, especially CSAM. But I also see hope. Every day, people take the time to report what they’ve seen, and in doing so they help the excellent team we have here to protect children and disrupt criminal networks.

If you come across illegal content online, including CSAM, report it immediately at www.hotline.ie. Reports can be made anonymously and can lead to the identification of the child featured in the CSAM, which may be their only chance of being removed from the harmful environment and the sexual abuse.

If you’re struggling with your own online pornography or CSAM use, please know that help is also available. Seeking support is not a sign of weakness — it’s a courageous step toward accountability and healing. No one is responsible for what turns them on, but everyone is responsible for their own behaviour. 

We all have a role to play in making the internet safer. Let’s start by acknowledging the risks, confronting the realities, and supporting those who choose to do the right thing.

Mick Moran is the CEO of the Irish Internet Hotline, Ireland’s national reporting service for illegal content online. With decades of experience in international law enforcement and child protection, he advocates for responsible internet use, ethical tech design, and compassionate intervention for those at risk of offending.
