A young person whose family thought they were playing video games was actually hoarding thousands of images of child exploitation, violent extremism, gore and animal cruelty.
The case was outlined in the Digital Violent Extremism Transparency report released on 26 March.
It showed Internal Affairs investigators were tackling a rising tide of violent content.
Investigators received 60 alerts about the young person. When they raided their home, it came as a “major shock” to their family, the report said.
“The family of the individual had no prior concerns… they thought the individual was solely online gaming, although their online time was unsupervised.”
The young person had quickly become engrossed by extremist content, reinforced by others online.
The case was exposed by the National Center for Missing and Exploited Children, which alerted Operation Flare, a DIA operation last year that investigated groups sharing objectionable content.
Referrals about content were up by a quarter, the report showed, to almost 900 in 2023.
About 40 percent of the content flagged was found to be objectionable.
“The most commonly reported ideology type was ‘Identity motivated’, specifically ‘white-identity’.”
The platforms X/Twitter, Telegram and TikTok attracted the most complaints – although TikTok removed over three-quarters of harmful content before the department had to formally request it.
Just over half the referred content related to the 15 March mosque attacks, including material on a Russia-owned platform, which the department succeeded in having removed in part by working with Europol.
Informal and formal takedown requests resulted in several hundred items of content being removed.