
Tech giants are failing to track reports of online child sexual abuse, despite figures suggesting more than 16 million photos and videos were found on their platforms.
An eSafety report has revealed that Apple, Google, Meta, Microsoft, Discord, WhatsApp, Snapchat and Skype aren't doing enough to crack down on online child sexual abuse, despite repeated calls for action.
It comes three years after the Australian watchdog found the platforms weren't proactively detecting stored abuse material or using measures to find live-streams of child harm.

"While there are a couple of bright spots, basically, most of the big ones are not lifting their game when it comes to the most heinous crimes against children," Commissioner Julie Inman Grant told ABC radio on Wednesday.
The latest report revealed Apple and Google's YouTube weren't tracking the number of user reports about child sexual abuse, nor could they say how long it took to respond to those reports.
The companies also didn't provide the number of trust and safety staff.
The US National Center for Missing and Exploited Children suggests there were tip-offs about more than 18 million unique images and eight million videos of online child sexual abuse in 2022.
"What worries me is when companies say, 'We can't tell you how many reports we've received' ... that's bollocks, they've got the technology," Ms Inman Grant said.
"What's happening is we're seeing a winding back of content moderation and trust and safety policies and an evisceration of trust and safety teams, so they're de-investing rather than re-upping."
It comes as YouTube has been arguing against being included in a social media ban for under-16s, on the basis that it is not a social media platform but is often used as an educational resource.
The watchdog commissioner had recommended YouTube be included based on research that showed children were exposed to harmful content on the platform more than on any other.

The report also found that none of the companies had deployed tools to detect live-streamed child sexual exploitation on their services, three years after the watchdog first raised the alarm.
Most of the companies were also not using hash matching, a technique that detects copies of previously identified sexual abuse material across platforms, nor were they deploying tools to detect grooming or sexual extortion.
There were some improvements, with Discord, Microsoft and WhatsApp generally expanding their use of hash-matching tools and the number of sources informing that technology.
"While we welcome these improvements, more can and should be done," Ms Inman Grant said.
The report stems from legislation passed last year that enforces periodic transparency notices, requiring tech companies to report to the watchdog every six months for two years on how they are tackling child sexual abuse material.
The second report will be available in early 2026.
Australian Associated Press