Apple faces accusations of underreporting child sexual abuse material on its platforms

Apple is under scrutiny for allegedly failing to adequately flag and report instances of child sexual abuse material (CSAM) on its services, despite abandoning controversial plans to scan iCloud for such content last year.

According to reports, child safety experts and organisations, including the UK’s National Society for the Prevention of Cruelty to Children (NSPCC), have accused Apple of significantly undercounting the prevalence of CSAM exchanged and stored on iCloud, iMessage, and FaceTime.

According to a report by The Guardian, there are discrepancies between Apple’s reported global figures and the number of CSAM cases investigated in the UK alone. Between April 2022 and March 2023, UK authorities recorded 337 offenses involving child abuse images linked to Apple’s platforms. In contrast, Apple reported only 267 instances of CSAM globally to the National Center for Missing & Exploited Children (NCMEC) for the entire year of 2023.

This disparity has raised concerns, as Apple’s reported figures are notably lower than those of its tech industry peers such as Meta and Google, which report millions of CSAM cases annually to NCMEC. Experts and child safety advocates argue that Apple’s apparent underreporting of CSAM cases underscores the need for significant improvements in its child safety measures.

US-based technology companies are required to report all instances of child sexual abuse material (CSAM) they detect on their platforms to the National Center for Missing & Exploited Children (NCMEC). The organisation serves as a central hub for receiving reports of child abuse and distributing them to relevant law enforcement agencies worldwide. While iMessage’s end-to-end encryption prevents Apple from accessing the content of users’ messages, Meta’s WhatsApp uses similar encryption, yet WhatsApp reported approximately 1.4 million suspected CSAM incidents to NCMEC in 2023 alone.

This article was first published on July 23, 2024, at 7:09 pm.