The office of West Virginia’s attorney general has filed a ground-breaking lawsuit against Apple Inc, alleging that the tech giant’s iCloud platform has played a significant role in the distribution of child sexual abuse material (CSAM).
The complaint, lodged on Thursday in Mason County Circuit Court, accused Apple of prioritising user privacy over child safety, Reuters reported.
‘Greatest platform for distributing child porn’
According to Attorney General JB McCuskey, a Republican, internal communications from Apple described iCloud as the “greatest platform for distributing child porn,” a statement that McCuskey’s office said underscored the severity of the issue.
“These images are a permanent record of a child’s trauma, and that child is re-victimised every time the material is shared or viewed,” McCuskey said.
“This conduct is despicable, and Apple’s inaction is inexcusable,” he added.
What has the lawsuit alleged?
The lawsuit alleged that Apple allowed iCloud to be used as a vehicle for storing and circulating CSAM because it did not implement proactive safeguards similar to those used by other major tech firms.
Companies such as Google and Microsoft routinely check uploaded images against databases of known CSAM identifiers provided by organisations like the National Center for Missing and Exploited Children (NCMEC).
According to West Virginia’s complaint, Apple’s reported measures were far less robust, with only 267 CSAM reports made in 2023, compared to millions by its peers.
The lawsuit seeks statutory and punitive damages and asks a judge to compel Apple to adopt “more effective measures” for detecting abusive content and to redesign its products to enhance user safety.
It also mirrored allegations in a proposed class action filed in late 2024 by individuals whose images appeared in CSAM found on iCloud, a case Apple has sought to dismiss under Section 230 of the Communications Decency Act, which protects tech platforms from liability for user-generated content.
We have incorporated child-safety features: Apple
Apple has defended its privacy-centric approach, saying it has incorporated child-safety features such as Communication Safety, which blurs nudity detected in messages and other shared content on children’s devices. The company previously developed an image-scanning tool called NeuralHash but abandoned it amid privacy and security concerns, including fears it could be misused for censorship or broader surveillance.
