Apple is facing a lawsuit (PDF) filed Thursday by West Virginia Attorney General JB McCuskey over allegations that iCloud is being used to store and distribute child sexual abuse material online. McCuskey alleges that Apple knew about this "for years" and "chose to do nothing about it."
The lawsuit contains alleged iMessage screenshots between Apple executives Eric Friedman and Herve Sibert acknowledging the storage and distribution of CSAM on iCloud back in February 2020.
"In an iMessage conversation about whether Apple might be putting too much emphasis on privacy and not enough on trust and child safety, Friedman boasted that iCloud is 'the greatest platform for distributing child porn' and that Apple has 'chosen to not know in enough places where we really cannot say,'" the lawsuit alleges.
"In the same conversation," it continues, "Friedman referred to a New York Times article about CSAM detection and revealed that he suspects Apple is underreporting the size of the CSAM issue it has on its products."
The lawsuit points to the number of reports of detected CSAM made to the National Center for Missing and Exploited Children in 2023 by Apple (267), compared to Google (1.47 million) and Meta (30.6 million).
The lawsuit alleges Apple failed to implement CSAM detection tools, including a proprietary scanning tool it had been working on. In 2021, Apple kicked off an initiative to scan images stored on iCloud for CSAM, which it had abandoned by the following year.
The role of end-to-end encryption
It also points to Apple's security offering Advanced Data Protection, which became available in December 2022 on iCloud and allows end-to-end encryption of photos and videos on the cloud-storage platform. The lawsuit alleges that end-to-end encryption is "a barrier to law enforcement, including the identification and prosecution of CSAM offenders and abusers."
"Preserving the privacy of child predators is absolutely inexcusable," McCuskey said in a statement Thursday. "Since Apple has so far refused to police themselves and do the morally right thing, I am filing this lawsuit to demand Apple follow the law, report these images and stop re-victimizing children by allowing these images to be stored and shared."
Apple told CNET that "safety and privacy" are at the center of its decisions, especially for children.
"We are innovating every day to combat ever-evolving threats and maintain the safest, most trusted platform for kids," Apple said on Thursday. "All of our industry-leading parental controls and features, like Communication Safety — which automatically intervenes on kids' devices when nudity is detected in Messages, shared Photos, AirDrop and even live FaceTime calls — are designed with the safety, security and privacy of our users at their core."
The Communication Safety feature is on by default for users under 18. It attempts to protect children from CSAM content, but does not address adults who deal in CSAM distribution and storage.
The balance between privacy and security on one side, and law enforcement and cybercrime on the other, has been at the center of the debate over end-to-end encryption.
Privacy advocates like the Electronic Frontier Foundation applauded the introduction of end-to-end encryption to iCloud in 2022, noting that "constant scanning for child abuse images can lead to unwarranted investigations and false positives." It pointed to the protection the feature gives people's sensitive iCloud data, such as photos, against potential cloud data breaches and government demands.
"Blocking the use of end-to-end encryption would be counterproductive and antithetical to the security and privacy of everyone online," EFF Security and Privacy Activist Thorin Klosowski said in a statement. "Encryption is the best method we have to protect privacy online, which is especially important for young people."
Data breaches are on the rise, as are government and law enforcement requests for user data. Apple's transparency report shows how many government requests for user data the company receives, though its published figures appear to end at December 2024.
End-to-end encryption is also used by Google for its messaging services, as well as popular messaging apps like WhatsApp, Signal and Telegram.
The complaint was filed in the Circuit Court of Mason County, West Virginia, on Feb. 19.
It follows a class-action lawsuit filed at the end of 2024 in a Northern California district court by 2,680 plaintiffs, who allege that by abandoning its CSAM-scanning software, Apple knowingly allowed CSAM to be distributed and stored on iCloud. In August 2024, a similar lawsuit was filed on behalf of a 9-year-old sexual assault victim in North Carolina.

