
West Virginia Files Lawsuit Against Apple Over CSAM on iCloud

West Virginia Attorney General John “JB” McCuskey has filed a significant lawsuit against Apple Inc., accusing the tech giant of knowingly allowing its iCloud platform to be used for the distribution and storage of child sexual abuse material (CSAM). The lawsuit, filed Thursday in Mason County Circuit Court, claims Apple failed to take meaningful action to prevent CSAM from circulating on its devices, instead prioritizing user privacy over the safety of children.

In the complaint, McCuskey argues that Apple violated state law by failing to adequately police its platform and by continuing to offer a service that facilitates the sharing and storage of harmful material. “Preserving the privacy of child predators is absolutely inexcusable. And more importantly, it violates West Virginia law,” McCuskey said. He emphasized that Apple’s inaction has re-victimized children by enabling the sharing of CSAM and by failing to report such images. The lawsuit calls on Apple to comply with the law, report discovered CSAM, and take action to protect vulnerable individuals from further harm.

The allegations suggest that Apple has not fully reckoned with the gravity of the issue, even as other tech giants have adopted more effective measures against CSAM. The case marks a pivotal moment in the growing scrutiny of how digital platforms handle child safety, signaling potentially significant legal and operational consequences for Apple if the company is found to have neglected its responsibilities.

Allegations Against Apple

The legal action against Apple comes after McCuskey’s office uncovered troubling internal communications from the tech company. According to the lawsuit, Apple’s own employees internally described its platform as “the greatest platform for distributing child porn,” yet the company took no substantial steps to address the problem. Apple allegedly failed to implement basic industry-standard CSAM detection tools, such as PhotoDNA, that other tech companies routinely use to combat child exploitation.

Instead, Apple reportedly cited user privacy concerns as its reason for not employing such tools, prioritizing its public stance on privacy over the protection of children. PhotoDNA, for example, is widely used by companies like Google and Meta, which have adopted proactive measures to identify and prevent the distribution of CSAM. Apple’s reluctance to integrate similar systems has raised questions about its commitment to the safety of its most vulnerable users: children.

Federal law requires technology companies operating in the U.S. to report any detected CSAM to the National Center for Missing and Exploited Children (NCMEC). As detailed below, Apple’s reporting volume in 2023 was a tiny fraction of that of its peers, a disparity the lawsuit cites as further evidence that Apple has failed to meet industry standards for handling such sensitive content.

Internal Communications Raise Alarms

Apple’s internal documents reportedly show that the company was well aware of the scale of CSAM on its platform but failed to act decisively. Despite that internal acknowledgment, Apple is accused of taking little meaningful action, even as CSAM continued to be distributed and stored through its services.

Unlike competitors such as Google, Microsoft, and Dropbox, which have proactively used technologies like PhotoDNA to detect and block CSAM, Apple allegedly did not prioritize or implement such safeguards. According to the lawsuit, the company’s reliance on its privacy branding outweighed the urgency of child safety, leaving vulnerable children exposed to further harm.

McCuskey and other advocates for stronger child safety protections argue that Apple’s refusal to act on these internal alarms has had real-world consequences. The lawsuit contends that by not implementing detection tools or taking sufficient action, Apple has contributed to the perpetuation of CSAM distribution on its devices and cloud infrastructure. This failure not only violates the legal obligation to report such content but also undermines the broader effort to combat online child exploitation.

Failure to Report CSAM

A key component of the lawsuit is Apple’s alleged failure to report discovered CSAM to NCMEC, as required under federal law. In 2023, Apple submitted just 267 CSAM reports, a stark contrast to the millions filed by other tech companies: Google submitted more than 1.47 million reports and Meta more than 30.6 million, according to the lawsuit. This discrepancy raises serious concerns about Apple’s commitment to child safety and its compliance with legal requirements.

The lawsuit argues that because Apple controls its entire ecosystem—including hardware, software, and cloud infrastructure—it cannot claim ignorance of its role in facilitating the distribution and storage of CSAM. By not reporting these materials, Apple is accused of violating both federal law and its moral obligation to protect children from exploitation. The lawsuit asserts that Apple’s decision to minimize its reporting of CSAM cases undermines the effectiveness of national efforts to combat child sexual abuse online.

McCuskey’s office contends that Apple has put its privacy policies above the need for child protection, shielding user privacy at the expense of identifying and stopping the spread of harmful content. The lack of proactive measures to detect CSAM, combined with the company’s underreporting, paints a troubling picture of Apple’s response, prompting the West Virginia Attorney General to take legal action to hold the company accountable.

A Landmark Lawsuit

The lawsuit filed by the West Virginia Attorney General marks a historic moment in the ongoing battle against the online distribution of CSAM. It is the first legal action of its kind brought by a government agency against Apple specifically over the company’s handling of child sexual abuse material. McCuskey’s office claims that Apple’s failure to take adequate action has allowed CSAM to spread unchecked, leaving children vulnerable to further harm.

This case highlights a growing tension between user privacy and the need to protect children from online exploitation. While Apple has long been a staunch advocate for user privacy, the lawsuit contends that this priority has been taken too far, with disastrous consequences for child safety. The West Virginia lawsuit also draws attention to the broader issue of tech companies’ responsibilities to ensure the safety of their platforms and the children who use them.

The case is likely to have far-reaching implications, not only for Apple but also for the broader tech industry. If successful, it could force Apple and other companies to reconsider their approach to privacy and child safety, potentially setting new legal precedents for how tech companies handle CSAM. Advocates for stronger protections hope that the lawsuit will prompt tech companies to adopt more comprehensive safeguards to protect children from online abuse.

Apple’s Response to the Lawsuit

In response to the lawsuit, Apple has reiterated its commitment to ensuring the safety and privacy of its users, particularly children. A spokesperson for the company issued a statement to CNBC, asserting that “protecting the safety and privacy of our users, especially children, is central to what we do.” Apple pointed to various safety features already in place, such as parental controls and its Communication Safety feature, which automatically detects nudity in messages, photos, AirDrop, and FaceTime calls.

Despite these efforts, the lawsuit argues that Apple’s response is insufficient, particularly when compared with the actions other tech companies have taken to combat CSAM. In 2021, Apple announced a CSAM detection system for iCloud Photos, but abandoned the initiative after backlash from privacy advocates. The company’s hesitation to implement a robust CSAM detection system, despite its significant control over its ecosystem, has drawn criticism from legal experts and child safety advocates alike.

Apple’s statement highlights its ongoing innovation to combat “ever-evolving threats” and maintain the “safest, most trusted platform for kids.” However, critics argue that Apple’s measures fall short of the comprehensive action required to effectively address the problem of CSAM distribution across its services. As the lawsuit progresses, all eyes will be on Apple’s response and whether the company is forced to take more substantial steps to protect children from online harm.

The Future of CSAM Detection and Child Safety

The outcome of this lawsuit could have profound implications for how the tech industry approaches child safety. A ruling against Apple could redefine what courts regard as an acceptable balance between user privacy and the protection of vulnerable children from online exploitation. As the digital landscape continues to evolve, pressure is growing on tech companies to adopt stronger safeguards against the spread of CSAM and other harmful content.

The legal action also shines a light on the increasing need for a regulatory framework that requires tech companies to take more responsibility in preventing online child abuse. The West Virginia lawsuit has already sparked conversations about what constitutes a reasonable level of action to combat CSAM, and what steps companies should be required to take to detect and report it. Advocates hope that the case will set a precedent for future litigation and policy decisions that prioritize child safety without compromising privacy protections.

For survivors of sexual abuse, understanding their rights and the resources available to them is critical. Helping Survivors remains committed to supporting individuals affected by these issues, providing legal guidance, connecting them with resources, and empowering them on their path toward healing and justice.
