Apple is facing a lawsuit over its decision not to deploy a system that would scan iCloud photos for child sexual abuse material (CSAM). The suit alleges that by not implementing stronger measures to detect and limit the distribution of such material, the company is perpetuating harm by forcing victims to relive their trauma.
The lawsuit claims Apple initially announced “a widely touted improved design aimed at protecting children” but ultimately failed to “implement those designs or take any measures” to address CSAM effectively.
In 2021, Apple proposed a system that would use digital signatures provided by the National Center for Missing and Exploited Children and similar organizations to identify known CSAM content in iCloud photo libraries.
However, the company appeared to abandon the plan after facing backlash from security and privacy advocates, who raised concerns that such a system could create a backdoor for government surveillance.
The legal action reportedly originates from a 27-year-old woman filing under a pseudonym. She alleges she was molested as an infant by a relative who shared images of the abuse online. To this day, she claims to receive near-daily notifications from law enforcement about individuals being charged with possessing those images.
James Marsh, an attorney involved in the case, indicated that as many as 2,680 potential victims might be eligible for compensation if the lawsuit succeeds.
When asked for comment, an Apple spokesperson said the company is “urgently and actively innovating to combat these crimes without compromising the security and privacy of all our users.”
This lawsuit follows a similar legal action filed in August by a 9-year-old girl and her guardian, who accused Apple of failing to address the presence of CSAM on iCloud.