In 2021, Apple announced a controversial iCloud feature designed to scan iMessage and iCloud Photos for Child Sexual Abuse Material (CSAM), only to walk it back soon after.
Privacy Issues Arise
The system would have allowed Apple to scan images on children's devices, but the company faced significant backlash from privacy advocates and security experts and ultimately abandoned the project. At the time, Apple said it would "take additional time over the coming months to collect input and make improvements before releasing these critically important child safety features."
Lawsuit Emerges
Since then, Apple has said little about any CSAM detection efforts. Now, a lawsuit filed by a victim in the US District Court for Northern California alleges that Apple's failure to implement safety measures has allowed her images to continue circulating online. According to The New York Times, which first reported the suit, the 27-year-old plaintiff says she and her mother receive near-constant notifications about individuals being charged with possession of the material. The lawsuit seeks financial compensation for 2,680 victims whose images have been shared without their consent.
Apple's Response
Apple spokesperson Fred Sainz told Engadget that CSAM "is abhorrent, and we are committed to fighting the ways predators put children at risk." He added that the company is "urgently and actively" pursuing ways "to combat these crimes without compromising the security and privacy of all our users."