Apple hit with $1.2B lawsuit after killing controversial CSAM-detecting tool

Ars Technica 2024-12-09

Thousands of victims have sued Apple over its alleged failure to detect and report illegal child pornography, also known as child sexual abuse material (CSAM).

The proposed class action comes after Apple scrapped a controversial CSAM-scanning tool last fall that was supposed to significantly reduce the spread of CSAM across its products. Apple defended the decision to kill the tool after dozens of digital rights groups warned that governments could abuse the functionality to unlawfully surveil Apple users for other purposes. Apple was also concerned that bad actors could exploit the functionality against its users, and it sought to protect innocent users from false content flags.

The child sex abuse survivors suing have accused Apple of using that cybersecurity defense to ignore the tech giant's mandatory CSAM reporting duties. If a jury finds in their favor, Apple could face more than $1.2 billion in penalties. And perhaps most notably for privacy advocates, Apple could also be forced to "identify, remove, and report CSAM on iCloud and implement policies, practices, and procedures to prevent continued dissemination of CSAM or child sex trafficking on Apple devices and services." That could mean a court order to implement the controversial tool, or an alternative that meets industry standards for mass-detecting CSAM.