Apple is piercing the privacy veil on our devices to protect children. The company claims its efforts — detecting child sexual abuse material, and flagging sexual images sent or received by children under 18 when a parent opts in to oversight — won’t open a Pandora’s box. But it’s a big change from its previous absolutist stance in favor of user privacy.
In an interview with Joanna Stern of the Wall Street Journal, Apple software chief Craig Federighi said the company would be applying “multiple levels of auditability” to its controversial child sexual abuse material (CSAM) detection system. What are they? Another new Apple document explains.
In a brief statement to media organizations, Apple announced that it is delaying the launch of its CSAM detection technology to collect feedback and make improvements. Why are we not surprised?
In a letter responding to a child safety group, Apple has outlined its reasons for dropping its proposed scanning for child sexual abuse material in iCloud Photos. Instead, the company is focusing on its Communication Safety technology, which detects nudity in transferred images and videos.