Thoughtful, detailed coverage of everything Apple for 34 years
and the TidBITS Content Network for Apple professionals

Series: CSAM Detection

Glenn Fleishman Rich Mogull 114 comments

FAQ about Apple’s Expanded Protections for Children

Apple is piercing the privacy veil on our devices to protect children. The company claims that its efforts to avert the sexual exploitation of children, and to flag sexual material sent or received by children under 18 when a parent wants oversight, won't open a Pandora's box. But it's a big change from Apple's previous absolutist stance in favor of user privacy.

Adam Engst 74 comments

New CSAM Detection Details Emerge Following Craig Federighi Interview

In an interview with Joanna Stern of the Wall Street Journal, Apple software chief Craig Federighi said the company would be applying “multiple levels of auditability” to its controversial child sexual abuse material (CSAM) detection system. What are they? Another new Apple document explains.

Adam Engst 8 comments

Apple Delays CSAM Detection Launch

In a brief statement to media organizations, Apple announced that it is delaying the launch of its CSAM detection technology to collect input and make improvements. Why are we not surprised?

Adam Engst 4 comments

Apple Explains Pullback from CSAM Photo-Scanning

In a letter responding to a child safety group, Apple has outlined its reasons for dropping its proposed scanning for child sexual abuse material in iCloud Photos. Instead, the company is focusing on its Communication Safety technology, which detects nudity in transferred images and videos.