Apple Delays Rollout of CSAM Photo Scanning System Due to Privacy Concerns

Image credit: MichaelJayBerlin / Shutterstock.com

As part of its efforts to improve child safety, Apple announced plans last month to scan iCloud Photos for known Child Sexual Abuse Material (CSAM). Following backlash from security experts and digital rights groups such as the Electronic Frontier Foundation, Apple has now delayed the rollout of CSAM detection.

Apple Delays Rollout of CSAM Detection

Apple had initially planned to roll out CSAM detection later this year for accounts set up as families in iCloud on iOS 15, iPadOS 15, and macOS Monterey. The Cupertino giant has not yet revealed a new release date for the feature, nor has it detailed which aspects of CSAM detection it plans to improve or how it intends to strike a better balance between privacy and safety.

“Previously we announced plans for features intended to help protect children from predators who use communication tools to recruit and exploit them and to help limit the spread of Child Sexual Abuse Material. Based on feedback from customers, advocacy groups, researchers, and others, we have decided to take additional time over the coming months to collect input and make improvements before releasing these critically important child safety features,” said Apple in an official statement.

To recap, Apple’s CSAM detection works on-device and doesn’t scan images in the cloud. It attempts to match images against known CSAM image hashes provided by NCMEC and other child safety organizations, with the matching process taking place right before an image is uploaded to iCloud Photos. However, researchers have since discovered hash collisions that could cause harmless images to be flagged as false positives. It was also revealed that Apple has been scanning iCloud Mail for CSAM since 2019.
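To make the general idea concrete, here is a minimal, hypothetical sketch of on-device hash matching. It is not Apple's implementation: the real system uses a perceptual NeuralHash and a private set intersection protocol with a match threshold, whereas this sketch substitutes a plain SHA-256 digest and an in-memory set of known hashes purely to illustrate the flow of hashing locally and comparing before upload. The `CSAMHashMatcher` type and its methods are illustrative names, not Apple APIs.

```swift
import Foundation
import CryptoKit

// Hypothetical, greatly simplified illustration of on-device hash matching.
// Apple's actual design uses NeuralHash (a perceptual hash) and private set
// intersection, and only reveals matches after a threshold is crossed.
struct CSAMHashMatcher {
    // In the real system, known hashes come from NCMEC and other child
    // safety organizations and are shipped to the device in blinded form.
    let knownHashes: Set<String>

    // Compute a digest of the image bytes entirely on-device.
    func hash(imageData: Data) -> String {
        SHA256.hash(data: imageData)
            .map { String(format: "%02x", $0) }
            .joined()
    }

    // Returns true if the image's hash matches a known entry.
    func matches(imageData: Data) -> Bool {
        knownHashes.contains(hash(imageData: imageData))
    }
}

// Example usage with placeholder data.
let matcher = CSAMHashMatcher(knownHashes: ["<known hash digest>"])
let photo = Data([0x01, 0x02, 0x03]) // stand-in for real image bytes
print(matcher.matches(imageData: photo)) // false for this placeholder
```

The hash-collision concern raised by researchers maps onto this sketch directly: because matching is done on hashes rather than the images themselves, two different images that happen to produce the same hash would both be treated as matches, which is why false positives and Apple's threshold-and-review safeguards became a focus of the criticism.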

VIA MacRumors