
Amid backlash, Apple will change photo-scanning plan but won’t drop it completely

[Image: Close-up of a finger scrolling on a smartphone screen in a dark environment. Credit: Getty Images | Oscar Wong]

Apple said Friday that it will make some changes to its plan to have iPhones and other devices scan users' photos for child-sexual-abuse images. But the company said it still intends to implement the system after making “improvements” to address the criticism.

Apple provided this statement to Ars and other news organizations today:

Last month we announced plans for features intended to help protect children from predators who use communication tools to recruit and exploit them, and limit the spread of Child Sexual Abuse Material [CSAM]. Based on feedback from customers, advocacy groups, researchers and others, we have decided to take additional time over the coming months to collect input and make improvements before releasing these critically important child safety features.

The statement is vague: it doesn’t say what changes Apple will make or which advocacy groups and researchers it will collect input from. But given the backlash Apple has received from security researchers, privacy advocates, and privacy-conscious customers, it seems likely that Apple will try to address concerns about user privacy and about the possibility that the system could give governments broader access to customers’ photos.
