Apple’s decision to have iPhones and other Apple devices scan photos for child sexual abuse material (CSAM) has sparked criticism from security experts and privacy advocates—and from some Apple employees. But Apple believes its new system is an advancement in privacy that will “enabl[e] a more private world,” according to Craig Federighi, the company’s senior VP of software engineering.
Federighi defended the new system in an interview with The Wall Street Journal, saying that Apple aims to detect child sexual abuse photos in a way that protects user privacy more than other, more invasive scanning systems. The Journal wrote today:
While Apple’s new efforts have drawn praise from some, the company has also received criticism. An executive at Facebook Inc.’s WhatsApp messaging service and others, including Edward Snowden, have called Apple’s approach bad for personal privacy. The overarching concern is whether Apple can use software that identifies illegal material without the system being taken advantage of by others, such as governments, pushing for more private information—a suggestion Apple strongly denies and Mr. Federighi said will be protected against by “multiple levels of auditability.”
“We, who consider ourselves absolutely leading on privacy, see what we are doing here as an advancement of the state of the art in privacy, as enabling a more private world,” Mr. Federighi said.
In a video of the interview, Federighi said, “[W]hat we’re doing is we’re finding illegal images of child pornography stored in iCloud. If you look at any other cloud service, they currently are checking photos by looking at every single photo in the cloud and analyzing it. We wanted to be able to spot such photos in the cloud without looking at people’s photos and came up with an architecture to do this.” The Apple system is “much more private than anything that’s been done in this area before,” he said.
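To make the distinction Federighi draws more concrete, here is a minimal, purely illustrative sketch of on-device matching against a list of known image hashes, with a reporting threshold. It is not Apple’s implementation: the real system uses NeuralHash, blinded hash databases, private set intersection, and threshold secret sharing, none of which are reproduced here. All names, the SHA-256 stand-in for a perceptual hash, and the threshold value are assumptions for illustration only.

```swift
import Foundation
import CryptoKit

// Illustrative sketch only. Shows the general idea Federighi describes:
// the device compares hashes of its own photos against a database of
// known-CSAM hashes, and nothing is surfaced for review until a match
// threshold is crossed, so no service-side scan of readable photo
// content is needed.
struct OnDeviceMatcher {
    // Hashes of known illegal images distributed to the device.
    // (Assumption: in Apple's real design these are blinded so the
    // device cannot read or enumerate them.)
    let knownHashes: Set<Data>

    // Number of matches required before anything is reported (assumed value).
    let reportingThreshold: Int

    // Stand-in for a perceptual hash such as NeuralHash; SHA-256 is used
    // here only to keep the example self-contained and runnable.
    func photoHash(_ photoData: Data) -> Data {
        Data(SHA256.hash(data: photoData))
    }

    // Returns true only if the number of matching photos meets the threshold.
    func shouldFlag(photos: [Data]) -> Bool {
        let matchCount = photos.filter { knownHashes.contains(photoHash($0)) }.count
        return matchCount >= reportingThreshold
    }
}

// Example usage with placeholder data.
let matcher = OnDeviceMatcher(knownHashes: [], reportingThreshold: 30)
let flagged = matcher.shouldFlag(photos: [Data("example photo bytes".utf8)])
print("Flag for human review: \(flagged)")
```

The point of the threshold in this sketch mirrors Apple’s stated design goal: no single match reveals anything, and nothing about non-matching photos leaves the device in readable form.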