Child protection advocates are praising Apple's plan to roll out new anti-child-pornography features later this year, but privacy advocates worry about what else the technology might lead to. The company says the software, called neuralMatch, will scan images on a user's iPhone before they are uploaded to the iCloud storage service and check them against a database of known child sexual abuse images, the Wall Street Journal reports. If a match is found, the user's account will be disabled and the user reported to the National Center for Missing and Exploited Children, which works with law enforcement agencies.
- Context. The Washington Post notes that this kind of image matching is something companies like Facebook already do. "But in those systems, photos are scanned only after they are uploaded to servers owned by companies like Facebook." By looking at what's on a user's device, Apple is treading into new "client-side" surveillance territory.
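To make the "client-side" distinction concrete, here is a minimal sketch in Swift of what on-device matching against a database of known images could look like. This is not Apple's actual neuralMatch or NeuralHash code; the `perceptualHash` function, the hash format, and the `loadKnownHashes` source are placeholders invented for illustration.

```swift
import Foundation

// Illustrative only: NOT Apple's neuralMatch/NeuralHash implementation.

/// Placeholder for a perceptual hash. Real systems use hashes designed to
/// survive resizing and re-encoding; this stand-in just digests the raw bytes
/// via Swift's hashValue, purely to show the shape of the check.
func perceptualHash(of imageData: Data) -> String {
    return String(imageData.hashValue, radix: 16)
}

/// Hypothetical loader for an on-device database of hashes of known abuse images.
/// Empty here; in a real system the database would ship with the operating system.
func loadKnownHashes() -> Set<String> {
    return []
}

let knownHashes = loadKnownHashes()

/// The client-side check: run before an image leaves the device for cloud storage.
func imageMatchesKnownDatabase(_ imageData: Data) -> Bool {
    return knownHashes.contains(perceptualHash(of: imageData))
}
```

The privacy debate turns on where a check like this runs: on company servers after upload, as with Facebook's scanning, or on the user's own device before upload, as Apple's plan describes.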