Apple announced Friday that it will delay the rollout of its controversial new tools for detecting child sexual abuse material, which some have accused of undermining the privacy of its devices and services.
The Silicon Valley giant said last month that iPhones and iPads would soon start detecting images containing child sexual abuse and reporting them as they are uploaded to its online storage in the United States.
However, digital rights organizations quickly noted that the tweaks to Apple’s operating systems create a potential “backdoor” into gadgets that could be exploited by governments or other groups.
Apple, in announcing the delay, cited feedback from customers, advocacy groups, researchers and others.
“We have decided to take additional time over the coming months to collect input and make improvements before releasing these critically important child safety features,” the company said in a statement.
The new technology allows the software powering Apple mobile devices to match photos on a user’s phone against a database of known child sexual abuse images provided by safety organizations, then flag the images as they are uploaded to Apple’s online iCloud storage, according to the company.
The system, if and when it goes into use, would be “powered by a cryptographic technology” to determine “if there is a match without revealing the result,” unless the image is found to contain depictions of child sexual abuse.
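At its simplest, this kind of matching amounts to comparing an image’s fingerprint against a list of known fingerprints and flagging only on a hit. The minimal sketch below, in Python with placeholder data, uses exact SHA-256 hashes for illustration — Apple’s actual system reportedly uses a perceptual hash (so near-duplicates also match) plus the cryptographic matching described above, neither of which is shown here:

```python
import hashlib

def image_hash(data: bytes) -> str:
    # Exact cryptographic hash for illustration only; a real system
    # would use a perceptual hash so resized or re-encoded copies
    # of the same image still match.
    return hashlib.sha256(data).hexdigest()

# Hypothetical database of fingerprints of known images,
# as would be supplied by child safety organizations.
known_hashes = {
    image_hash(b"known-image-1"),
    image_hash(b"known-image-2"),
}

def flag_on_upload(photo: bytes) -> bool:
    # Flag the photo only if its fingerprint appears in the database;
    # any other photo passes through unflagged.
    return image_hash(photo) in known_hashes

print(flag_on_upload(b"known-image-1"))  # True: in the database
print(flag_on_upload(b"holiday-photo"))  # False: no match
```

Note that this naive version reveals the result of every comparison; the “cryptographic technology” Apple describes is designed precisely so that no one learns anything unless a match occurs.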