Apple has confirmed plans to delay its controversial CSAM detection feature.
The company said earlier in the summer that it would scan users’ iCloud photos for illegal child exploitation images, but backlash from privacy activists has led the firm to change its mind – at least for now.
Speaking about the news, a spokesperson for the company said: “Previously we announced plans for features intended to help protect children from predators who use communication tools to recruit and exploit them and to help limit the spread of Child Sexual Abuse Material.
“Based on feedback from customers, advocacy groups, researchers and others, we have decided to take additional time over the coming months to collect input and make improvements before releasing these critically important child safety features.”
Apple announced the initiative last month. As well as scanning users’ iCloud photos, the company planned to introduce a nudity detection algorithm to iMessage to reduce the chances of children seeing inappropriate content on their iPhones and iPads.
Apple planned to launch the new features in iOS 15, but the rollout has now been paused.
Of course, few would argue that Apple’s plans to protect children and reduce the distribution of illegal child sexual abuse material are a bad thing – but many critics were concerned about the way the company planned to implement the features. Apple has built a name for itself as a privacy-focused firm, often firing shots at Facebook and Google for their less-than-stellar approaches to online privacy. Announcing such a significant change to its privacy policy via a press release was perhaps not the best move, and it left many questioning exactly how private and secure their iCloud really was.
Are you pleased to hear that Apple is going back to the drawing board on this one? Let us know your thoughts and check back to our website soon for the very latest information.