Apple has confirmed plans to delay its controversial CSAM feature.
The company said earlier in the summer that it would scan users’ iCloud photos for illegal child exploitation images, but backlash from privacy activists has led to the firm changing its mind – at least for now.
Commenting on the news, a spokesperson for the company said: “Previously we announced plans for features intended to help protect children from predators who use communication tools to recruit and exploit them and to help limit the spread of Child Sexual Abuse Material.
“Based on feedback from customers, advocacy groups, researchers and others, we have decided to take additional time over the coming months to collect input and make improvements before releasing these critically important child safety features.”
Apple announced the initiative last month. As well as scanning users’ iCloud photos, the company planned to introduce a nudity-detection algorithm in iMessage to reduce the chances of children seeing inappropriate content on their iPhones and iPads.
The new features were due to launch with iOS 15, but the rollout has now been paused.
Are you pleased to hear that Apple is going back to the drawing board on this one? Let us know your thoughts and check back to our website soon for the very latest information.