
Apple employees raise concerns over new CSAM tool


A group of Apple employees has expressed concern over the firm’s new child safety features.

Apple announced plans last week to scan users’ iPhones for child sexual abuse material (CSAM) with iOS 15, but some insiders have now suggested the move could tarnish the company’s reputation and lead to pushback on privacy features.

The new iOS 15 feature will automatically scan images uploaded to iCloud Photos for child sexual abuse material, as well as protect children from sensitive images and content sent inside the iMessage app. Siri and Search have also been updated to deal with unsafe situations, ultimately helping children avoid online harm.

Since the CSAM measures were announced, dozens of Apple employees have posted almost 1,000 messages to a Slack channel on the subject. Staffers are reportedly concerned that the measures could open the door to government exploitation, a concern Apple addressed in support documents when it announced the feature.

It’s important to note that the pushback does not appear to be coming from employees on Apple’s security and privacy teams. According to Reuters, some of those staffers have defended Apple’s decision, arguing that the new system is a reasonable response and could help protect lives.

Although few deny the benefits Apple’s new service could bring to the wider world, many are concerned about the potential for mission creep and violations of user privacy.

Where do you stand on this matter? Let us know and check back soon for the latest news.
