
Controversial EU proposals would require the scanning of private messages to guard against child abuse


The European Commission – the executive branch of the European Union (EU) – has set out a new regulation that would oblige chat apps such as Facebook Messenger and WhatsApp to undertake selective scanning of users’ encrypted messages in search of child sexual abuse material (CSAM) and signs of “grooming”.

While, at first glance, the move may seem reminiscent of Apple’s own plans – proposed last year, but later postponed – to introduce scanning for CSAM, critics have said that it would be a lot more invasive than the Cupertino firm’s now-withdrawn proposals.

A draft of the regulation was leaked earlier this week, drawing condemnation from privacy experts. Cryptography professor Matthew Green, for example, tweeted that the document was “the most terrifying thing I’ve ever seen… it describes the most sophisticated mass surveillance machinery ever deployed outside of China and the USSR. Not an exaggeration.”

However, in its press release outlining the proposed measures, the Commission said: “To effectively address the misuse of online services for the purposes of child sexual abuse, clear rules are needed, with robust conditions and safeguards.

“The proposed rules will oblige providers to detect, report and remove child sexual abuse material on their services. Providers will need to assess and mitigate the risk of misuse of their services and the measures taken must be proportionate to that risk and subject to robust conditions and safeguards.”

Further information on the new rules, and the role of what has been described as “a new independent EU Centre on Child Sexual Abuse”, can be found on the Commission’s website. As set out in the press release, “it is now for the European Parliament and the Council to agree on the proposal.”

If the new regulation does take effect, various obligations would be imposed on “online service providers” – a broad term effectively encompassing app stores, hosting firms, and any provider of an “interpersonal communications service”.

Communications platforms such as WhatsApp, Signal and Facebook Messenger would be subject to the most drastic aspect of the regulation: an obligation, in the event of receiving a “detection order” from the EU, to scan select users’ messages seeking out known CSAM, in addition to previously unseen CSAM and any messages that could be regarded as “grooming” or the “solicitation of children”.

By comparison, even Apple’s plans last year to scan photos for CSAM would have searched only for known instances of child sexual abuse material, thereby limiting the potential for error.

Unsurprisingly, privacy groups are up in arms about the extensive scope of the EU’s proposals, amid concerns that – in the words of European Digital Rights (EDRi) policy advisor Ella Jakubowska, as quoted by The Verge – “it completely leaves the door open for much more generalized surveillance.”

Needless to say, whatever your stance may be on the EU’s plans, this is a development that we should all be closely watching, given the implications it could have for similar policies elsewhere in the world.
