The EU chat control law represents a significant shift in how digital communications, including encrypted ones, are monitored and regulated. Aimed at detecting and combating child sexual abuse material (CSAM), the law encompasses several key components that collectively aim to enhance online safety while raising substantial privacy concerns.
A quick recap of the EU chat control law

Introduced in 2022, the law seeks to implement an “upload moderation” system that scans all digital messages, including images, videos, and links. Key aspects of this system include:

- Services must use “vetted” monitoring technology to scan messages.
- Services must secure user consent before scanning.
- Users who refuse consent are barred from sharing images or URLs.
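The consent rule described above amounts to a simple gate: declining the scan does not block messaging outright, but it does block richer content types. A minimal sketch of that logic (the function name and types are hypothetical, purely for illustration):

```python
# Hypothetical sketch of the consent gate: users who decline scanning
# may still send plain text, but not images or URLs.
def may_send(content_type: str, user_consented: bool) -> bool:
    if content_type in ("image", "url"):
        return user_consented  # scanning consent required for these
    return True  # plain text is unaffected by the rule

# Declining consent only restricts images and URLs:
assert may_send("text", user_consented=False)
assert not may_send("image", user_consented=False)
assert may_send("url", user_consented=True)
```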
One of the chat control law’s most contentious elements is its approach to end-to-end encryption. The law appears to both support and undermine encryption.
The EU chat control law aims to enhance online safety by mandating the scanning of digital messages

The legislation acknowledges that end-to-end encryption is vital for protecting fundamental rights and ensuring privacy in digital communications. Despite this acknowledgment, the law posits that encrypted services could inadvertently facilitate the sharing of CSAM, thus necessitating scanning. The proposed solution? Scan messages before they are encrypted. This means that services like Signal, WhatsApp, and Messenger would need to scan message content before applying encryption. Here is the statement from Signal’s CEO about the law:
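To make the “scan before encrypting” idea concrete, here is a minimal sketch of client-side scanning. It is an assumption-laden illustration, not how any real messenger works: real proposals involve perceptual hashes (e.g. PhotoDNA-style matching) rather than exact SHA-256 matches, and real apps use proper end-to-end encryption rather than the toy XOR cipher used here as a stand-in:

```python
import hashlib

# Hypothetical blocklist of known-content fingerprints. Real systems
# would use perceptual hashes of known CSAM, not plain SHA-256.
BLOCKED_HASHES = {hashlib.sha256(b"known-illegal-sample").hexdigest()}

def toy_encrypt(data: bytes, key: bytes) -> bytes:
    """Stand-in for real end-to-end encryption (illustration only)."""
    return bytes(b ^ key[i % len(key)] for i, b in enumerate(data))

def send_message(content: bytes, key: bytes):
    """Scan the plaintext *before* encryption, as the proposal requires."""
    fingerprint = hashlib.sha256(content).hexdigest()
    if fingerprint in BLOCKED_HASHES:
        return None  # flagged: the message is never encrypted or sent
    return toy_encrypt(content, key)  # only unflagged content is encrypted

assert send_message(b"hello", b"secret-key") is not None
assert send_message(b"known-illegal-sample", b"secret-key") is None
```

The crux of the privacy objection is visible in the sketch: the scanning step necessarily sees the plaintext, so the content is inspected before any encryption guarantee applies.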