The European Union is working on a new regulation that aims to prevent and combat online child sexual abuse, setting new rules for digital platforms and online service providers. The proposal is called “Regulation of the European Parliament and of the Council laying down rules to prevent and combat child sexual abuse” (the CSAM Regulation), and is commonly known as “Chat Control.”
At the heart of the proposal is the delicate balance between the need to protect minors from serious abuse such as paedophilia and child pornography, and the protection of users’ fundamental rights, in particular privacy.
One of the most debated aspects concerns whether platforms should be required to carry out preventive and constant scanning of content, even in services that use end-to-end encryption, in order to identify child pornography or grooming activities.
Risk assessment, an obligation for providers
According to the EU Proposal, every online service provider, such as social media platforms, messaging services or app stores, is required to carry out a risk assessment for each service offered, in order to understand “the risk of its use for the purpose of child sexual abuse online” (Art. 3, para. 1).
This assessment requires analysing “previously identified cases of misuse”, the platform features that may facilitate such misuse, the characteristics of users, especially minors, and the impact of the business model (Art. 3, para. 2). The aim is to enable the provider to identify and mitigate these risks through proportionate and appropriate measures (Art. 4).
Scanning is not automatic, but a mitigation tool
The Proposal clarifies that scanning content is not a universal, automatic obligation imposed in all cases. Rather, scanning is one of the tools the provider can adopt to mitigate the risk identified in the assessment (Art. 4). In other words, data scanning is a choice that falls within the scope of reasonable and proportionate mitigation measures: mitigation is required, but scanning itself is not mandatory a priori.
As stated in the text: “Providers are free to define and implement, in accordance with Union law, measures based on practices adopted to detect cases of online child sexual abuse in their services” (Art. 4, para. 2), taking into account their technical and financial capabilities.
When scanning becomes mandatory
However, the Proposal provides that, if even after the assessment and measures taken by the provider there remains a significant risk that the service could be used for child abuse, the designated national authorities may intervene.
In such cases, the authorities “may request the independent judicial or administrative authority to issue an order requiring the service provider to carry out mandatory detection of cases of abuse” (Art. 7, para. 1).
The order is only issued after an “objective, diligent and case-by-case assessment” that balances the likelihood and severity of the abuse with the possible negative consequences for users’ fundamental rights (Art. 7, para. 4).
The text also emphasises that “the detection order may only be issued if the reasons for issuing it outweigh the negative consequences for the rights and legitimate interests of all parties concerned” (Art. 7, para. 4).
Guarantees and privacy protection
Given the sensitivity of automatic scanning, especially in private and encrypted communications, the proposal provides for a series of safeguards, such as limiting the duration of orders, human oversight of detection technologies, minimising the data processed, and access to redress mechanisms for users and providers (Art. 10).
The EU Centre for the Prevention and Combating of Child Sexual Abuse, another new feature of the proposal, will play a supporting role for authorities and platforms by providing databases of reliable indicators and appropriate detection technologies, helping to reduce false positives and invasive impacts (Art. 11).
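To make the false-positive point concrete, here is a minimal sketch of threshold-based indicator matching, the general shape of the hash-matching techniques detection technologies tend to rely on. Everything in it is an assumption for illustration: the proposal does not specify hash formats or thresholds, and real systems use vetted perceptual-hashing tools rather than this toy.

```python
def hamming_distance(a: int, b: int) -> int:
    """Count differing bits between two 64-bit perceptual-style hashes."""
    return bin(a ^ b).count("1")

def matches_indicator(content_hash: int, indicators: set[int],
                      threshold: int = 4) -> bool:
    """True if the content hash is 'close enough' to a known indicator.

    Perceptual hashes tolerate small edits (recompression, resizing),
    so matching uses a distance threshold rather than strict equality.
    A looser threshold catches more variants of known material but also
    raises the false-positive rate; curated, reliable indicator
    databases (the EU Centre's stated role) are what keep it down.
    """
    return any(hamming_distance(content_hash, h) <= threshold
               for h in indicators)
```

Under these assumptions, a threshold of 0 reduces the check to exact matching, while higher thresholds trade recall against false positives: hence the emphasis on reliable indicators.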
The legislative process, developments following the controversy
Stage 1. The European Commission
Stage 1 was the Commission’s proposal, sent in 2022. Although it does not explicitly state ‘you will have a scanner or algorithm on your phone’, it introduces ‘detection orders’ that must be complied with regardless of the technology used, even if the service is end-to-end encrypted; the choice of ‘how’ is left to the provider. And precisely for E2EE services, the only practical way to comply would be client-side scanning, i.e. scanning content on the device before encryption.
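To illustrate what client-side scanning means in practice, here is a minimal sketch of a send path that inspects plaintext before encrypting it. All names here are hypothetical, and the proposal prescribes neither this flow nor any specific matching technique; a plain SHA-256 lookup stands in for real detection technology.

```python
import hashlib

# Hypothetical set of indicators (hashes of known abuse material);
# in the proposal's architecture such indicators would come from a
# central, vetted database, not be hard-coded in the client.
KNOWN_INDICATORS: set[str] = set()

def flagged(plaintext: bytes) -> bool:
    """Toy check: exact SHA-256 match against known indicators.

    Real client-side scanning would use perceptual hashing and/or
    classifiers; exact hashing keeps this sketch self-contained.
    """
    return hashlib.sha256(plaintext).hexdigest() in KNOWN_INDICATORS

def send_message(plaintext: bytes, encrypt, transmit, report) -> None:
    # The key architectural point: the check runs on the device, on
    # the plaintext, BEFORE end-to-end encryption. The E2EE channel
    # itself stays mathematically intact, but the content is no
    # longer seen only by the communicating endpoints.
    if flagged(plaintext):
        report(plaintext)  # hypothetical reporting hook
        return
    transmit(encrypt(plaintext))  # normal E2EE path
```

This is why critics argue that scanning before encryption undermines the point of E2EE even though the encryption itself is untouched.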
Stage 2. The European Parliament
In 2023, the European Parliament’s position (LIBE mandate 2023, EP summary) excluded both end-to-end encrypted communications and text messages from the scope of the orders, and limited the orders to cases of reasonable suspicion, thus ruling out indiscriminate scanning. Only if providers failed to comply with the measures proposed by the new Regulation – the principle of safety by design (developing products or services in such a way as to avoid potential harm), mandatory parental controls, the establishment of user reporting mechanisms and the use of age verification systems in cases of risk of grooming of minors – could the judicial authority issue a detection order, and only as a last resort. Such an order would oblige the provider to use certain technologies to detect known and new child sexual abuse material.
Surveillance orders would only be used in cases where there is reasonable suspicion that individual users or groups are connected to child pornography. The orders would be limited in time; end-to-end encrypted communications and text messages, however, are excluded from their scope of application. This approach aims to ensure the protection of the privacy and security of users of digital services.
Stage 3. The Council
During the legislative process, the Council draft was presented on 1 July 2025, after a dozen rounds of consultation and amendment: it clearly states that, for E2EE services, detection takes place ‘before the transmission of content’ (i.e. pre-encryption), with a ‘user consent’ clause.
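As a hedged sketch, a ‘detection before transmission, with user consent’ clause could sit in the same kind of send path shown earlier. The behaviour when consent is refused is an assumption made purely to keep the sketch coherent (one reading of such clauses is that content of the covered types simply cannot be sent unscanned); the draft’s actual mechanics are not described here.

```python
def send_with_consent(plaintext: bytes, user_consented: bool,
                      scan, encrypt, transmit) -> bool:
    """Return True if the message was transmitted.

    Per the Council draft's wording, detection runs 'before the
    transmission of content', i.e. on the plaintext, pre-encryption.
    """
    if not user_consented:
        # Assumption for illustration: without consent, content
        # subject to detection is not transmitted at all.
        return False
    if scan(plaintext):
        return False  # matched content is withheld (assumption)
    transmit(encrypt(plaintext))
    return True
```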
Stage 4. Trilogues between the Council and the EP
To become law, the fourth stage is still pending: the opening of trilogues (informal meetings between the EP and the Council, with the Commission acting as facilitator) to hammer out a compromise text acceptable to both parties. Indeed, the Council and the EP are co-legislators. The final text is created when the European Parliament and the Council vote and approve the exact same text under the ordinary legislative procedure. If there is no agreement, the law is not adopted.
Current situation
So it is not law, and there is not (yet) an obligation that applies to everyone. If the Council’s current line were to pass, for E2EE services (e.g. WhatsApp/Signal/Telegram) it would mean installing software that checks content before encryption; if the Parliament’s line were to prevail, end-to-end encrypted communications would be excluded from scanning altogether. Everything will become clearer on 14 October, when ministers may attempt to adopt the Council’s position (the ‘general approach’). This is where the ministers’ real ‘political vote’ would take place, if the dossier is ready.
Today, however, a technical meeting of the Council (LEWP – Law Enforcement Working Party) is taking place to examine the Presidency’s compromise text. This is not a ‘formal adoption’: it is a preparatory meeting where Member States indicate their positions and attempt to finalise a text to be presented to ministers.
Therefore, even after the Council adopts its position in October, trilogues with Parliament will still be needed to agree on a common text, followed by formal adoption.
In all this, it should always be remembered that Article 15 of the Italian Constitution protects the freedom and confidentiality of (private) correspondence and all other forms of communication, with restrictions possible only by reasoned order of the judicial authority and with the guarantees provided by law.