World

Child Safety, Chat Control and the Danger of Mass Surveillance

Oisín McAnenna
October 13, 2025
5 min

Image - Andrew Heald

On 11 May 2022, the European Commissioner for Home Affairs, Ylva Johansson, proposed the Regulation to Prevent and Combat Child Sexual Abuse (the Child Sexual Abuse Regulation, or CSAR). The aim of the proposal is to prevent and combat the sexual abuse of children online through a variety of measures, most notably the mandatory detection and reporting of CSAM (Child Sexual Abuse Material) by digital platforms.

This legislation has been colloquially referred to as Chat Control 2.0, as it follows on from a previous piece of legislation that allowed for voluntary monitoring by communications companies such as email service providers (Chat Control 1.0). The results of this monitoring have been mixed at best, as the AI systems used to scan the gathered data are unreliable and have produced many false positives.

Per a 2023 report, “most Member States reported receiving most or all of their reports from NCMEC”. The National Center for Missing and Exploited Children is an American non-profit organisation to which US providers report CSAM found on their platforms. Given that most social media companies are US-based, this is where most of the data comes from. A report from the Irish Council for Civil Liberties found that just 852 of the 4,192 reports (20.3%) received by the Irish police from NCMEC turned out to involve actual exploitation material. There is no reason to believe that the EU's systems would have a lower rate of false positives, and since the US legislation does not apply to encrypted services (services through which people often consensually share sexual material), we could see even more.
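The arithmetic behind that 20.3% figure, and why mass scanning makes it worse, is worth spelling out. Below is a minimal sketch in Python: the first part uses the ICCL's reported numbers; the second part is a purely hypothetical base-rate illustration (the prevalence and error rates are assumptions for illustration, not figures from any report).

```python
# Part 1: precision of NCMEC reports received by Irish police, per the ICCL report.
actual_material = 852
total_reports = 4192
precision = actual_material / total_reports
print(f"Share of reports involving actual material: {precision:.1%}")  # → 20.3%

# Part 2: hypothetical base-rate effect. Even a scanner with assumed-generous
# accuracy, applied to billions of innocent messages, yields mostly false alarms.
# All four numbers below are illustrative assumptions, not sourced figures.
messages_scanned = 1_000_000_000
prevalence = 1e-6            # assumed fraction of messages that are abusive
true_positive_rate = 0.99    # assumed detection rate on abusive messages
false_positive_rate = 0.001  # assumed false-alarm rate on innocent messages

true_hits = messages_scanned * prevalence * true_positive_rate
false_alarms = messages_scanned * (1 - prevalence) * false_positive_rate
share_real = true_hits / (true_hits + false_alarms)
print(f"Share of flagged messages that are actually abusive: {share_real:.2%}")
```

Under these assumed numbers, barely one in a thousand flagged messages is genuine: when the thing being searched for is rare, even small error rates drown investigators in false reports.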

The policy proposes the creation of an EU Centre (hosted by Europol): a central database through which all reports would flow before being passed on to the relevant national authorities. At the time of writing there has been no indication of how long data will be held by the EU Centre, beyond that it will comply with GDPR standards. The retention standard in question is “no longer than is necessary for the purposes for which the personal data are processed”. Exceptions are made for purposes of public interest, leaving the storage policies of this proposed EU Centre vague to say the least. Indeed, to return to the case of the Irish police above, it was found that the authorities had been holding on to all data regardless of whether the individual concerned had been cleared. That report was published more than four years after GDPR came into effect.

Aside from false positives, the danger of this new mandatory detection policy is ‘mission creep’: the gradual expansion of an organisation’s original goals or activities beyond their initial scope, often without formal approval, oversight, or a clear change in mandate.

We have seen this time and again where surveillance is concerned. Surveillance cameras, now ubiquitous in UK cities, saw their initial expansion driven by the response to the IRA's Bishopsgate bombing in 1993. Justified as necessary to combat terrorism, CCTV is being used in 2025 to prosecute graffiti artists for their role in the spray-painting of wheelie bins. Regardless of one’s personal opinion of that case, this is a far cry from the network's initial purpose.

In the US too we have seen how this creep occurs. The massive surveillance powers granted to law enforcement in the wake of 9/11 have been expanded again and again since their initial implementation. So-called ‘sneak-and-peek’ warrants, for example, were introduced under the Patriot Act of 2001, allowing investigators to conduct searches on individuals without their knowledge. From 2001 to 2003 these powers were used 47 times; in 2024 alone, 17,475 new warrants were granted. Less than 1% of these were issued for terrorism-related charges.

The extension of monitoring systems to end-to-end encrypted (E2EE) services is extremely concerning. Confidence in the security of these systems is vital. If services such as Signal can no longer be trusted, it will have severe, chilling effects on speech. As an open letter from hundreds of scientists and cryptographers noted, this risks “(undermining) the tools needed by whistleblowers, journalists and human rights activists”. Once providers build backdoors into their systems, those backdoors can be opened by any future government that may rise to power.

This bill also continues what has become a common trend across the Western world: ‘save the children’ has become the standard call to action for all manner of draconian measures in recent years. We have just seen this in the UK with the implementation of the Online Safety Act, which critics have likewise argued is a dangerous attack on free speech, requiring users of certain websites to undergo age verification through facial or ID scans. This has resulted in Wikipedia threatening to pull out of the UK, as the rules forced on it could put some of its contributors at risk. Given how we have seen the Terrorism Act 2000 wielded against Palestine Action, it is easy to imagine the data gathered being used in ‘exceptional circumstances’ for purposes for which it was not initially endorsed.

Similar rhetoric is being used to justify laws targeting immigrants across the continent. In the UK, frenzied rants about hordes of immigrant grooming gangs have become commonplace, a regular talking point of Reform and Tory leaders and supporters. It should be noted that there is no evidence that immigrants are more likely to commit this type of crime than anyone else. In the EU we can see the threat of sexual violence being wielded in a similar manner by the likes of Viktor Orbán, Giorgia Meloni and Marine Le Pen.

The threat of sexual violence has become the axis upon which authoritarian politics turns, replacing to some extent the spectre of terrorism that justified similar measures in the years following 9/11. Of course, protecting children is an admirable goal that no reasonable person would oppose. However, this legislation would grant authorities a surveillance scope that could permanently undermine the notion of private communication. And the price we are being asked to pay for the end of the free internet is a CSAM detection system that could charitably be described as unreliable. We should think very carefully before we construct this panopticon, because once it is erected it will be very difficult for anyone to control who it chooses to observe.