The 11-Day Countdown
Will the EU turn a blind eye to online child abuse?
In a quest to keep children safe online, platforms operating across or within the 27 EU Member States have for some time been able, voluntarily, to deploy technologies which perform one or more of three critical functions.
Note that word: voluntarily.
The first of these technologies emerged in 2009. Microsoft worked with Professor Hany Farid, then at Dartmouth College and now at Berkeley. Together they developed PhotoDNA and made it available free of any licensing costs.
PhotoDNA is able to identify copies of images of child sexual abuse material (CSAM) which have previously been examined and classified as such by recognised authorities. Using a database of what are, for all practical purposes, unbreakable hashes of these images, PhotoDNA works at scale. Digital files thus identified are removed from servers at lightning speed and relevant data are shared with law enforcement.
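To make the mechanism concrete: PhotoDNA itself is proprietary, and it uses a perceptual hash designed to survive resizing and re-encoding. But the core database-lookup idea can be sketched in a few lines. The sketch below is purely illustrative and uses an ordinary cryptographic hash (SHA-256) as a stand-in; the hash values and function names are invented for the example.

```python
import hashlib

# Hypothetical database of hashes of previously classified material.
# In the real system this would be a database of PhotoDNA perceptual
# hashes maintained by recognised authorities; SHA-256 here only
# illustrates the exact-match lookup idea.
KNOWN_HASHES = {
    hashlib.sha256(b"previously-classified-file").hexdigest(),
}

def is_known(file_bytes: bytes) -> bool:
    """Return True if the file's hash matches an entry in the database."""
    return hashlib.sha256(file_bytes).hexdigest() in KNOWN_HASHES

# A match would trigger removal and a report to the relevant authority;
# a miss passes through untouched -- nothing else about the file is read.
```

The point the sketch makes is the one in the text: the check compares a fingerprint against a list of known fingerprints. It does not interpret, store, or expose the content of anyone else's communications.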
In the world of online child protection it is impossible to overstate the importance and significance of the emergence of PhotoDNA. A huge proportion of CSAM circulating on, or being exchanged over, the internet was and remains copies of images which have been seen before. Google estimated around 90% of the CSAM reports it made were of material previously confirmed as CSAM.
As victims in groups such as the Phoenix 11 have repeatedly made clear, they want copies of their pain and humiliation found and gone in the shortest possible time. PhotoDNA can help deliver that. The problem is that PhotoDNA has not been deployed widely enough.
Think about that. You know or ought to know you can detect and delete CSAM by using PhotoDNA. Why would anyone choose not to?
Second, tools now exist which can identify materials which are not in a database of previously known materials but are nevertheless highly likely to be CSAM. Once located, these can be rapidly escalated for human review and dealt with accordingly.
Third, tools can detect grooming patterns—behavioural signals indicating a child is being targeted for sexual abuse. These allow for interventions that can bring the targeting to an end.
The initial glitch
In December 2020, a legal problem arose in the EU - and only in the EU - when the European Electronic Communications Code took effect, apparently bringing certain communications services within the scope of the EU’s e-Privacy regime (adopted in 2002 and amended in 2009).
It was suggested the Code created a risk that platforms which had been voluntarily detecting actual or likely CSAM and grooming signals could no longer lawfully continue doing so.
Did the main movers intend to outlaw the three categories of child protection measures? Was it all deliberate and foreseen?
Or was such an outcome accidental, unintended and unforeseen, simply a case of bureaucratic incompetence? Did different Directorates within the Commission fail to join the dots in time, with neither the Parliament nor the Council spotting and picking up on the implications?
I have no real way of knowing but, in my view, the latter is the more likely explanation. After all, the legal roots of the European Electronic Communications Code lay in a framework conceived in 2002, long before PhotoDNA or any of the modern child protection tools existed.
Either way, the result was a legal trap or uncertainty of the EU’s own making.
Tools which hitherto had been vigorously and extensively promoted by one part of the Commission as establishing a gold standard for online child protection were suddenly placed on questionable legal grounds by another.
In effect an undoubted general right to privacy of communication was being privileged over the privacy and other rights of specific child victims. And what is so infuriating about this is the false dichotomy on which it is premised. In no sense do the tools dilute or compromise anybody’s right to privacy in any material sense. Think anti-virus software and copyright protection tools as reasonable analogies.
After literally billions upon billions of uses over the years, there has not been a single reported case of any of the technologies I have referred to malfunctioning in a way that led to a wrongful arrest, much less to a conviction. For understandable and obvious reasons rigorous standards are applied by developers working in this space and by the platforms using the tools they produce.
Instead of confronting that reality directly and immediately, the EU institutions reached for a temporary derogation. This meant kicking the problem down the road, leaving the underlying conflict unresolved.
A draft Regulation was published which would resolve all the difficulties, and the hope was that the time provided by the derogation would allow all the necessary processes to be completed and an agreement reached. That hasn't worked.
Since the draft Regulation was published in 2022 we have had nine Presidencies: France (for only a short period), the Czech Republic, Sweden, Spain, Belgium, Hungary, Poland, Denmark and now Cyprus. None were able to get the Regulation across the line.
Meanwhile the derogation at least allowed the measures to continue for the time being, more or less as before, but it also gave those who had always opposed them time to marshal their forces.
Boy did they! Their scaremongering, deceitful tactics have been highly effective in clogging up the works and have brought us to the present point.
We are now on the edge of a precipice.
A political impasse
The EU’s law-making processes have hit a brick wall.
On 3rd April the measures may have to be dropped in all 27 EU countries.
While the Council of Ministers and the Commission have agreed to yet another extension of the derogation, pending the ever-hoped-for final agreement on the Regulation, this time the European Parliament has declined to join them. Without agreement between the three EU law-making institutions—the “Trilogue”—the legal foundation for the voluntary measures described will expire. In eleven days.
The previous wobble
We have, in a sense, been here before. In December 2020, when the alleged legal uncertainty first arose, Facebook (Meta) and probably some others suspended using the tools in the EU. Not elsewhere. Only in the EU. That interregnum lasted seven months.
During those seven months, for example, the number of reports of known CSAM from EU Member States fell by 58%. We will never know how much pain and suffering this hiatus caused, or to how many children.
If the entire sector is now forced to stop, the resulting overflow of undetected abuse will not remain within EU borders. It won’t only be children living in Europe who suffer, though they will. The impact will be global.
The UK’s legislative framework means nothing need change here but the reality is we too will feel the consequences.
Zeitgeist v rückwärtsgewandt
Around the world of late, more and more governments have been stepping up. Strengthening online child protection has become part of the global zeitgeist. Yet, sticking with the German language, rückwärtsgewandt, turning backwards, now seems a more appropriate way of describing what is happening at EU level.
In a campaign conceived and initially led by an elected German member of the European Parliament, the EU is on the edge of abandoning impactful child protection, and this despite overwhelming public support for stronger online child protection measures in every EU Member State, not least in Germany.
“Realism? Pragmatism?”
When all this started back in 2020, various voices could be heard suggesting children's groups should withdraw their support for measures to detect grooming and measures to detect material likely to be CSAM. This would leave in place only measures to detect previously known CSAM, and would, it was said, show that children's groups were being realistic, pragmatic and mature.
I don’t see it like that at all.
Absent any concrete evidence that the tools have actually harmed anyone or anything, why would any children's group agree that they should be cancelled?
If politicians want to explain to their constituents why they have weakened online child protection, they must do so without any cover or protection provided by groups charged with defending children.
It is still not too late for the EU to step back from the brink—but the window is closing fast. 11 days.


