The fight against child sexual abuse material (CSAM) is an absolute priority and a moral obligation of the digital society.
There is full consensus on this point. However, a supposedly simple solution has emerged in the public debate about how to wage this fight: mass scanning of our private communications, known as ‘Chat Control’.
It’s like trying to perform precision surgery with a sledgehammer. The tool is powerful, but it will almost certainly destroy more than it will repair, undermining the foundations of online privacy and security.
Criticism of this idea does not mean helplessness. On the contrary, it forces us to look for smarter solutions.
Fortunately, there is a whole arsenal of more effective and less invasive methods. It’s time to stop discussing the flawed tool and focus on the ones that really work.
Back to the foundations – working at the grassroots, not a technological illusion
Before reaching for a technological panacea, we need to strengthen the human pillar of the fight against cybercrime. Instead of drowning investigators in a sea of false alarms that an automated system would generate, we should invest in people.
This means bigger budgets for cybercrime departments, state-of-the-art digital forensics training, and the hiring of civilian data analysts and security experts. It is they, not an algorithm, who are able to understand the context and separate a real threat from the digital noise.
Criminals, although they try to hide, leave digital footprints. The key is targeted operational work and online intelligence (OSINT).
Experienced investigators are able to infiltrate closed criminal groups, analyse publicly available data and follow leads within existing legal procedures. This is precise, painstaking work that produces real results, unlike mass scanning, which treats every citizen as a potential suspect.
As CSAM networks operate globally, the response must also be global. Instead of building a flawed system within the EU, the role of Europol and Interpol should be strengthened and the legal procedures for exchanging digital evidence between countries simplified.
An ecosystem of responsibility – the role of platforms and society
Fighting CSAM is everyone’s job, and online platforms can do much more without resorting to breaking end-to-end encryption. They need to improve reporting mechanisms and proactively moderate content in public channels.
Investing in teams of moderators who review user reports (the so-called human-in-the-loop model) allows platforms to respond quickly to threats where they are most likely to occur.
However, the best fight is the one that prevents the problem from arising. Education and prevention play a key role here. Social campaigns that teach children about dangers such as grooming, as well as training for parents and teachers on how to recognise worrying signals, form the first and most important line of defence.
NGOs such as Poland’s Dyżurnet.pl or the US-based NCMEC (National Center for Missing & Exploited Children) should not be forgotten either. They are the ones on the front line, building databases of known CSAM and cooperating with law enforcement agencies around the world.
Supporting them financially and technologically is one of the most effective investments in child safety.
Technology for good – smart tools instead of mass control
Opposition to Chat Control is not opposition to technology. It is an objection to its mindless use. Indeed, there are smart tools that help fight CSAM without destroying fundamental civil rights.
Hashing technology such as PhotoDNA has been used successfully for years. It creates digital ‘fingerprints’ (perceptual hashes) of known illegal photos and videos that survive minor edits such as resizing or re-encoding. Scanning for these hashes on servers, such as public clouds or social media platforms, identifies already known material without compromising the privacy of encrypted communications.
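To make the mechanism concrete, here is a minimal sketch of server-side matching against a list of known hashes. PhotoDNA itself is proprietary, so the open-source imagehash library’s perceptual hash (pHash) stands in purely for illustration; the file names and the matching threshold are assumptions, not part of any real deployment.

```python
# Minimal sketch: matching images stored unencrypted on a server against
# a database of hashes of already known illegal material.
# PhotoDNA is proprietary, so pHash (imagehash) is used here only as an
# illustrative stand-in; file names and threshold are assumptions.
from PIL import Image
import imagehash

# Hashes of known material, as maintained by hotlines (here loaded as
# hex strings from an illustrative local file).
KNOWN_HASHES = {
    imagehash.hex_to_hash(line.strip())
    for line in open("known_hashes.txt", encoding="utf-8")
    if line.strip()
}

# Perceptual hashes tolerate small edits (re-encoding, resizing), so a
# small Hamming-distance threshold is used instead of exact equality.
MATCH_THRESHOLD = 5

def is_known_material(image_path: str) -> bool:
    """Return True if the image's hash is close to any known hash."""
    candidate = imagehash.phash(Image.open(image_path))
    return any(candidate - known <= MATCH_THRESHOLD for known in KNOWN_HASHES)

if __name__ == "__main__":
    print(is_known_material("uploaded_photo.jpg"))
```

Crucially, a check like this runs only where material is already stored unencrypted on a provider’s servers; it does not require breaking end-to-end encrypted messages.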
Sometimes you don’t need to read a letter to know that something suspicious is going on. In justified cases, and always on the basis of a court order, law enforcement can analyse metadata, that is, data about the communication rather than its content.
Information about who contacts whom, when and how often can put investigators on the trail of a criminal network without breaking the secrecy of correspondence. The court order is the safeguard here, one that mass scanning lacks.
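To illustrate what such analysis can look like in practice, here is a minimal sketch that aggregates court-authorised connection records into contact frequencies between accounts, never touching message content. The record format, account identifiers and threshold are hypothetical, chosen only to show the idea.

```python
# Minimal sketch: aggregating court-authorised communication metadata
# (who, with whom, when) without touching message content.
# Record format, identifiers and threshold are hypothetical illustrations.
from collections import Counter
from datetime import datetime

# Each record: (sender_id, recipient_id, timestamp) - no content at all.
records = [
    ("acct_17", "acct_42", datetime(2024, 5, 1, 23, 10)),
    ("acct_17", "acct_42", datetime(2024, 5, 2, 0, 5)),
    ("acct_99", "acct_42", datetime(2024, 5, 2, 1, 30)),
    # ... further records obtained under a court order
]

# Count how often each ordered pair of accounts communicates.
pair_counts = Counter((sender, recipient) for sender, recipient, _ in records)

# Flag pairs whose contact frequency exceeds an investigator-set threshold,
# as a starting point for conventional, targeted investigative work.
SUSPICIOUS_FREQUENCY = 2
leads = [pair for pair, count in pair_counts.items() if count >= SUSPICIOUS_FREQUENCY]

print(leads)  # e.g. [('acct_17', 'acct_42')]
```

The output is not evidence in itself; it is a lead that still has to be verified through ordinary, targeted police work.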
The Chat Control proposal is a tempting but dangerous illusion of a simple solution to a complex problem. The real fight against CSAM is not a single battle but a complex, multi-level campaign.
Instead of one flawed hammer, we need a whole Swiss Army knife of tools: the precision of police investigation, the power of international cooperation, the wisdom of education and the intelligent support of technology that respects our rights.