Reply to the Initiative of the EU-Commission that wants to mandate all platforms to implement detection, removal and reporting of illegal content. For the children.
Mandatory detection and reporting of grooming or of new child abuse material by all communication providers — as written in the proposed legislation — both require monitoring the communication of children, followed by human checking of the reports this monitoring generates.
This means that to protect children from having their sensitive images taken away and sent to adults, the EU Commission would mandate sending sensitive images of children to adults.
That’s with option D.
With option E, even sensitive communication of children would be reported and sent to adults for checking.
The scandals about sexual abuse in the Christian churches show the danger posed even by an approach built on semi-voluntary reporting via expected confession.
With the current state of technology, more than 80% of the reports generated by the best available technological tools are false. And false reports pose an inordinate risk here, because these filters must first detect the most private material and only then select the likely illegal material from it.
Regardless of technological improvements, this will always be the case. All grooming and all CSAM is sensitive material, but NOT all sensitive material is grooming or CSAM. Therefore most of what is reported will be sensitive but legal material. And this sensitive material is highly compromising. For grooming and CSAM, that is the intended effect. But for the reported legal material, this compromising effect endangers society, our industrial competitiveness and our whole political system.
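The base-rate argument above can be made concrete with a small calculation. All numbers here are illustrative assumptions, not figures from the proposal or its impact assessment: even a filter that seems very accurate flags mostly legal material when illegal content is a small fraction of all messages.

```python
def precision(prevalence, sensitivity, false_positive_rate):
    """Fraction of flagged messages that are actually illegal."""
    true_pos = prevalence * sensitivity          # illegal and flagged
    false_pos = (1 - prevalence) * false_positive_rate  # legal but flagged
    return true_pos / (true_pos + false_pos)

# Assumed numbers for illustration: 0.1% of messages are illegal,
# the filter catches 99% of them and wrongly flags 1% of legal messages.
p = precision(prevalence=0.001, sensitivity=0.99, false_positive_rate=0.01)
print(f"{p:.1%} of flagged material is actually illegal")  # prints 9.0%
```

Under these assumed rates, over 90% of the flagged material would be legal, sensitive material, which is consistent with the reported real-world figure of more than 80% false reports.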
The impact assessment ignores the danger that foreign powers might gain access to these reports, or that abusers might seek employment in the EU centre that is to be established or in public authorities.
If a foreign power should get hold of sensitive images or communication of members of government or its institutions, this will enable them to compromise our political system.
If a competing foreign company gets access to such material about business leaders or about employees with access to critical information, this will increase the effectiveness of their industrial espionage, enabling them to get access to even the best protected secrets.
If criminals get access to such material about regular citizens, this will enable much more vicious extortion schemes.
If child abusers get access to such material, it would massively increase the amount of child exploitation material, and it would enable much more efficient grooming: they could pressure the children with the sensitive material and claim to have much more, because the children cannot know whether it came from a leak of the reported material or from direct access to their phones.
To borrow wording from a current conflict: this is a kompromat machine. This proposal, especially but not only with option D or E, would generate highly compromising material that would endanger all citizens of the EU, all of the EU's industry, all of EU politics up to the highest levels, and even the very children this proposed legislation claims to protect.
We cannot protect children from getting sexual exploitation images sent to adults by mandating that the most sensitive images and communication of children must be sent to adults.
That’s what I sent. What I did not send, because it would dilute the message I want to send legislators, is what can be done to increase protection of children against sexual exploitation.
The first measure is to create awareness campaigns with the police to increase the chance that a report is taken seriously. Many women report that when they go to the police because of online sexual harassment, they are ignored or even laughed at. Now imagine a child going to the police.
How likely is it that the police take the appropriate actions: first talk to the child that reports something, ask where the incident happened, maybe talk to the parents (if the child agrees!), send a well-informed officer to investigate the service the child reported (in practice that means entering a game or chat without being recognized as police and watching for a while), and get in contact with the company running the service to check what is happening there?
The second measure is to increase the capacity of law enforcement. This requires (re-)building trust between law enforcement and those who know about safety online. Measures like the EU Commission's proposal cause many competent online-safety advocates to shudder at the thought of joining the police.
The third measure is to focus on supporting victims and on finding and stopping the perpetrators of known abuse. Platforms should make it easy for children to report and get support when they do not feel safe; there should be social workers who can support moderators of online services in spotting abuse and improving communication; and the police should make it easy for service providers to get in contact when they receive reports of abuse that may need legal action.
If a moderator in a game-chat notices potentially illegal behavior, the moderator should have a place to get support from social workers, and when those suggest escalating, the police should be attentive to such reports and take informed and competent measures to stop the problems.
And finally, a smartphone or computer could be an actual ally to children and help them recognize grooming or other abusive communication. But for this, they must be able to trust it. This means the smartphone must never send information without the child's knowledge. As a practical example, a smartphone could warn the child when it detects a picture that could be problematic, and suggest talking to its parents. It should be a user-agent — working on behalf of its user and no one else.
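The user-agent principle can be sketched in a few lines of code. Everything here is hypothetical: the function names are inventions for illustration, and the trivial placeholder stands in for a real on-device classifier. The point is the structure: the analysis stays local, and the only possible output is a warning shown to the user, never an upload or a report.

```python
from typing import Optional

def looks_problematic(image_bytes: bytes) -> bool:
    """Placeholder for a hypothetical local, on-device classifier."""
    # A real implementation would run a local model here; this stub never flags.
    return False

def handle_new_picture(image_bytes: bytes) -> Optional[str]:
    """Return a warning to show the child, or None.
    Crucially: no upload, no report, no network call.
    The device acts only on behalf of its user."""
    if looks_problematic(image_bytes):
        return ("This picture might be sensitive. "
                "Maybe talk to your parents before sharing it?")
    return None
```

The design choice that matters is the return type: the function can only ever hand a message back to the local user interface, so trust is preserved by construction.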
And this is the very opposite of the proposed legislation that wants to mandate betraying children.
Several more useful measures that would help to protect children are suggested on chatcontrol.eu.