It has recently come to our attention that the European Union is considering adopting an e-Privacy Regulation that we understand will prevent internet platforms from scanning their networks for child sexual abuse material, and using tools like Microsoft PhotoDNA in the process. The possibility that this could occur is deeply concerning to us, and we believe it is incumbent upon us to explain why, not only for ourselves, but for all those who have been victimized by the creation and distribution of child sexual abuse imagery.
We want to remind everyone that when we were children, when we should have been able to enjoy being children and all that goes with it, we were being sexually abused. To make that horrifying experience even worse, images and videos were created, and for many of us, shared with an online audience. Images and videos over which we had no control, and that show us at our most vulnerable. Those images and videos continue to haunt us today as they relentlessly proliferate on websites and in the inboxes and shared folders of an ever-growing community of offenders.
It should go without saying, but we feel it is important to point out that each and every one of our child sexual abuse images was created illegally and without consent. And anyone who has those images or videos now has them without our consent. It is our privacy that is violated, each and every time an image of our child sexual abuse is accessed, possessed or shared, and it is our dignity, and our rights, that are at stake—along with the dignity and rights of all children who have been victimized in this way.
For many years, we felt as though little was being done to effectively stem the proliferation of our child sexual abuse material. Day after day, month after month, offenders from all corners of the world share and trade our imagery, use it as they wish to groom other children, talk about it, and use it to convince themselves that sexually abusing children is normal. In the last few years, after years of having no control, no hope, and no ability to stop people from sharing imagery they have no right to have or distribute, we have become aware that change has finally started to happen. The very technology used to shame us into silence is now being used to help us. That cannot stop. We have suffered far too much, for far too long, to let that happen.
We want you to remember that each and every one of our child sexual abuse images represents a direct and continuing violation of our rights, our dignity, and our personal safety. Think of the suffering, the abuse that we had to endure, for our images and videos to even exist. Remember that the people who have and share these images have no right to have them, and have never had any such right. If the choice is between protecting the privacy of the people taking pleasure from our pain, and protecting our privacy—and the privacy of all the children in child sexual abuse material—we want you to protect children.
We, the Phoenix 11, are asking the European Union to adjust the draft e-Privacy Regulation to ensure that it does not prevent industry from doing all that it can to eradicate child sexual abuse material from their networks. Just as the regulation is clear that scanning to identify malware and viruses is permitted, it should be equally clear that scanning to detect and address child sexual abuse material is acceptable. If the law recognizes and accounts for the need to protect machines from being harmed, surely it must also protect children from the real and substantial harm caused by the transmission of child sexual abuse material.
Please understand that we long for the day when we can be released from the chains that the imagery has placed on us and our future. Think of the profound social, economic and personal price that has been paid, and will keep being paid, by tens of thousands of survivors just like us if the very entities who are in a position to safeguard our rights and our dignity believe they cannot do so for fear of running afoul of this law.