Most of us don’t know or don’t want to acknowledge how vulnerable our children are to online predators.
Supported by the real-life experiences of survivors, the energy of committed advocates, and extensive research, the Canadian Centre has identified critical points of needed change for government and the tech industry—all for the greater protection of our children.
This Press Release is from the Canadian Centre for Child Protection
How we are Failing Children: Changing the Paradigm
How We Are Failing Children: Changing the Paradigm is an urgent call to action for governments, industry, and hotlines around the world. Current policies for the removal of child sexual abuse images have focused on determining and removing material deemed illegal under criminal law. In contrast, this framework is grounded in the best interests of the child and their right to dignity, privacy, and protection from harm. The undeniable truth is that the rights of a victimized child will be continually violated as long as images of them being sexually harmed and abused are available on the internet.
Scope of the Problem
- In just under three years, the Canadian Centre’s Project Arachnid has detected over 13 million suspected images of child sexual abuse for analyst review and issued close to five million removal notices to industry.
- The National Center for Missing and Exploited Children’s (NCMEC’s) CyberTipline, the largest hotline of its kind in the world, averages approximately one million reports of child sexual exploitation each month and has received, in total, more than 45 million reports.
Issues with Current Responses
While there are many ways in which this epidemic is not being addressed appropriately, the framework has identified several key areas of concern:
- A rigid adherence to narrow criminal law definitions. Criminal definitions do not account for the wide range of harmful/abusive images that are available and are too restrictive when making decisions about image removal. As a result, a significant proportion of harmful/abusive images remain online.
- The varying levels of commitment to safeguarding children demonstrated by technology companies (e.g., some are swift to remove material once they are notified, while others enter into debates or ignore notices altogether).
Framework for Action
Expand removal to include all harmful and abusive images of children, including:
- All images associated with the abusive incident. These images often do not meet criminal law definitions but are still part of the continuum of abuse.
- Nude or partially nude images of children that have been made publicly available (typically stolen from unsecured social media accounts or taken secretly), AND are used in a sexualized context or connected to sexual commentary.
- Images/videos of children being physically abused, tortured, or restrained.
The full press release can be found on their website.