Statement on behalf of the Phoenix 11: Survivors’ voices must be part of debate over Apple’s plan to tackle child sexual abuse imagery
For Immediate Release
We, the Phoenix 11, are among the countless children whose pain and trauma are captured in the millions of child sexual abuse images that, in this moment, circulate freely on the internet.
Much has been said in recent headlines about Apple’s plan to detect when users upload child sexual abuse imagery to its servers. Sadly, the loudest voices have attacked the move, preferring to focus on hypothetical concerns about users’ privacy.
What we have not heard are the voices of survivors, or what has actually happened to survivors like us because our abuse images are allowed to spread online.
It should go without saying, but we need to point out that every one of our child sexual abuse images was created illegally. It is our privacy that is violated each time an image of our child sexual abuse is accessed, possessed or shared.
This is not hypothetical for us. It is hard to describe what it feels like to know that at any moment, anywhere, someone could be looking at images of us as children being sexually abused, and getting sick gratification from it. It’s like we are abused over and over and over again.
You haven’t heard survivors’ voices in this public debate because we have to be vigilant about protecting our identities; no one else has protected us. We have been stalked, doxed, and hunted online. Every day we fear that someone who has seen our abuse material will recognize us.
What about our right to privacy?
We are supportive of these initial steps Apple has introduced. Proactive detection of child sexual abuse material is already happening on many online platforms. However, the bar has been set extremely low.
Apple has stated that its system will only raise a flag internally, and that it will not be able to access these matches to known child sexual abuse material until 30 or more such images have been uploaded to an iCloud user’s account. Only then are the images manually reviewed by Apple, then by the National Center for Missing and Exploited Children, and then potentially forwarded to police. What other crime do you have to commit 30 times before being reported to law enforcement?
We are calling on Apple to reconsider this threshold. One image is too many.
The tools used to scan for child sexual abuse images give us survivors such profound hope that one day we may be out of the spotlight. Not only is the re-sharing of this content harmful to the victims in it; it is also used to groom and normalize the abuse of the next generation of victims. We don’t want any more children to have to deal with what we deal with when it can be fixed.
It is our collective obligation to do our part and hold industry accountable for taking down these images and videos. It shouldn’t be that complicated.
Media relations contact: 1 (204) 560-0723
communications@protectchildren.ca