A Positive Step for Child Protection and Privacy

Apple’s recently announced child safety measures have been met with a cacophony of responses, both critical and supportive, from a variety of sources. International Justice Mission (IJM), through its Center to End Online Sexual Exploitation of Children, applauds Apple for its new child safety initiatives related to iCloud Photos and Messages specifically.

While Apple’s forthcoming initiatives are imperfect—and significant room exists across the tech sector to improve detection, disruption, and reporting of child sexual abuse—Apple’s move is a positive step forward.

For that reason, it should not be delayed.

It is an issue of privacy, but whose?

Strong opposition to Apple’s announcement is primarily presented under the banner of “privacy.” Common objections describe a slippery slope toward government abuses and mass surveillance. To be clear, the child safety solutions proposed by Apple have not been corrupted for such dire ends.

Critics fear a hypothetical future risk while apparently dismissing a very real, current, and widespread harm: untold numbers of vulnerable children have been and are being abused, exploited, and otherwise victimized through the continued production, possession, and distribution of images of their sexual abuse.

The current conversation risks elevating the hypothetical corruption of child safety solutions over the known and rampant misuse of existing technology to harm children.

As a survivor-centered organization, IJM deeply respects what survivors of child sexual abuse tell us and others in this space: children are entitled to have every image memorializing the most painful and dehumanizing moments of their lives detected, reported, and removed from illicit circulation.

In contrast, offenders have no legal or privacy right to illegally create, possess, or share child sexual exploitation material. In fact, these acts undeniably violate the privacy of victimized children.

Survivors' voices are missing from the conversation

Unlike the hypothetical harm critics fear, the global crisis of child sexual abuse and exploitation—which Apple and others seek to counter through various safety initiatives and tech innovations—is all too real. In the face of privacy arguments against these safety measures, the Phoenix 11, a group of child sexual exploitation survivors, have rightly identified that this advocacy in the name of privacy is incomplete:

“What about our right to privacy? … It is our privacy that is violated each time an image of our child sexual abuse is accessed, possessed or shared.”

While others have provided more technical reviews of Apple’s plans, the voices of survivors have not been sufficiently prominent. We commend the Phoenix 11’s courageous advocacy for themselves and others.

IJM has seen firsthand the harm and trauma children experience when sexual abuse and exploitation go undetected and unreported. We have also seen the real protection and hope that follow when that abuse is uncovered and victims are identified.

In the Philippines, IJM has supported law enforcement since 2011 in safeguarding more than 850 people from livestreamed sexual abuse and exploitation carried out by in-person traffickers, whom online sex offenders pay to produce new abusive content.

Among those protected are children like Joy, Ruby, Cassie, Chang, and Marj.

Ruby*, now a survivor leader as an adult, shares the trauma she endured as a child:

“I felt disgusted by every action I was forced to do just to satisfy customers online. I lost my self-esteem and I felt very weak. I became so desperate to escape, to the point that I would shout whenever I heard a police siren go by, hoping somebody would hear me.”

Marj* was first exploited at the age of 13 by her friend’s older sister:

“I was confused because I was just a child. I was shaking. Then, I felt different. I felt ashamed. But I also had nowhere else to go.”

The act of forcing her to take explicit pictures was painful enough, but as Marj shared with IJM:

“…that abuse, I did not expect that it would spread. That it would be sent to other people.”

Take it from the Phoenix 11, Ruby, Marj, and others: Survivors are harmed first by the abuse they suffered, and then repeatedly through the violation of their privacy by offenders who possess and share images depicting their sexual exploitation. Apple’s new safety features are a step toward protecting the privacy of survivors while reasonably respecting the privacy of its users.

It’s not just about Apple

This is not about a single company. Improving the tech industry’s detection, disruption, and reporting of child sexual abuse is critical to protecting victims and survivors from ongoing harm. Innovations like on-device solutions hold significant promise precisely because of the potential to balance user privacy with child protection.
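To make that trade-off concrete, the sketch below shows one way on-device matching can work in principle: the device reduces a photo to a compact fingerprint and compares it locally against fingerprints of already-known abuse imagery, so ordinary photos never need to be uploaded for screening. This is a deliberately simplified illustration built on a toy "average hash," not Apple's actual system; every function, name, and threshold here is a hypothetical stand-in.

```python
# Illustrative sketch only: a toy "average hash" compared on-device against a
# set of known-image fingerprints. Every name and value here is hypothetical;
# real systems such as Apple's NeuralHash are far more robust to cropping,
# re-encoding, and other transformations.

def average_hash(pixels: list[list[int]]) -> int:
    """Derive a 64-bit fingerprint from an 8x8 grayscale grid (values 0-255)."""
    flat = [p for row in pixels for p in row]
    mean = sum(flat) / len(flat)
    bits = 0
    for p in flat:
        # One bit per pixel: is it brighter than the image's average?
        bits = (bits << 1) | (1 if p >= mean else 0)
    return bits

def hamming(a: int, b: int) -> int:
    """Count the bits on which two fingerprints differ."""
    return bin(a ^ b).count("1")

def matches_known_list(image_hash: int, known_hashes: set[int],
                       threshold: int = 5) -> bool:
    """On-device check: flag only near-duplicates of already-known imagery.
    The photo itself is never uploaded for this comparison."""
    return any(hamming(image_hash, known) <= threshold for known in known_hashes)

if __name__ == "__main__":
    # Toy 8x8 grayscale "image": a simple brightness gradient.
    img = [[row * 32 + col * 4 for col in range(8)] for row in range(8)]
    print(matches_known_list(average_hash(img), {average_hash(img)}))  # True
```

A production system layers cryptographic safeguards on top of this basic idea; Apple's proposal, for example, described a blinded on-device hash database and a threshold of matches before any account could even be flagged for human review.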

Fortunately, child safety leaders across the tech sector have expressed their commitment. “We are resolved to drive forward the improvements in technology and systems that will ultimately eradicate the online sexual abuse and exploitation of children on our platforms,” said the Technology Coalition’s Executive Director in its first-ever annual report.

Recent child safety announcements by Technology Coalition members Apple, TikTok, and Google are steps in the right direction, with much more to be done. Yet the current backlash against these efforts could discourage the development and adoption of additional real-world protections for children.

That would be a missed opportunity for both child protection and true privacy.

*Pseudonyms used to protect survivors' identities.
