Apple Will Soon Begin Scanning Your iPhone for Child Porn

Apple scanning iCloud photo libraries — what happened to user privacy?

Apple just announced that it will begin scanning its customers’ iCloud Photo Libraries for known images of child sexual abuse.


While nobody less awful than Jeffrey Epstein wants to protect predators, more than a few users are wondering what has happened to the world’s most valuable company’s commitment to user privacy.

The new technology will “limit the spread of Child Sexual Abuse Material (CSAM),” according to today’s press release. It’s all part of a broader move to “help protect children from predators who use communication tools to recruit and exploit them.”

Going forward, the company’s popular Messages texting app will use “on-device machine learning to analyze image attachments [sent to or from children] and determine if a photo is sexually explicit.” Apple says that the “feature is designed so that Apple does not get access to the messages.”
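Apple hasn’t said exactly how that classifier works under the hood. But its public Core ML and Vision frameworks give a rough idea of what a purely on-device check could look like. The sketch below is just that, a sketch: the compiled model file, the “explicit” label, and the 0.9 confidence cutoff are placeholders for illustration, not anything Apple actually ships.

```swift
import Foundation
import CoreML
import Vision

// A hedged sketch of what "on-device machine learning to analyze image
// attachments" could look like using Apple's public Core ML and Vision
// frameworks. Apple has not published the actual Messages classifier;
// the model file, the "explicit" label, and the 0.9 cutoff below are
// illustrative placeholders.

/// Runs a locally stored, compiled Core ML classifier (.mlmodelc) against
/// the attachment bytes. Nothing leaves the device: Vision evaluates the
/// model entirely on local hardware.
func imageLooksExplicit(_ imageData: Data, compiledModelURL: URL) throws -> Bool {
    let mlModel = try MLModel(contentsOf: compiledModelURL)
    let visionModel = try VNCoreMLModel(for: mlModel)
    let request = VNCoreMLRequest(model: visionModel)

    let handler = VNImageRequestHandler(data: imageData, options: [:])
    try handler.perform([request])

    guard let first = request.results?.first,
          let top = first as? VNClassificationObservation else { return false }
    return top.identifier == "explicit" && top.confidence > 0.9
}
```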


Protecting kids is laudable, but there’s an Ick Factor to today’s announcement.


Before a photo is uploaded to a user’s iCloud Photo Library for sharing with other devices or other users, Apple will run an on-device scan that doesn’t upload or share any data with Apple or anyone else. The scanning tech, which will be built into the next versions of iOS and iPadOS, will compare your photos against a database of known CSAM images.
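Apple calls its perceptual hash “NeuralHash,” and the on-device database is blinded so the phone itself can’t read it. None of that is public API, so the sketch below swaps in an ordinary SHA-256 digest and a plain in-memory set purely to illustrate the “compare this photo against a list of known hashes” flow. Every name in it is made up for illustration.

```swift
import Foundation
import CryptoKit

// A simplified sketch of the on-device matching step. Apple's real system
// uses a perceptual hash ("NeuralHash") and a blinded database queried via
// private set intersection; this sketch substitutes a plain SHA-256 digest
// and an in-memory set purely to show the "compare against known hashes"
// flow. Every name here is illustrative, not an Apple API.

/// Stub for a hypothetical table of known-CSAM image digests. In Apple's
/// design the real database ships inside the OS in blinded, unreadable form.
func loadKnownHashDatabase() -> Set<String> {
    return []   // empty stub for the sketch
}

/// Hex-encodes a SHA-256 digest of the photo bytes and checks membership in
/// the known-hash set. A perceptual hash would tolerate resizing and
/// re-encoding of the image; an exact digest like this one would not.
func photoMatchesKnownDatabase(_ photoData: Data, knownHashes: Set<String>) -> Bool {
    let digest = SHA256.hash(data: photoData)
    let hex = digest.map { String(format: "%02x", $0) }.joined()
    return knownHashes.contains(hex)
}

// This check would run on the device before a photo is uploaded to iCloud Photos.
let database = loadKnownHashDatabase()
let photo = Data("example photo bytes".utf8)
print(photoMatchesKnownDatabase(photo, knownHashes: database))   // false
```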


Here’s what happens next, in Apple’s own words:

Using another technology called threshold secret sharing, the system ensures the contents of the safety vouchers cannot be interpreted by Apple unless the iCloud Photos account crosses a threshold of known CSAM content. The threshold is set to provide an extremely high level of accuracy and ensures less than a one in one trillion chance per year of incorrectly flagging a given account.

Only when the threshold is exceeded does the cryptographic technology allow Apple to interpret the contents of the safety vouchers associated with the matching CSAM images. Apple then manually reviews each report to confirm there is a match, disables the user’s account, and sends a report to NCMEC.
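“Threshold secret sharing” is a standard cryptographic building block, not marketing spin: the key needed to open the safety vouchers is split into shares, and any number of shares below the threshold reveals nothing at all. Apple’s real construction is more elaborate (it sits on top of private set intersection), but the textbook version of the idea, Shamir’s scheme, fits in a few dozen lines of Swift. The prime modulus and parameters below are toy values chosen to keep the sketch short, not Apple’s.

```swift
import Foundation

// A toy Shamir secret-sharing scheme over a small prime field, meant only to
// show the threshold idea Apple cites: below the threshold, the shares carry
// no usable information; at the threshold, the secret pops out exactly.
// This is NOT Apple's construction, and all parameters are illustrative.

let p = 2_147_483_647   // prime modulus (2^31 - 1); products still fit in a 64-bit Int

/// Modular exponentiation by repeated squaring.
func modPow(_ base: Int, _ exp: Int, _ mod: Int) -> Int {
    var result = 1
    var b = base % mod
    var e = exp
    while e > 0 {
        if e & 1 == 1 { result = result * b % mod }
        b = b * b % mod
        e >>= 1
    }
    return result
}

/// Modular inverse via Fermat's little theorem (valid because p is prime).
func modInverse(_ a: Int, _ mod: Int) -> Int {
    return modPow(a, mod - 2, mod)
}

/// Splits `secret` into `count` shares; any `threshold` of them reconstruct it.
func makeShares(secret: Int, threshold: Int, count: Int) -> [(x: Int, y: Int)] {
    // Random polynomial of degree (threshold - 1) whose constant term is the secret.
    let coeffs = [secret] + (1..<threshold).map { _ in Int.random(in: 1..<p) }
    return (1...count).map { x -> (x: Int, y: Int) in
        var y = 0
        for (i, c) in coeffs.enumerated() {
            y = (y + c * modPow(x, i, p)) % p
        }
        return (x: x, y: y)
    }
}

/// Lagrange interpolation at x = 0 recovers the secret from `threshold` shares.
func reconstruct(from shares: [(x: Int, y: Int)]) -> Int {
    var secret = 0
    for (j, share) in shares.enumerated() {
        var num = 1
        var den = 1
        for (m, other) in shares.enumerated() where m != j {
            num = num * (p - other.x) % p                  // (0 - x_m) mod p
            den = den * ((share.x - other.x + p) % p) % p  // (x_j - x_m) mod p
        }
        let term = share.y * num % p * modInverse(den, p) % p
        secret = (secret + term) % p
    }
    return secret
}

// Usage: with a threshold of three, two shares reveal nothing recoverable,
// but any three reconstruct the secret exactly.
let shares = makeShares(secret: 123_456_789, threshold: 3, count: 5)
print(reconstruct(from: Array(shares.prefix(3))))   // 123456789
```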

There are nearly one billion active iCloud users. If each of them has as few as 1,000 photos in their library, that’s roughly a trillion photos being scanned. There’s your one-in-a-trillion chance of somebody being wrongly accused of possessing CSAM — a criminal offense with a huge social stigma.
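In round numbers, the back-of-the-envelope math there is:

$$
10^9 \ \text{accounts} \times 10^3 \ \text{photos per account} = 10^{12} \ \text{photos}
$$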

That 1,000 figure is extremely lowball. My own photo library is almost 28,000 pics, and I delete a lot.


The good news is that Apple employees won’t — can’t — browse through users’ photos. They can’t see anything at all unless an account crosses the threshold of images flagged by the CSAM algorithm.

But as noted above, no algorithm is perfect. No human is perfect. And Apple will be putting privacy, reputations, and liberty at some small risk in the name of doing something that isn’t its job.

Protecting kids is our job as parents, not Apple’s job as a purveyor of nice gadgets.
