No one but the dregs of humanity is in favor of child porn, but Apple has apparently abandoned its plan to build “CSAM” scanning software into its mobile operating systems for iPhone and iPad.
“While nobody less awful than Jeffrey Epstein wants to protect predators, more than a few are wondering what has happened to the world’s most valuable company’s commitment to user privacy,” I wrote back in August when the Cupertino tech giant announced its privacy-busting feature. But a new report indicates that Apple has listened to customer pushback.
AppleInsider reported on Wednesday that the company has in the last few days “removed all signs of its CSAM initiative from the Child Safety webpage on its website.”
That’s a good indication that operating system-level photo scanning will not be arriving in a future update to iOS 15 or iPadOS 15, as Apple had promised last summer.
The good parts of Apple’s Child Safety feature set will live on, including on-device detection of sexually explicit images in Messages on the phones of minor children. The analysis happens entirely on the device, the data doesn’t go anywhere and isn’t seen by anyone, and the feature merely warns kids and points them to resources for getting help.
On the dangerous side, another part of Child Safety would have scanned photos on the device as they were uploaded to iCloud Photos, matching them against known Child Sexual Abuse Material (CSAM).
The problem wasn’t with Apple’s intentions, which certainly seem noble, or with the privacy protections built into the CSAM scanner.
The problem was the execution, which could easily be abused by governments for domestic spying on political opponents.
Here’s how it was supposed to work:
Using another technology called threshold secret sharing, the system ensures the contents of the safety vouchers cannot be interpreted by Apple unless the iCloud Photos account crosses a threshold of known CSAM content. The threshold is set to provide an extremely high level of accuracy and ensures less than a one in one trillion chance per year of incorrectly flagging a given account.
Only when the threshold is exceeded does the cryptographic technology allow Apple to interpret the contents of the safety vouchers associated with the matching CSAM images. Apple then manually reviews each report to confirm there is a match, disables the user’s account, and sends a report to NCMEC.
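In plainer English, the design works like a tripwire: every photo gets a digital fingerprint, the fingerprints are compared against a list of known CSAM, and Apple can’t read anything until the number of matches crosses a threshold. Here is a rough sketch of that logic in Swift. It is purely illustrative, not Apple’s actual NeuralHash or threshold secret sharing code; the SHA-256 stand-in hash, the empty hash list, the threshold value, and every function name here are assumptions made up for the example.

```swift
import Foundation
import CryptoKit

// Purely illustrative sketch, not Apple's NeuralHash or threshold secret sharing:
// photos are fingerprinted, fingerprints are checked against a list of known
// CSAM hashes, and nothing gets escalated until matches cross a threshold.

let knownCSAMHashes: Set<String> = []   // stand-in for the NCMEC-supplied hash list
let matchThreshold = 30                 // illustrative value only, not Apple's real number

func fingerprint(_ photo: Data) -> String {
    // SHA-256 stands in for a perceptual hash to keep the sketch self-contained.
    SHA256.hash(data: photo).map { String(format: "%02x", $0) }.joined()
}

func matchCount(in library: [Data]) -> Int {
    library.filter { knownCSAMHashes.contains(fingerprint($0)) }.count
}

func shouldEscalateForHumanReview(_ library: [Data]) -> Bool {
    // In Apple's design the "safety vouchers" stay cryptographically unreadable
    // below the threshold; here that property is just modeled as a boolean.
    matchCount(in: library) >= matchThreshold
}
```

The real system’s cryptography is what enforces that last property: below the threshold, even Apple can’t peek inside the vouchers.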
Known CSAM material is cataloged by NCMEC as a database of digital fingerprints, or perceptual hashes, so that authorities can recognize copies and trace underground trading in child porn materials. Apple’s scanner would have compared a fingerprint of each photo bound for iCloud against that database.
But what if government officials started adding the fingerprints of perfectly legitimate materials to that database, like the snapshots you took of you and your friends at a Trump rally or at the shooting range?
Those database entries could be used to flag and track uppity citizens opposed to the current administration or some future government.
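The matching code, after all, has no idea what any entry in the database actually depicts. It flags whatever the keepers of the list tell it to flag. Continuing the toy Swift sketch from above (again, none of this is Apple’s code, and the photo data is a made-up stand-in):

```swift
import Foundation
import CryptoKit

// The scanner only ever sees hashes, so it cannot tell a genuine CSAM entry
// from the hash of any other image slipped into the database.

func fingerprint(_ photo: Data) -> String {
    SHA256.hash(data: photo).map { String(format: "%02x", $0) }.joined()
}

var flaggableHashes: Set<String> = []   // controlled by whoever supplies the list

// A hypothetical official adds the fingerprint of a perfectly legal photo.
let rallyPhoto = Data("stand-in bytes for a rally snapshot".utf8)
flaggableHashes.insert(fingerprint(rallyPhoto))

// From the device's point of view, that photo is now indistinguishable from CSAM.
let userLibrary = [rallyPhoto]
let flagged = userLibrary.filter { flaggableHashes.contains(fingerprint($0)) }
print("Flagged photos: \(flagged.count)")   // prints 1
```

That’s the whole objection in a nutshell: the clever cryptography protects the matching process, but nothing in the design constrains what goes into the list.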
Apple was correct in 2020 when the company refused to create an iOS backdoor for the FBI, even though the agency wanted help unlocking the Pensacola shooter’s iPhones to trace his contacts.
It was an unpopular stand with many, including President Donald Trump’s Attorney General William Barr.
But once such a backdoor existed, Apple argued, the company would be helpless to say no if some FISA court issued a secret warrant to break into your phone, my phone, or even thousands of people’s phones all at once. Without any warning or anyone’s knowledge, the FBI could hoover up literally every bit of data on as many devices as it could get secret warrants for.
And given how bad federal cybersecurity is, it wouldn’t be long before that backdoor was open to criminals around the world, too.
And knowing what we know now about the FBI, they might not even bother getting warrants.
With CSAM scanning, Apple was going to build a backdoor into the photo libraries of a billion users around the world, a backdoor that governments could and certainly would abuse.
If Apple really has decided to call it quits on CSAM scanning, good for them — and good for liberty here and abroad, too.