
Apple to permanently scan users’ photos and messages for abusive content

Woman holding a space gray iPhone X and a black pen over a pad. (Pexels/Released)
August 06, 2021

Apple announced on Thursday plans to scan users’ iPhones for child sexual abuse content in an effort to “protect children from predators who use communication tools to recruit and exploit them,” in addition to limiting the spread of Child Sexual Abuse Material (CSAM).

According to Apple’s announcement, new operating system technology will let the company identify known CSAM images and report the incidents to the National Center for Missing and Exploited Children (NCMEC), a nonprofit that works with law enforcement agencies nationwide to combat child sexual abuse.

“Apple’s method of detecting known CSAM is designed with user privacy in mind. Instead of scanning images in the cloud, the system performs on-device matching using a database of known CSAM image hashes provided by NCMEC and other child safety organizations,” the company said.
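
To make that matching step concrete, here is a minimal sketch in Swift of what an on-device check against a local hash database could look like. It is an illustration only: Apple’s published design uses a perceptual “NeuralHash” combined with private set intersection, whereas this sketch substitutes an ordinary SHA-256 digest and a plain set lookup. The `imageHash(of:)` helper and the database contents are assumptions, not Apple’s API.

```swift
import Foundation
import CryptoKit

// Hedged sketch of on-device hash matching, NOT Apple's implementation.
// A cryptographic digest stands in for Apple's perceptual NeuralHash,
// and a local Set lookup stands in for private set intersection.

/// Assumed stand-in for a perceptual hash of the image bytes.
func imageHash(of imageData: Data) -> String {
    SHA256.hash(data: imageData).map { String(format: "%02x", $0) }.joined()
}

/// Returns true when the image's hash appears in the database of known
/// CSAM hashes (supplied, per Apple, by NCMEC and other organizations).
func matchesKnownDatabase(_ imageData: Data, knownHashes: Set<String>) -> Bool {
    knownHashes.contains(imageHash(of: imageData))
}
```

Note that under this scheme the device never interprets the picture itself; it only tests whether the image’s hash is already in the supplied database.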

After an initial detection, the image will be reviewed by a human, and if it is confirmed to be child sexual abuse material, the user’s account will be disabled and the appropriate authorities will be notified.

“This innovative new technology allows Apple to provide valuable and actionable information to NCMEC and law enforcement regarding the proliferation of known CSAM. And it does so while providing significant privacy benefits over existing techniques since Apple only learns about users’ photos if they have a collection of known CSAM in their iCloud Photos account. Even in these cases, Apple only learns about images that match known CSAM.”
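
Apple’s claim that it “only learns about users’ photos if they have a collection of known CSAM” implies a threshold: individual matches reveal nothing, and human review is triggered only once a count of matches is crossed. The sketch below illustrates that gating logic only; in the published design the threshold is reportedly enforced cryptographically with threshold secret sharing, and the threshold value used here is purely illustrative.

```swift
// Hedged sketch of the "collection" threshold: escalate to human review
// only after enough matches accumulate. A plain counter stands in for
// the cryptographic enforcement in the real system.

struct MatchThresholdGate {
    let threshold: Int             // assumed value, not confirmed by Apple here
    private(set) var matchCount = 0

    /// Record one hash match; returns true once the account should be
    /// escalated to human review.
    mutating func recordMatch() -> Bool {
        matchCount += 1
        return matchCount >= threshold
    }
}

// Usage: nothing is escalated until the count crosses the threshold.
var gate = MatchThresholdGate(threshold: 30)   // value purely illustrative
for _ in 0..<30 where gate.recordMatch() {
    print("threshold crossed: escalate to human review")
}
```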

As part of the broader child safety effort, images sent through Apple’s encrypted Messages app will also be scanned on-device for sexually explicit content, prompting some privacy advocates to voice concerns about the new measures.
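
As a rough illustration of how such screening can run without message content leaving the phone, the sketch below gates an attachment on a classifier score before it is shown. The `explicitScore` closure stands in for a hypothetical on-device ML model call, and the cutoff value is an assumption; neither is Apple’s actual API.

```swift
import Foundation

// Hedged sketch of on-device attachment screening: the attachment is
// scored locally and either shown or blurred with a warning.

enum AttachmentDecision {
    case show          // below the cutoff: display normally
    case blurAndWarn   // at or above the cutoff: blur and warn the user
}

func screenAttachment(_ imageData: Data,
                      explicitScore: (Data) -> Double,  // assumed model call
                      cutoff: Double = 0.9) -> AttachmentDecision {
    explicitScore(imageData) >= cutoff ? .blurAndWarn : .show
}

// Usage with a dummy scorer standing in for the real model:
let decision = screenAttachment(Data(), explicitScore: { _ in 0.1 })
```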

According to The Associated Press, Matthew Green, a cryptography researcher at Johns Hopkins University, warned that the new technology could be used to frame innocent users by sending them images engineered to trigger the detection system.

“Researchers have been able to do this pretty easily,” he said of fooling such matching algorithms. Green also said governments could easily use the system to surveil dissidents or protesters.
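
In terms of the matching sketch above, Green’s framing scenario comes down to this: the device compares hashes, not pictures, so an innocuous image engineered to collide with a database entry produces a match indistinguishable from a real one. The values below are placeholders; crafting genuine collisions against perceptual hashes is what Green says researchers have demonstrated.

```swift
import Foundation
import CryptoKit

// Self-contained restatement of the earlier matcher, showing why a hash
// collision defeats it: the check is pure hash equality, content-blind.

func imageHash(of data: Data) -> String {
    SHA256.hash(data: data).map { String(format: "%02x", $0) }.joined()
}

// Placeholder database entry and a stand-in for an adversarially crafted
// image whose hash lands in the database (not a real collision).
let knownHashes: Set<String> = [imageHash(of: Data("known-entry".utf8))]
let craftedImage = Data("known-entry".utf8)

// The device reports a "match" on hash equality alone, even if the image
// depicts nothing abusive.
print(knownHashes.contains(imageHash(of: craftedImage)))   // true
```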

“What happens when the Chinese government says, ‘Here is a list of files that we want you to scan for’?” Green asked. “Does Apple say no? I hope they say no, but their technology won’t say no.”

Apple said the new features will roll out later this year through updates to iOS 15, iPadOS 15, watchOS 8 and macOS Monterey.

“This program is ambitious, and protecting children is an important responsibility,” Apple wrote. “These efforts will evolve and expand over time.”