Tech major Apple has announced that it will roll out a feature to police photos on users’ iPhones in an effort to detect and stop child abuse imagery from being circulated. Apple has said that its system would screen photos for child abuse images before users upload them from their iPhones in the US to its iCloud storage. While the move has drawn praise from various child-protection organisations, Apple has been slammed by critics who see sinister consequences in looking into users’ mobile phones, fearing it would give governments, and malicious actors, a chance to misuse the system. It all raises the question: will other tech majors be interested in doing something similar?
Messenger app WhatsApp, which is facing pressure from many governments to decrypt and reveal users’ data, has reacted with fury. The move has drawn particularly sharp criticism from WhatsApp head Will Cathcart. He is among many who say that Apple’s photo-checking feature on iPhones could lead to governments demanding similar scanning for terrorism-related content and much more.
Cathcart’s aggressive stance is reflected in his tweets, in which he has made WhatsApp’s position clear. He tweeted, “I read the information Apple put out yesterday and I’m concerned. I think this is the wrong approach and a setback for people’s privacy all over the world. People have asked if we’ll adopt this system for WhatsApp. The answer is no.”
In fact, Cathcart highlighted Apple’s own earlier stance on the issue of privacy, quoting excerpts from a ‘letter to customers’ published by Apple CEO Tim Cook in 2016. He tweeted, “Apple once said, ‘We believe it would be in the best interest of everyone to step back and consider the implications …’” Cathcart added further from the letter, “‘…it would be wrong for the government to force us to build a backdoor into our products. And ultimately, we fear that this demand would undermine the very freedoms and liberty our government is meant to protect.’ Those words were wise then, and worth heeding here now.”
Considering the sensitivity of the issue, he also tweeted, “Child sexual abuse material and the abusers who traffic in it are repugnant, and everyone wants to see those abusers caught.”
Responding to such accusations, Apple said the so-called “safety vouchers” in its system will protect the company from government pressure and that no material other than child abuse images will be identified. Apple also highlighted the importance of its human review process in ensuring that no other information is passed on to governments by the photo-checking system.
However, Reuters reports that many security experts have said that Apple was making a big mistake by checking people’s phones. “It may have deflected US regulators’ attention for this one topic, but it will attract regulators internationally to do the same thing with terrorist and extremist content,” said Riana Pfefferkorn, a research scholar at the Stanford Internet Observatory.
(c) 2021 the Hindustan Times
Distributed by Tribune Content Agency, LLC.