Apple Defends New Child Abuse Detection Technology

August 10, 2021

Apple has defended its new technology, which searches users’ phones for child sexual abuse material (CSAM), after customers and privacy campaigners criticized it heavily, warning that the technology could become a “backdoor” used to spy on people.

Some digital privacy advocates warned last week that authoritarian governments could use the technology to support anti-LGBT regimes or to crack down on political dissidents in countries where protests are deemed illegal.

Apple said it will not “expand” the system, and added that it has already put in place various safeguards to ensure the technology cannot be used for anything other than detecting child abuse images.

Apple went on to explain that the system only scans photos that are shared to iCloud, and that its anti-CSAM tool does not allow the company to see or scan a user’s entire photo album.

Because the system relies on a database of hashes of known CSAM images supplied by child protection organizations, Apple said it is almost impossible for innocent people to be falsely flagged to police, as any positive matches the system identifies are also verified by a human.
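To illustrate the matching step Apple describes, here is a minimal, hypothetical sketch in Python. It is not Apple’s implementation: the real system uses a perceptual “NeuralHash” rather than a cryptographic hash, and the database contents, the fingerprint and needs_human_review functions, and the MATCH_THRESHOLD value below are assumptions made purely for illustration.

    import hashlib

    # Hypothetical stand-in for the database of hashes of known CSAM images
    # supplied by child-protection organizations (illustrative value only).
    KNOWN_IMAGE_HASHES = {
        hashlib.sha256(b"example known image").hexdigest(),
    }

    # Hypothetical threshold: escalate for human review only after several
    # uploads match, so a single stray match never triggers a report.
    MATCH_THRESHOLD = 3

    def fingerprint(image_bytes: bytes) -> str:
        # Stand-in fingerprint; Apple's system uses a perceptual hash,
        # not a cryptographic hash such as SHA-256.
        return hashlib.sha256(image_bytes).hexdigest()

    def needs_human_review(uploaded_images: list[bytes]) -> bool:
        # Count uploads whose fingerprint appears in the known-hash database
        # and flag the batch for manual verification only above the threshold.
        matches = sum(1 for img in uploaded_images
                      if fingerprint(img) in KNOWN_IMAGE_HASHES)
        return matches >= MATCH_THRESHOLD

    if __name__ == "__main__":
        # Ordinary photos produce no matches, so nothing is escalated.
        print(needs_human_review([b"holiday photo", b"family dinner"]))  # False

In this sketch, the human-review gate is the last step: no report is generated from hash matches alone, which mirrors the safeguard Apple points to.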

For more information, read the original story on the BBC.


Jim Love

Jim is an author and podcast host with over 40 years in technology.
