Apple Defends New Child Abuse Detection Technology

August 10, 2021

Apple has defended its new technology, which searches users’ phones for child sexual abuse material (CSAM), after customers and privacy campaigners criticized it heavily, warning that the technology could become a “backdoor” for spying on people.

Some digital privacy advocates warned last week that authoritarian governments could use the technology to support anti-LGBT regimes or to crack down on political dissidents in countries where protests are considered illegal.

Apple said it will not “expand” the system and that it has already put various safeguards in place to ensure the technology is not used for anything other than detecting child abuse images.

Apple went on to explain that it will only scan photos shared to iCloud and that its anti-CSAM tool does not allow the company to see or scan a user’s entire photo album.

Because the system relies on a database of hashes of known CSAM images supplied by child protection organizations, Apple explained, it is almost impossible for innocent people to be falsely flagged to police; any positive matches the system identifies are also verified by a human reviewer before being reported.
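To illustrate the general idea described above, here is a minimal Python sketch of matching photos against a database of known image hashes, with a human-review step for any matches. It is a simplified illustration only: Apple’s actual system uses a proprietary perceptual hash and on-device cryptographic matching, whereas this sketch uses an ordinary SHA-256 digest and plain set membership, and all names and the review threshold here are hypothetical.

```python
import hashlib
from pathlib import Path

# Hypothetical database of hashes of known CSAM images supplied by
# child protection organizations (here just an in-memory set of hex digests).
KNOWN_IMAGE_HASHES: set[str] = set()

# Hypothetical threshold: matched accounts are queued for human review
# rather than being reported automatically.
MATCH_THRESHOLD = 1


def image_hash(path: Path) -> str:
    """Compute a digest of an image file (stand-in for a perceptual hash)."""
    return hashlib.sha256(path.read_bytes()).hexdigest()


def scan_uploaded_photos(photo_paths: list[Path]) -> list[Path]:
    """Return only the photos whose hashes appear in the known-image database."""
    return [p for p in photo_paths if image_hash(p) in KNOWN_IMAGE_HASHES]


def review_account(photo_paths: list[Path]) -> None:
    """Flag an account for human review if enough photos match known hashes."""
    matches = scan_uploaded_photos(photo_paths)
    if len(matches) >= MATCH_THRESHOLD:
        # In the design Apple describes, a human verifies matches before
        # anything is reported; this sketch only prints a notice.
        print(f"{len(matches)} photo(s) matched known hashes; queued for human review.")
    else:
        print("No matches; nothing is flagged.")
```

Note that an ordinary cryptographic hash like SHA-256 only matches bit-identical files, whereas the perceptual hashing described for systems like this is designed to tolerate minor edits to an image; that difference is one reason this sketch is only a rough analogy.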

For more information, read the original story on the BBC.

Jim Love

Jim is an author and podcast host with over 40 years in technology.
