Apple To Only Seek Abuse Images Flagged In Multiple Countries

August 16, 2021

Apple announced on Friday that it would only look for images of child sexual abuse that have been flagged by clearinghouses in multiple countries.
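
As a rough illustration of that rule, the sketch below (in Swift, with hypothetical list contents) keeps only the identifiers that appear on the lists of clearinghouses in at least two jurisdictions; Apple's actual system matches encrypted image hashes on-device rather than plain strings.

// Hypothetical identifier lists from child-safety clearinghouses in two
// jurisdictions; real identifiers are perceptual image hashes, not strings.
let usList: Set<String> = ["a1f3", "b2e4", "c9d0"]
let euList: Set<String> = ["b2e4", "c9d0", "ff17"]

// Only identifiers flagged in more than one country ship to devices.
let shippedList = usList.intersection(euList)

print(shippedList.sorted())   // ["b2e4", "c9d0"]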

Apple also dismissed criticism that the new system could be used to target individuals, saying that researchers can verify that the list of image identifiers on one iPhone matches the list on every other phone.

On Friday, Apple executives also addressed the question of how many matched images on a phone or computer it will take before the operating system alerts the company for a human review: the threshold will begin at 30 images and could be lowered over time as the system's performance improves.
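
A minimal sketch of that threshold behavior, again in Swift with hypothetical names: the device counts photos whose identifiers match the shipped list and flags the account for human review only once the count reaches 30. In Apple's actual design the count is protected cryptographically (threshold secret sharing), so nothing is revealed below the threshold.

let flaggedIdentifiers: Set<String> = ["b2e4", "c9d0"]   // shipped list
let reviewThreshold = 30                                 // starting value, per the article

// Hypothetical per-photo identifiers found on one device.
let devicePhotoHashes = ["1234", "b2e4", "9abc", "c9d0"]

let matchCount = devicePhotoHashes.filter(flaggedIdentifiers.contains).count
let alertForHumanReview = matchCount >= reviewThreshold

print(matchCount, alertForHumanReview)   // 2 false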

Apple acknowledged that it had mishandled communications around the feature, but declined to say whether the criticism had prompted any changes to its policies or software.

For more information, read the original story at Reuters.

TND News Desk

Staff writer for Tech Newsday.