Apple To Only Seek Abuse Photos Flagged In Many Countries

August 16, 2021

Apple announced on Friday that it would only search for images of child sexual abuse that have been flagged by clearinghouses in multiple countries.

Apple also dismissed criticism that the new system could be used to target individuals, stating that researchers can verify that the list of image identifiers on one iPhone matches the lists on all other phones.

On Friday, Apple executives also addressed questions about how many matched images must be found on a phone or computer before the operating system alerts the company for a human review. They said the threshold would start at 30 images and could be lowered over time as the system improves.

Apple acknowledged that it had mishandled communications around the feature, but declined to comment on the possibility that the criticism might have altered any of its policies or software.

For more information, read the original story in Reuters.


TND News Desk

Staff writer for Tech Newsday.

Jim Love

Jim is an author and podcast host with over 40 years in technology.
