Apple removes the contentious CSAM iCloud photo scanning function

Apple has stopped developing its iCloud photo scanning tool for CSAM (child sexual abuse material). The iPhone maker confirmed that it has abandoned plans to implement the security measure, which had drawn intense controversy since its announcement. Consequently, the detection technique will not be deployed.

CSAM detection on iCloud Photos was one of several child safety features Apple had proposed. It was designed to scan iCloud users' photo libraries for images potentially related to child abuse, and Apple originally intended to ship it with iOS 15 and iPadOS 15.

CSAM iCloud photo scanning function

Notably, the feature drew harsh criticism even before Apple could release it, with security experts and even some of the company's own employees warning against it. In response to the backlash, the Cupertino-based company delayed the launch, which had originally been planned for before the end of 2021. Since then, Apple had remained silent on the matter until today, almost a year later.

By Monica Green

I specialise in the latest tech and tech discoveries.
