Apple removes the contentious CSAM iCloud photo scanning function
December 08, 2022 By Monica Green
Apple has stopped developing its iCloud photo scanning tool for CSAM (child sexual abuse material).
The iPhone manufacturer confirmed that it has abandoned plans to implement the security measure, which had drawn intense backlash against the tech giant. As a result, the detection technique will not be deployed.
CSAM detection in iCloud Photos was one of several proposed child safety features. It would have scanned iCloud users' photo libraries for known child abuse imagery. Apple originally intended to ship it with iOS 15 and iPadOS 15.
Notably, the feature drew harsh criticism even before Apple released it. Security experts, and even some of the company's own employees, issued warnings against it.
The Cupertino-based company delayed the launch in response to the backlash; it had originally intended to roll out the detection feature before the end of 2021.
Since then, the tech giant had remained silent on the matter until today, almost a year later.