Apple remains mum on its plan to detect known CSAM stored in iCloud Photos.

Neha Roy

For more than a year, Apple has been working on three new child safety features: a system to detect known child sexual abuse material (CSAM) images stored in iCloud Photos, a feature to blur sexually explicit photos in the Messages app, and expanded Siri resources to help prevent child exploitation. The latter two features are now available, but Apple has said nothing about its plans for CSAM detection.



After initially promising to include CSAM detection in an update to iOS 15 and iPadOS 15 by the end of 2021, Apple ultimately delayed the feature based on "feedback from customers, advocacy groups, researchers, and others."


Apple updated its Child Safety webpage with the following statement in September 2021:

We previously announced plans for features intended to help protect children from predators who use communication tools to recruit and exploit them, and to help limit the spread of Child Sexual Abuse Material. Based on feedback from customers, advocacy groups, researchers, and others, we have decided to take additional time over the coming months to collect input and make improvements before releasing these critically important child safety features.

In December 2021, Apple removed the above update and all other mentions of its CSAM detection plans from its Child Safety page, but an Apple spokesperson told The Verge that the company's plans for the feature had not changed. To the best of our knowledge, Apple has not commented publicly on the plans since then.


We have reached out to Apple to ask whether the feature is still in the works. Apple did not immediately respond to a request for comment.

With the release of iOS 15.2 and other software updates in December 2021, Apple did go ahead and implement its child safety features for the Messages app and Siri. It also expanded the Messages app feature to Australia, Canada, New Zealand, and the UK with the release of iOS 15.5 and other software updates in May 2022.


Apple said its CSAM detection system was designed with user privacy in mind. The system would perform "on-device matching" against a database of known CSAM image hashes provided by child safety organisations, which Apple would transform into an "unreadable set of hashes that is securely stored on users' devices."
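For illustration only, here is a minimal Swift sketch of the on-device matching idea. The `knownCSAMHashes` set and the helper functions are hypothetical, and a plain SHA-256 digest stands in for Apple's perceptual NeuralHash and blinded hash database, which work very differently in practice.

```swift
import Foundation
import CryptoKit

// Hypothetical placeholder for the on-device copy of the known-hash database.
// In Apple's design this would be an unreadable, blinded set of hashes.
let knownCSAMHashes: Set<String> = [
    "0000000000000000000000000000000000000000000000000000000000000000"  // placeholder value
]

/// Stand-in for the perceptual hash that would be computed on-device.
func imageHash(for imageData: Data) -> String {
    SHA256.hash(data: imageData)
        .map { String(format: "%02x", $0) }
        .joined()
}

/// Checks a photo's hash against the local copy of the known-hash set.
func matchesKnownHash(_ imageData: Data) -> Bool {
    knownCSAMHashes.contains(imageHash(for: imageData))
}

// Example: a photo would be hashed and checked locally before upload.
let samplePhoto = Data("example image bytes".utf8)
print(matchesKnownHash(samplePhoto))  // prints "false" for this placeholder data
```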

Apple planned to report iCloud accounts containing known CSAM image hashes to the National Center for Missing and Exploited Children (NCMEC), a nonprofit organisation that works with law enforcement agencies in the United States. Apple said a "threshold" of matches, combined with manual human review of flagged accounts, would ensure "less than a one in one trillion probability per year" of an account being incorrectly flagged by the system.
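The threshold step could be sketched roughly as follows. The `AccountScanState` type and the threshold value of 30 are assumptions made for illustration, not Apple's actual implementation or published parameters.

```swift
// Hedged sketch of the "threshold" idea only: nothing is surfaced for human
// review until the count of matched images crosses a preset limit.
struct AccountScanState {
    var matchedImageCount = 0
    let reviewThreshold: Int

    /// Records one on-device match and reports whether the threshold is reached.
    mutating func recordMatch() -> Bool {
        matchedImageCount += 1
        return matchedImageCount >= reviewThreshold
    }
}

var state = AccountScanState(reviewThreshold: 30)  // hypothetical threshold value
for _ in 0..<5 {
    _ = state.recordMatch()
}
// Below the threshold, no flagging, manual review, or report would occur.
print(state.matchedImageCount)  // 5
```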

Many people and organisations opposed Apple's plans, including security researchers, the Electronic Frontier Foundation (EFF), legislators, policy groups, academic researchers, and even some Apple employees.

Some critics argued that the child safety features could create a "backdoor" into devices that governments or law enforcement agencies could exploit to surveil users. False positives were another concern, including the possibility that someone could deliberately add CSAM imagery to another person's iCloud account in order to get that account flagged.


