
Apple Kills Its Plan To Scan Your Photos for CSAM

Wednesday, December 7, 2022, 21:01, by Slashdot
Apple plans to expand its Communication Safety features, which aim to disrupt the sharing of child sexual abuse material at the source. From a report: In August 2021, Apple announced a plan to scan photos that users stored in iCloud for child sexual abuse material (CSAM). The tool was meant to be privacy-preserving and allow the company to flag potentially problematic and abusive content without revealing anything else. But the initiative was controversial, and it soon drew widespread criticism from privacy and security researchers and digital rights groups who were concerned that the surveillance capability itself could be abused to undermine the privacy and security of iCloud users around the world. At the beginning of September 2021, Apple said it would pause the rollout of the feature to 'collect input and make improvements before releasing these critically important child safety features.' In other words, a launch was still coming. Now the company says that in response to the feedback and guidance it received, the CSAM-detection tool for iCloud photos is dead.

Instead, Apple told WIRED this week, it is focusing its anti-CSAM efforts and investments on its 'Communication Safety' features, which the company initially announced in August 2021 and launched last December. Parents and caregivers can opt into the protections through family iCloud accounts. The features work in Siri, Apple's Spotlight search, and Safari Search to warn if someone is looking at or searching for child sexual abuse materials and provide resources on the spot to report the content and seek help. Additionally, the core of the protection is Communication Safety for Messages, which caregivers can set up to provide a warning and resources to children if they receive or attempt to send photos that contain nudity. The goal is to stop child exploitation before it happens or becomes entrenched and reduce the creation of new CSAM.

Read more of this story at Slashdot.
https://apple.slashdot.org/story/22/12/07/1936228/apple-kills-its-plan-to-scan-your-photos-for-csam?...
News copyright owned by their original publishers | Copyright © 2004 - 2024 Zicos / 440Network