Apple Explains Pullback from CSAM Photo-Scanning
Wednesday, 6 September 2023, 17:48, by TidBITS
In a letter responding to a child safety group, Apple has outlined its reasons for dropping its proposed scanning for child sexual abuse material in iCloud Photos. Instead, the company is focusing on its Communication Safety technology, which detects nudity in transferred images and videos.
https://tidbits.com/2023/09/06/apple-explains-pullback-from-csam-photo-scanning/