Child safety watchdog accuses Apple of hiding real CSAM figures

Monday, July 22, 2024, 11:58, by AppleInsider
A child protection organization says it has found more cases of abuse images on Apple platforms in the UK than Apple has reported globally.

[Image caption: Apple cancelled its major CSAM proposals but introduced features such as automatic blocking of nudity sent to children]

In 2022, Apple abandoned its plans for Child Sexual Abuse Material (CSAM) detection, following allegations that it would ultimately be used for surveillance of all users. The company switched to a set of features it calls Communication Safety, which is what blurs nude photos sent to children.

According to The Guardian newspaper, the UK's National Society for the Prevention of Cruelty to Children (NSPCC) says Apple is vastly undercounting incidents of CSAM in services such as iCloud, FaceTime, and iMessage. All US technology firms are required to report detected cases of CSAM to the National Center for Missing & Exploited Children (NCMEC), and in 2023, Apple made 267 reports.

Continue Reading on AppleInsider | Discuss on our Forums
https://appleinsider.com/articles/24/07/22/child-safety-watchdog-accuses-apple-of-hiding-real-csam-f...
News copyright owned by their original publishers | Copyright © 2004 - 2024 Zicos / 440Network