Watchdog accuses Apple of ‘clearly underreporting’ child sexual abuse materials (CSAM)

Monday, July 22, 2024, 20:32, by Mac Daily News
The UK’s National Society for the Prevention of Cruelty to Children (NSPCC) claimed Monday that Apple is failing to effectively monitor its platforms or scan for child sexual abuse materials (CSAM).
Katie McQue for The Guardian:


The UK’s National Society for the Prevention of Cruelty to Children (NSPCC) accuses Apple of vastly undercounting how often child sexual abuse material (CSAM) appears in its products. In a year, child predators used Apple’s iCloud, iMessage and Facetime to store and exchange CSAM in a higher number of cases in England and Wales alone than the company reported across all other countries combined, according to police data obtained by the NSPCC.
Through data gathered via freedom of information requests and shared exclusively with The Guardian, the children’s charity found Apple was implicated in 337 recorded offenses of child abuse images between April 2022 and March 2023 in England and Wales. In 2023, Apple made just 267 reports of suspected CSAM on its platforms worldwide to the National Center for Missing & Exploited Children (NCMEC), which is in stark contrast to its big tech peers, with Google reporting more than 1.47m and Meta reporting more than 30.6m, per NCMEC’s annual report.


MacDailyNews Take: Disingenuous poppycock.
Apple’s iMessage service is end-to-end encrypted. Apple cannot see data sent via its iMessage service. Apple’s Advanced Data Protection for iCloud allows users to protect important iCloud data, including iCloud Backup, Photos, Notes, and more. Apple cannot see data protected by Advanced Data Protection for iCloud.
Always be vigilant, as they are likely to keep trying, probably again using the “Think of the Children” trojan horse.
In December 2022, after much opposition, including, voluminously, from us here at MacDailyNews, Apple killed an effort to design an iCloud photo scanning tool for detecting child sexual abuse material (CSAM) in the storage service.
As we wrote previously:
This sounds wonderful at first glance (everyone’s for detecting and rooting out purveyors of child pornography) and horrible once you think about it for more than a second (massive, awful potential for misuse)… It’s a huge can of worms. It’s a backdoor, plain and simple, and it neatly negates Apple’s voluminous claims of protecting users’ privacy. It doesn’t matter what they’re scanning for, because if they can scan for one thing, they can scan for anything. – MacDailyNews, August 6, 2021

Originally, Apple planned to use a single database of hashes from the National Center for Missing & Exploited Children (NCMEC).

Then, after outcry, Apple changed its backdoor scanning to require matches against hash lists from “two or more child safety organizations operating in separate sovereign jurisdictions.”
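To make that concrete, here is a minimal, purely illustrative Python sketch of what such a multi-organization safeguard amounts to mechanically. The hash values, organization lists, threshold, and helper functions below are hypothetical and are not Apple’s code; the point is simply that “two or more jurisdictions” reduces to intersecting hash databases, with reporting gated on a match count.

```python
# Illustrative sketch only -- NOT Apple's actual implementation.
# The hash lists and names are hypothetical; Apple's 2021 design documents
# described a match threshold on the order of 30 before human review.

# Only hashes present in BOTH organizations' lists are eligible for matching,
# so (in theory) a single government cannot unilaterally insert a target hash.
org_a_hashes = {"hashA1", "hashA2", "hashShared1", "hashShared2"}  # jurisdiction 1 (hypothetical)
org_b_hashes = {"hashB1", "hashShared1", "hashShared2"}            # jurisdiction 2 (hypothetical)
eligible_hashes = org_a_hashes & org_b_hashes

REPORT_THRESHOLD = 30  # illustrative threshold

def count_matches(photo_hashes):
    """Count how many of a user's photo hashes appear in the intersected list."""
    return sum(1 for h in photo_hashes if h in eligible_hashes)

def should_flag(photo_hashes):
    """Flag an account for review only once the match count reaches the threshold."""
    return count_matches(photo_hashes) >= REPORT_THRESHOLD
```

As the sketch shows, the safeguard is only as strong as the independence of the organizations supplying the hash lists, which is the crux of the objection that follows.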

Of course, Apple’s multi-country “safeguard” is no safeguard at all.

The Five Eyes (FVEY) is an intelligence alliance comprising the United States, Australia, Canada, New Zealand, and the United Kingdom. These countries are parties to the multilateral UKUSA Agreement, a treaty for joint cooperation in signals intelligence.

The FVEY further expanded their surveillance capabilities during the course of the “war on terror,” with much emphasis placed on monitoring the World Wide Web. The former NSA contractor Edward Snowden described the Five Eyes as a “supra-national intelligence organization that does not answer to the known laws of its own countries.”

Documents leaked by Snowden in 2013 revealed that the FVEY has been spying on one another’s citizens and sharing the collected information with each other in order to circumvent restrictive domestic regulations on surveillance of citizens.

Apple’s claim to backdoor scan only for CSAM was intended to be a trojan horse, introduced via the hackneyed “Think of the Children” ruse, that would be bastardized in secret for all sorts of surveillance under the guise of “safety” in the future.

“Those who would give up essential Liberty, to purchase a little temporary Safety, deserve neither Liberty nor Safety.” — Benjamin Franklin

The fact that Apple ever considered this travesty in the first place, much less announced and tried to implement it in the fashion it did, has damaged the company’s reputation for protecting user privacy immensely, perhaps irreparably.
Hopefully, if Apple management has any sense whatsoever, is not hopelessly compromised, and can resist whatever pressure forced it into this ill-considered, abject disloyalty to customers who value their privacy and security, the company will end this disastrous scheme promptly and double down on privacy by finally and immediately enabling end-to-end encryption of iCloud backups, as a company that claims to be a champion of privacy would have done many years ago. – MacDailyNews, December 23, 2021

Please help support MacDailyNews. Click or tap here to support our independent tech blog. Thank you!
Support MacDailyNews at no extra cost to you by using this link to shop at Amazon.
The post Watchdog accuses Apple of ‘clearly underreporting’ child sexual abuse materials (CSAM) appeared first on MacDailyNews.
https://macdailynews.com/2024/07/22/watchdog-accuse-apple-of-clearly-underreporting-child-sexual-abu...

