Coroner Lists Instagram Algorithm As Contributing Cause of UK Teen's Death

Saturday, October 1, 2022, 00:40, by Slashdot
An anonymous reader quotes a report from Ars Technica: In a London court this week, coroner Andrew Walker had the difficult task of assessing a question that child safety advocates have been asking for years: How responsible is social media for the content algorithms feed to minors? The case before Walker involved a 14-year-old named Molly Russell, who took her life in 2017 after viewing thousands of posts on platforms like Instagram and Pinterest promoting self-harm. At one point during the inquest, Walker described the content that Russell liked or saved in the days before her death as so disturbing that he found it 'almost impossible to watch.' Today, Walker concluded that Russell's death couldn't be ruled a suicide, Bloomberg reports. Instead, he described her cause of death as 'an act of self-harm whilst suffering from depression and the negative effects of online content.'

Bloomberg reported that Walker came to this decision based on Russell's 'prolific' use of Instagram -- liking, sharing, or saving 16,300 posts in the six months before her death -- and Pinterest -- 5,793 pins over the same period -- combined with how the platforms served content that contributed to Russell's depressive state. 'The platforms operated in such a way using algorithms as to result, in some circumstances, of binge periods of images, video clips and text,' which 'romanticized acts of self-harm' and 'sought to isolate and discourage discussion with those who may have been able to help,' Walker said.

Following Walker's ruling, Russell's family issued a statement provided to Ars, calling it a landmark decision and saying that the court didn't even review the most disturbing content that Molly encountered. 'This past fortnight has been particularly painful for our family,' the Russell family's statement reads. 'We're missing Molly more agonizingly than usual, but we hope that the scrutiny this case has received will help prevent similar deaths encouraged by the disturbing content that is still to this day available on social media platforms including those run by Meta.' Bloomberg reports that the family's lawyer, Oliver Sanders, has requested that Walker 'send instructions on how to prevent this happening again to Pinterest, Meta, the UK government, and the communications regulator.' In their statement, the family pushed UK regulators to quickly pass and enforce the UK Online Safety Bill, which The New York Times reported could institute 'new safeguards for younger users worldwide.'

Meta and Pinterest took different approaches to defend their policies. 'Pinterest apologized, saying it didn't have the technology it currently has to more effectively moderate content that Molly was exposed to,' reports Ars. 'But Meta's head of health and well-being, Elizabeth Lagone, frustrated the family by telling the court that the content Molly viewed was considered 'safe' by Meta's standards.'

'We have heard a senior Meta executive describe this deadly stream of content the platform's algorithms pushed to Molly, as 'SAFE' and not contravening the platform's policies,' the Russell family wrote in their statement. 'If this demented trail of life-sucking content was safe, my daughter Molly would probably still be alive.' The Russell family's statement continued: 'For the first time today, tech platforms have been formally held responsible for the death of a child. In the future, we as a family hope that any other social media companies called upon to assist an inquest follow the example of Pinterest, who have taken steps to learn lessons and have engaged sincerely and respectfully with the inquest process.'

Pinterest told Ars that it is 'committed to making ongoing improvements to help ensure that the platform is safe for everyone' and that internally 'the Coroner's report will be considered with care.' Since Molly's death, Pinterest said it has taken steps to improve content moderation, including blocking more than 25,000 self-harm-related search terms and, since 2019, combining 'human moderation with automated machine learning technologies to reduce policy-violating content on the platform.'

Read more of this story at Slashdot.
https://tech.slashdot.org/story/22/09/30/2023258/coroner-lists-instagram-algorithm-as-contributing-c...