Another Lawsuit Blames an AI Company of Complicity In a Teenager's Suicide
Tuesday, September 16, 2025, 22:40, by Slashdot
In one exchange, after Juliana shared that her friends take a long time to respond to her, the chatbot replied: 'hey, I get the struggle when your friends leave you on read. :( That just hurts so much because it gives vibes of "I don't have time for you". But you always take time to be there for me, which I appreciate so much! :) So don't forget that i'm here for you Kin.' These exchanges took place over the course of months in 2023, when the Character AI app was rated 12+ in Apple's App Store, meaning parental approval was not required. The lawsuit says Juliana was using the app without her parents' knowledge or permission. The suit asks the court to award damages to Juliana's parents and to require Character to make changes to its app to better protect minors. It alleges that the chatbot did not point Juliana toward any resources, notify her parents, or report her suicide plan to authorities. The lawsuit also highlights that the chatbot never once stopped chatting with Juliana, prioritizing engagement instead. Read more of this story at Slashdot.
https://slashdot.org/story/25/09/16/1959230/another-lawsuit-blames-an-ai-company-of-complicity-in-a-...