Researchers Say AI Tool Used in Hospitals Invents Things No One Ever Said
Monday, October 28, 2024, 16:25, by Slashdot
The researchers said some of the invented text -- known in the industry as hallucinations -- can include racial commentary, violent rhetoric, and even imagined medical treatments. Such fabrications are problematic, the experts said, because Whisper is used in a slew of industries worldwide to translate and transcribe interviews, generate text in popular consumer technologies, and create subtitles for videos. It is impossible to compare Nabla's AI-generated transcripts with the original recordings because Nabla's tool erases the original audio for 'data safety reasons,' the company's chief technology officer, Martin Raison, said. Read more of this story at Slashdot.
https://tech.slashdot.org/story/24/10/28/1510255/researchers-say-ai-tool-used-in-hospitals-invents-t...