
Researchers Say AI Tool Used in Hospitals Invents Things No One Ever Said

Monday, October 28, 2024, 16:25, by Slashdot
AmiMoJo shares a report: Tech behemoth OpenAI has touted its artificial intelligence-powered transcription tool Whisper as having near 'human level robustness and accuracy.' But Whisper has a major flaw: It is prone to making up chunks of text or even entire sentences, according to interviews with more than a dozen software engineers, developers and academic researchers.

Those experts said some of the invented text -- known in the industry as hallucinations -- can include racial commentary, violent rhetoric and even imagined medical treatments. Experts said that such fabrications are problematic because Whisper is being used in a slew of industries worldwide to translate and transcribe interviews, generate text in popular consumer technologies and create subtitles for videos.

It's impossible to compare Nabla's AI-generated transcript to the original recording because Nabla's tool erases the original audio for 'data safety reasons,' Nabla's chief technology officer Martin Raison said.

Read more of this story at Slashdot.
https://tech.slashdot.org/story/24/10/28/1510255/researchers-say-ai-tool-used-in-hospitals-invents-t...


News copyright owned by their original publishers | Copyright © 2004 - 2024 Zicos / 440Network