Healthcare IT News October 30, 2024
The AP reports that OpenAI's Whisper transcription tool is prone to hallucinations, making up sentences and passages of text across millions of recordings. Tens of thousands of transcriptions could be faulty.
The Associated Press reported recently that it interviewed more than a dozen software engineers, developers and academic researchers who dispute a claim by artificial intelligence developer OpenAI that one of its machine learning tools, used for clinical documentation at many U.S. health systems, has human-like accuracy.
WHY IT MATTERS
Researchers at the University of Michigan and elsewhere found that AI hallucinations produced erroneous transcripts – sometimes containing racial commentary and violent rhetoric, along with imagined medical treatments – according to the...