Healthcare IT News October 30, 2024
Andrea Fox

The AP reports that OpenAI’s Whisper transcription tool is prone to hallucinations, making up sentences and passages of text; across millions of recordings, tens of thousands of transcriptions could be faulty.

The Associated Press recently reported that it interviewed more than a dozen software engineers, developers and academic researchers who dispute artificial intelligence developer OpenAI’s claim that Whisper, a machine learning tool used in clinical documentation at many U.S. health systems, has human-like accuracy.

WHY IT MATTERS

Researchers at the University of Michigan and others found that AI hallucinations resulted in erroneous transcripts – sometimes containing racial and violent rhetoric, in addition to imagined medical treatments – according to the...

Topics: AI (Artificial Intelligence), Technology