MedCity News January 12, 2021
Elise Reuter

AI tools increasingly occupy a regulatory gray area in healthcare. For clinicians to assess whether they are trustworthy, they need transparency on how they work, said panelists at CES.

A host of AI tools has been developed quickly in response to the Covid-19 pandemic, from algorithms that screen lung x-rays for signs of Covid-19 to triage tools that predict which patients will become critically ill. But how do we know they are working as planned?

Transparency about how these tools are developed and what they are intended to do is critical, experts said during a virtual panel at CES.

“You can’t just set a piece of software in front of somebody and say, ‘trust me,’ particularly if they need to make decisions based...

Topics: AI (Artificial Intelligence), Conferences / Trade shows, Provider, Technology, Trends
