MedTech Dive February 16, 2023
Nick Paul Taylor

Dive Brief:

  • Academics have called for the Food and Drug Administration (FDA) to create a regulatory process to prevent AI-driven software as a medical device (SaMD) from exacerbating health disparities.
  • Writing in the Journal of Science Policy & Governance, researchers at the University of Pennsylvania and Oregon Health & Science University warn that “AI-driven tools have the potential to codify bias in healthcare settings” but note the FDA lacks regulations to examine bias in AI healthcare software.
  • The researchers want the FDA to create a distinct regulatory process for AI devices and to appoint a panel of experts in “algorithmic justice and healthcare equity” to develop bias benchmarks and requirements.

Dive Insight:

Because AI is trained on existing...
