Medical Xpress, August 5, 2024
Daniel Strain, University of Colorado at Boulder

Some artificial intelligence tools for health care may get confused by the ways people of different genders and races talk, according to a new study led by CU Boulder computer scientist Theodora Chaspari.

The study hinges on a perhaps unspoken reality of human society: Not everyone talks the same. Women, for example, tend to speak at a higher pitch than men, while similar differences can pop up between, say, white and Black speakers.

Now, researchers have found that those natural variations could confound algorithms that screen people for mental health concerns like anxiety or depression. The results add to a growing body of research showing that AI, just like humans, can make assumptions based on race or gender.

“If AI...
