Politico January 8, 2025
Daniel Payne and Carmen Paun

TECH MAZE

Artificial intelligence and humans can both be biased — but together, their biases can create a feedback loop, according to new research.

A feedback loop occurs when a system’s output is fed back into the system as input, which can then influence future output.
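To make the concept concrete, here is a minimal, purely illustrative sketch in Python. It is not the method from the Nature Human Behaviour study; the numbers, the averaging "AI," and the nudge effect are all hypothetical, chosen only to show how output fed back as input can compound an initial bias over repeated rounds.

```python
# Illustrative sketch of a human-AI bias feedback loop (hypothetical values).

def train_ai(human_judgments):
    """The 'AI' simply learns the average of the human judgments it sees."""
    return sum(human_judgments) / len(human_judgments)

def human_judgment(true_value, own_bias, ai_suggestion=None, ai_weight=0.3):
    """A human estimate: the true value shifted by the person's own bias,
    pulled partway toward the AI's suggestion when one is shown."""
    estimate = true_value + own_bias
    if ai_suggestion is not None:
        estimate = (1 - ai_weight) * estimate + ai_weight * ai_suggestion
    return estimate

TRUE_VALUE = 50.0   # e.g., a group's actual average performance score
HUMAN_BIAS = -2.0   # humans start out slightly underestimating (hypothetical)

# Round 0: humans judge without AI; the AI is then trained on those judgments.
judgments = [human_judgment(TRUE_VALUE, HUMAN_BIAS) for _ in range(100)]

for round_num in range(1, 6):
    ai_output = train_ai(judgments)   # the AI absorbs the humans' bias
    # The biased AI output now nudges human judgments further off the truth.
    judgments = [
        human_judgment(TRUE_VALUE,
                       HUMAN_BIAS + (ai_output - TRUE_VALUE) * 0.5,
                       ai_suggestion=ai_output)
        for _ in range(100)
    ]
    print(f"Round {round_num}: AI says {ai_output:.2f}, "
          f"humans now average {sum(judgments)/len(judgments):.2f} "
          f"(truth is {TRUE_VALUE})")
```

Running the sketch shows the estimates drifting further from the true value each round, the compounding pattern the researchers describe below.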

The study, published in Nature Human Behaviour, found that a human user's biases can grow when interacting with a biased AI system, more so than when working with other humans who hold similar biases.

For example, people interacting with biased AI systems were more likely to underestimate women’s performance or overestimate men’s probability of holding a high-status job.

That can create a “potential snowball effect” of bias that further amplifies human behavior, one researcher said...
