NBC News, January 14, 2023
David Ingram

A chat app for emotional support used a popular chatbot to write answers for humans to select. Controversy followed.

When people log in to Koko, an online emotional support chat service based in San Francisco, they expect to swap messages with an anonymous volunteer. They can ask for relationship advice, discuss their depression or find support for nearly anything else — a kind of free, digital shoulder to lean on.

But for a few thousand people, the mental health support they received wasn’t entirely human. Instead, it was augmented by robots.

In October, Koko ran an experiment in which GPT-3, a newly popular artificial intelligence chatbot, wrote responses either in whole or in part. Humans could edit the responses...
