NBC News January 14, 2023
David Ingram

A chat app for emotional support used a popular chatbot to write answers for humans to select. Controversy followed.

When people log in to Koko, an online emotional support chat service based in San Francisco, they expect to swap messages with an anonymous volunteer. They can ask for relationship advice, discuss their depression or find support for nearly anything else — a kind of free, digital shoulder to lean on.

But for a few thousand people, the mental health support they received wasn’t entirely human. Instead, it was augmented by robots.

In October, Koko ran an experiment in which GPT-3, a newly popular artificial intelligence chatbot, wrote responses either in whole or in part. Humans could edit the responses...
