NBC News, January 14, 2023
David Ingram

An emotional support chat app used a popular chatbot to draft answers for humans to select. Controversy followed.

When people log in to Koko, an online emotional support chat service based in San Francisco, they expect to swap messages with an anonymous volunteer. They can ask for relationship advice, discuss their depression or find support for nearly anything else — a kind of free, digital shoulder to lean on.

But for a few thousand people, the mental health support they received wasn’t entirely human. Instead, it was augmented by robots.

In October, Koko ran an experiment in which GPT-3, a newly popular artificial intelligence language model, wrote responses either in whole or in part. Humans could edit the responses...
