NBC News January 14, 2023
A chat app for emotional support used a popular chatbot to draft answers for humans to select. Controversy followed.
When people log in to Koko, an online emotional support chat service based in San Francisco, they expect to swap messages with an anonymous volunteer. They can ask for relationship advice, discuss their depression or find support for nearly anything else — a kind of free, digital shoulder to lean on.
But for a few thousand people, the mental health support they received wasn’t entirely human. Instead, it was augmented by robots.
In October, Koko ran an experiment in which GPT-3, a newly popular artificial intelligence language model, wrote responses either in whole or in part. Humans could edit the responses...