Fast Company September 22, 2021
A report found that Apple is studying how to ascertain mood through iPhone usage, but emotion AI has a reputation for being pseudoscience.
New information about an ongoing study between UCLA and Apple shows that the iPhone maker is using facial recognition, speech patterns, and an array of other passive behavioral tracking to detect depression. The report, from Rolfe Winkler of The Wall Street Journal, raises concerns about the company’s foray into a field of computing called emotion AI, which some scientists say rests on faulty assumptions.
Apple’s depression study was first announced in August 2020. Earlier information about the study suggested the company was using only certain health data points, such as heart rate, sleep, and how a person...