VentureBeat July 10, 2020
Google is investigating ways AI might be used to ground natural language instructions in smartphone app actions. In a study accepted to the 2020 Association for Computational Linguistics (ACL) conference, researchers at the company propose corpora for training models that carry out app actions on a user's behalf, alleviating the need to maneuver through apps manually, which could be especially useful for people with visual impairments.
When coordinating efforts and accomplishing tasks that involve sequences of actions, such as following a recipe to bake a birthday cake, people provide each other with instructions. With this in mind, the researchers set out to establish a baseline for AI agents that can help with similar interactions. Given a set of instructions, these agents would ideally predict a sequence of app actions, as the sketch below illustrates.
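To make the task concrete, here is a minimal, hypothetical sketch in Python of what one instruction-to-actions training pair might look like. The field names and action vocabulary are illustrative assumptions, not the actual structure of Google's corpora or model.

```python
from dataclasses import dataclass


@dataclass
class UIAction:
    """One structured step an agent could execute on a phone screen.

    The operation types and fields here are assumptions for illustration.
    """
    operation: str       # e.g. "CLICK", "TYPE", "SWIPE"
    target: str          # description of the on-screen element to act on
    argument: str = ""   # free text, used by operations like "TYPE"


# One hypothetical training example: a natural language instruction
# paired with the ground-truth sequence of UI actions it describes.
example = {
    "instruction": "Open Settings, tap Wi-Fi, and turn it on.",
    "actions": [
        UIAction("CLICK", "Settings icon"),
        UIAction("CLICK", "Wi-Fi menu item"),
        UIAction("CLICK", "Wi-Fi toggle"),
    ],
}

# A trained agent would map the instruction (plus the elements visible
# on the current screen) to the next action at each step.
for step, action in enumerate(example["actions"], start=1):
    print(f"Step {step}: {action.operation} -> {action.target}")
```

The point of the sketch is the shape of the problem: free-form language on one side, a sequence of discrete, executable UI operations on the other, which is what corpora like those proposed in the study would supply as supervision.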