VentureBeat, July 10, 2020
By Kyle Wiggers

Google is investigating ways AI might be used to ground natural-language instructions to smartphone app actions. In a study accepted to the 2020 Association for Computational Linguistics (ACL) conference, researchers at the company propose corpora to train models that would reduce the need to manually maneuver through apps, which could be especially useful for people with visual impairments.

When coordinating efforts and accomplishing tasks involving sequences of actions — for example, following a recipe to bake a birthday cake — people provide each other with instructions. With this in mind, the researchers set out to establish a baseline for AI agents that can help with similar interactions. Given a set of instructions, these agents would ideally predict a sequence of app...
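The grounding task the researchers describe can be illustrated with a deliberately simplified, rule-based sketch. The verb table, the step format, and the action names below are assumptions made for illustration only; the system described in the study learns this mapping from the proposed corpora rather than applying hand-written rules.

```python
import re

# Hypothetical verb-to-action table (an illustrative assumption, not the
# paper's actual action vocabulary).
ACTION_VERBS = {
    "open": "OPEN_APP",
    "tap": "CLICK",
    "click": "CLICK",
    "type": "INPUT_TEXT",
    "swipe": "SWIPE",
}

def ground_instruction(instruction: str) -> list:
    """Map each step of an instruction to an (action, target) pair.

    Steps are assumed to be separated by commas or the word 'then' --
    a toy convention chosen for this sketch.
    """
    steps = re.split(r",|\bthen\b", instruction.lower())
    actions = []
    for step in steps:
        words = step.strip().split()
        if not words:
            continue  # skip empty fragments left over from splitting
        verb, *rest = words
        action = ACTION_VERBS.get(verb, "UNKNOWN")
        actions.append((action, " ".join(rest)))
    return actions

print(ground_instruction("Open Settings, tap Wi-Fi, then tap Network name"))
```

Running the sketch on "Open Settings, tap Wi-Fi, then tap Network name" yields a sequence of action tuples such as ("OPEN_APP", "settings") followed by two "CLICK" actions, which is the shape of output an instruction-following agent would need to produce before executing it on a device.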
