VentureBeat June 24, 2024
Ben Dickson

Researchers at the University of Tokyo and Alternative Machine have developed a humanoid robot system that can directly map natural language commands to robot actions. Named Alter3, the robot is designed to draw on the vast knowledge contained in large language models (LLMs) such as GPT-4 to perform complex tasks like taking a selfie or pretending to be a ghost.

This is the latest in a growing body of research that brings together the power of foundation models and robotics systems. While such systems have yet to mature into scalable commercial products, they have propelled robotics research forward in recent years and show considerable promise.

How LLMs control robots

Alter3 uses GPT-4 as the backend model....
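To make the idea concrete, below is a minimal sketch of how an LLM backend can translate a spoken or typed command into robot motion code. It assumes the OpenAI chat completions API; the set_axis joint interface, the prompt wording, and the value ranges are illustrative assumptions for this sketch, not the Alter3 team's published code.

```python
# Sketch: map a natural-language instruction to robot motion code via an LLM.
# The set_axis(axis_id, value) interface is a hypothetical placeholder.
from openai import OpenAI

client = OpenAI()  # assumes OPENAI_API_KEY is set in the environment

SYSTEM_PROMPT = (
    "You control a humanoid robot. Translate the user's instruction into a "
    "short Python program that calls set_axis(axis_id, value) for each joint, "
    "where axis_id is an integer and value is between 0 and 255. "
    "Return only the code."
)

def instruction_to_motion_code(instruction: str) -> str:
    """Ask the LLM to convert a natural-language command into motion code."""
    response = client.chat.completions.create(
        model="gpt-4",
        messages=[
            {"role": "system", "content": SYSTEM_PROMPT},
            {"role": "user", "content": instruction},
        ],
    )
    return response.choices[0].message.content

if __name__ == "__main__":
    # Example commands from the article: "take a selfie", "pretend to be a ghost"
    code = instruction_to_motion_code("Pretend to be a ghost.")
    print(code)  # review the generated code before running it on hardware
```

The key design point this sketch illustrates is that the LLM's world knowledge, rather than hand-coded motion routines, supplies the mapping from an abstract request to concrete joint commands.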
