VentureBeat February 7, 2025
Michael Nuñez

Hugging Face and Physical Intelligence quietly launched Pi0 (Pi-Zero) this week, the first foundation model for robots that translates natural language commands directly into physical actions.

“Pi0 is the most advanced vision language action model,” Remi Cadene, a principal research scientist at Hugging Face, announced in an X post that quickly gained attention across the AI community. “It takes natural language commands as input and directly outputs autonomous behavior.”

This release marks a pivotal moment in robotics: The first time a foundation model for robots has been made widely available through an open-source platform. Much like ChatGPT revolutionized text generation, Pi0 aims to transform how robots learn and execute tasks.
