Top AI Product

Every day, hundreds of new AI tools launch across Product Hunt, Hacker News, and GitHub. We dig through the noise so you don't have to — surfacing only the ones worth your attention with honest, no-fluff reviews. Explore our latest picks, deep dives, and curated collections to find your next favorite AI tool.


Physical Intelligence π0.7 controls UR5e, Franka, and PI Droid from one cloud API

Physical Intelligence dropped π0.7 on April 16. The viral demo: a bimanual UR5e rig folding laundry with zero training data on that exact hardware. The same weights also pull espresso shots, bag groceries, and figure out an air fryer that appeared in training exactly twice.

Dumb arms, cloud brain

π0.7 isn’t a robot — it’s a brain that ships actions to robots. The supported physical rigs are bimanual tabletop setups, Franka research arms, the open-source DROID Franka platform, and Physical Intelligence’s own PI Droid humanoid. None of them run inference locally. The arm hits a cloud endpoint over WebSocket, gets back ~100ms of joint actions, executes the chunk, and pre-fetches the next one mid-motion. Physical Intelligence calls it Real-Time Action Chunking. The hardware stays cheap; the intelligence stays in a data center.
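The execute-while-prefetching pattern can be sketched in a few lines. This is a minimal illustration of the idea, not Physical Intelligence's client: `query_policy` stands in for the WebSocket round trip, and the chunk length, 7-DoF action shape, and control rate are assumptions for the example.

```python
import threading
import numpy as np

CHUNK_STEPS = 10  # ~100 ms of actions at an assumed 100 Hz control rate

def query_policy(obs):
    """Stand-in for the cloud call: a real client would send `obs`
    over WebSocket and receive a (CHUNK_STEPS, dof) action array."""
    return np.zeros((CHUNK_STEPS, 7))  # dummy 7-DoF joint targets

def run(num_chunks=3):
    obs = {"images": None, "prompt": "fold the towel"}
    executed = []
    chunk = query_policy(obs)
    for _ in range(num_chunks):
        # Pre-fetch the next chunk in the background while the
        # current one is still being executed on the arm.
        result = {}
        fetch = threading.Thread(
            target=lambda: result.update(nxt=query_policy(obs)))
        fetch.start()
        for action in chunk:        # "execute" ~100 ms of joint targets
            executed.append(action)
        fetch.join()
        chunk = result["nxt"]
    return np.array(executed)
```

Because the next chunk arrives before the current one finishes, the arm never idles waiting on the network, which is what lets a cheap rig run on a remote brain.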

The API any agent can call

The inference stack is open-sourced as openpi. One Python call — policy.infer({images, prompt})["actions"] — returns a chunk of joint trajectories. Send “pick up the fork,” or wire Claude or GPT to emit high-level goals like “clean the kitchen” and let π0.7 stream out the next 100ms of motion. Pre-trained base checkpoints cover 10k+ hours of robot data; per-platform expert checkpoints layer on top for fine-tuning.
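To make the call shape concrete, here is a hedged sketch of wiring a high-level planner to that one-call interface. `StubPolicy` is a stand-in for openpi's remote policy client, and the observation keys, image size, and action dimensions are illustrative assumptions, not the published spec.

```python
import numpy as np

class StubPolicy:
    """Stand-in for an openpi remote policy; infer() mirrors the
    policy.infer({images, prompt})["actions"] shape from the text."""
    def infer(self, obs):
        assert "images" in obs and "prompt" in obs
        return {"actions": np.zeros((10, 7))}  # (steps, joint dims)

policy = StubPolicy()

# An LLM planner (Claude, GPT, ...) could decompose "clean the
# kitchen" into subgoals and stream each one in as the prompt:
subgoals = ["pick up the sponge", "wipe the counter"]
chunks = []
for goal in subgoals:
    obs = {
        "images": {"cam0": np.zeros((224, 224, 3), np.uint8)},
        "prompt": goal,
    }
    chunks.append(policy.infer(obs)["actions"])  # next ~100 ms of motion
```

The division of labor is clean: the language model handles task decomposition, π0.7 handles the low-level joint trajectories.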

On UR5e laundry folding, the generalist π0.7 matched first-attempt human teleoperators and beat the prior π*0.6 RL specialist trained only for that task.

