Hesai announced the ETX on April 18, ahead of the Beijing Auto Show. It’s the first automotive lidar that outputs 3D points with color baked in at the pixel level, in one scan. No more stitching lidar and camera frames downstream.
What’s actually in the box
The trick is the Picasso SPAD-SoC — Hesai calls it a 6D chip because every point carries a 3D position plus three color channels (RGB). Photon detection efficiency tops 40%. The ETX series scales channel count across 1,080, 2,160, or 4,320 lines, up to 4K resolution. Mass production is slated for H2 2026, with a flagship-car debut in 2027.
If one sensor can read a red light, you don’t need a separate forward camera for semantics. That’s a perception-stack rewrite, which is why autonomous-driving Twitter lit up.
How an AI agent actually talks to it
Hesai ships ROS 1/2 drivers, the PandarSDK, and the Hesai Lidar SDK, with C++ and Python bindings. Colored point clouds stream directly into NVIDIA DriveOS and Isaac pipelines — Hesai is already the main lidar supplier for NVIDIA’s ADAS platform, so integration is mostly done. Any agent doing traffic-light recognition or drivable-area segmentation gets color as a native channel, not a post-hoc overlay.
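To make "color as a native channel" concrete, here is a minimal sketch of what a fused per-point record could look like on the wire. The field layout below is an assumption for illustration only — it mimics a common PointCloud2-style packing (float32 XYZ plus packed 8-bit RGB), not Hesai's actual driver format:

```python
import struct

# Hypothetical per-point record: x, y, z in meters (float32) plus
# 8-bit R, G, B and one padding byte -- 16 bytes per point.
# The real Hesai/ROS driver defines its own field layout.
POINT_FMT = "<fffBBBx"
POINT_SIZE = struct.calcsize(POINT_FMT)  # 16 bytes

def pack_point(x, y, z, r, g, b):
    """Serialize one colored point into a 16-byte record."""
    return struct.pack(POINT_FMT, x, y, z, r, g, b)

def unpack_points(buf):
    """Yield (x, y, z, (r, g, b)) tuples from a raw point buffer."""
    for off in range(0, len(buf), POINT_SIZE):
        x, y, z, r, g, b = struct.unpack_from(POINT_FMT, buf, off)
        yield x, y, z, (r, g, b)

# Example: a single red point 12.5 m ahead -- the RGB values arrive
# with the geometry, no camera-frame association step required.
buf = pack_point(12.5, 0.0, 1.4, 255, 0, 0)
(x, y, z, rgb), = unpack_points(buf)
```

The point is the data shape, not the exact bytes: a downstream segmentation or traffic-light model reads color from the same record as range, instead of reprojecting lidar points into a camera image and sampling pixels.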