Even Realities just turned its G2 glasses into a platform: 50 apps at launch and 2,000+ developers already building. That's a larger third-party catalog than Meta or Snap has put in front of real users.
What the hardware actually is
Prescription AI glasses. Monochrome micro-LED green HUD projected onto the lens, paired with the R1 functional ring for silent input. Light enough to wear all day — no bulky pods, no camera array on your forehead. It’s not VR, not notification-only. It’s a small reading surface that lives on your face and disappears when the HUD is off.
The SDK is the real story
Even Hub exposes sensors, HUD text rendering, R1 ring gestures, and voice through its API, and developers push apps to the glasses over the air. The day-one catalog covers teleprompting, real-time translation, turn-by-turn navigation, productivity, and wellness tracking. Any AI agent can now render text on the lens: a language model writing live captions during a conversation, or a background agent posting a line on the HUD the moment a long task finishes.
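To make the "agent text on the lens" idea concrete, here is a minimal sketch of how an app might paginate agent output for a small monochrome display. The column and row counts are illustrative assumptions, not G2 specs, and no real Even Hub API calls are shown; this only models the wrapping step an app would do before handing lines to the SDK.

```python
import textwrap

# Assumed HUD geometry for illustration only -- not actual G2 numbers.
HUD_COLS = 40   # characters per HUD line
HUD_ROWS = 5    # visible lines per page

def paginate_caption(text: str, cols: int = HUD_COLS, rows: int = HUD_ROWS) -> list[list[str]]:
    """Wrap free-form agent text into HUD-sized pages of lines."""
    lines = textwrap.wrap(text, width=cols)
    return [lines[i:i + rows] for i in range(0, len(lines), rows)]

if __name__ == "__main__":
    caption = ("Build finished: 312 tests passed in 4m12s. "
               "Two warnings in the translation module; see the log for details.")
    for page in paginate_caption(caption):
        print("\n".join(page))
        print("-" * HUD_COLS)
```

A real app would feed each page to whatever text-rendering call the SDK provides, advancing pages on an R1 ring gesture.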
Meta's Wearables Toolkit is still in preview. Snap Specs has an SDK, but the glasses are heavy and niche. Even Realities skipped the spectacle and shipped something you'd actually wear, making it the first consumer smart-glasses platform where third-party developers can reach real users.