Tencent rebuilt its Hunyuan training infra in February. By April 23 they had shipped Hy3 Preview (295B total parameters, 21B active, 256K context) and dropped the weights on Hugging Face, ModelScope and GitCode.
What it actually is
A frontier-scale Mixture-of-Experts model. Not a chat product, not an agent — raw weights. Built for agentic workflows, coding, and long-document tasks, with benchmarks aimed at DeepSeek-V4 and Qwen’s open MoE line.
Not theoretical either. Hy3 Preview already serves production traffic inside Yuanbao, CodeBuddy, WorkBuddy, ima, Tencent Docs and Peacekeeper Elite. Token usage 10x’d in the two weeks after launch.
How to actually use it
Tencent Cloud TokenHub and OpenRouter both expose Hy3 Preview through a standard chat-completions API. Pricing is aggressive: ¥1.2 per million input tokens, ¥4 per million output, and the launch window includes two weeks of free credits. That's cheap enough to benchmark it against Qwen3, DeepSeek and Kimi on your own workload.
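At those prices, per-request cost is back-of-envelope arithmetic, and any OpenAI-compatible client can hit the endpoint. A minimal sketch: the base URL is OpenRouter's real one, but the model slug and env-var name here are assumptions, not confirmed values.

```python
import os

# Quoted prices: ¥1.2 per million input tokens, ¥4 per million output tokens.
PRICE_IN = 1.2 / 1_000_000   # ¥ per input token
PRICE_OUT = 4.0 / 1_000_000  # ¥ per output token

def estimate_cost(input_tokens: int, output_tokens: int) -> float:
    """Back-of-envelope request cost in ¥ at the quoted rates."""
    return input_tokens * PRICE_IN + output_tokens * PRICE_OUT

# A 200K-token document plus a 2K-token answer:
print(f"¥{estimate_cost(200_000, 2_000):.3f}")  # ¥0.248

# Hedged call sketch: only runs if a key is set. The model slug
# "tencent/hy3-preview" is a guess; check the provider's model list.
if os.environ.get("OPENROUTER_API_KEY"):
    from openai import OpenAI  # OpenRouter speaks the OpenAI API
    client = OpenAI(
        base_url="https://openrouter.ai/api/v1",
        api_key=os.environ["OPENROUTER_API_KEY"],
    )
    resp = client.chat.completions.create(
        model="tencent/hy3-preview",
        messages=[{"role": "user", "content": "Summarize this repo's build system."}],
    )
    print(resp.choices[0].message.content)
```

Swapping the base URL and slug is all it takes to run the same harness against Qwen3 or DeepSeek for a like-for-like cost comparison.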
Typical use cases: long-context agent loops, repo-scale code review, document QA across hundreds of pages.
Why it matters: this is the first full open-source MoE from Tencent’s rebuilt infra. Open weights plus a Chinese big-tech badge — a combination the West currently can’t match.