Top AI Product

We track trending AI tools across Product Hunt, Hacker News, GitHub, and more — then write honest, opinionated takes on the ones that actually matter. No press releases, no sponsored content. Just real picks, published daily. Subscribe to stay ahead without drowning in hype.


AstrBot Crosses 22K GitHub Stars as Developers Flock to Its 18-Platform AI Chatbot Framework

If you’ve ever tried to deploy an AI chatbot across multiple messaging platforms, you know the pain. QQ has its own protocol. WeChat Work needs a different integration. Telegram has its Bot API. Feishu, DingTalk, Slack, Discord — each one demands its own adapter, its own auth flow, its own message format. Building a single AI assistant that works across all of them usually means maintaining a dozen separate codebases, or giving up and picking just one platform.

AstrBot is an open-source project that decided to solve this by building a unified infrastructure layer. Write your bot logic once, connect it to 18+ messaging platforms, plug in any LLM provider you want, and manage everything from a single WebUI. The project just crossed 22,700 GitHub stars, ranked #6 on GitHub Trending this week with a trend score of 9,889, and has been picked up by accounts like @PythonHub on Twitter/X. For a project that started in the Chinese developer community, its growth has been remarkably fast.

What AstrBot Actually Does

At its core, AstrBot is a Python-based chatbot infrastructure framework. It sits between your LLM providers on one side and your messaging platforms on the other, handling the translation layer so you don’t have to.
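That translation layer is, at heart, the adapter pattern. Here is a minimal, hypothetical sketch of the idea — these class and method names are invented for illustration and are not AstrBot's actual API:

```python
from dataclasses import dataclass

# The unified message the bot logic sees, regardless of platform.
@dataclass
class UnifiedMessage:
    sender: str
    text: str

class PlatformAdapter:
    """Translate a platform-specific payload to/from the unified format."""
    def to_unified(self, payload: dict) -> UnifiedMessage:
        raise NotImplementedError
    def from_unified(self, reply: str) -> dict:
        raise NotImplementedError

class TelegramAdapter(PlatformAdapter):
    def to_unified(self, payload: dict) -> UnifiedMessage:
        # Telegram Bot API nests the text under "message".
        msg = payload["message"]
        return UnifiedMessage(sender=str(msg["from"]["id"]), text=msg["text"])
    def from_unified(self, reply: str) -> dict:
        return {"method": "sendMessage", "text": reply}

class SlackAdapter(PlatformAdapter):
    def to_unified(self, payload: dict) -> UnifiedMessage:
        # Slack Events API puts the text under "event".
        return UnifiedMessage(sender=payload["event"]["user"],
                              text=payload["event"]["text"])
    def from_unified(self, reply: str) -> dict:
        return {"text": reply}

def handle(adapter: PlatformAdapter, payload: dict) -> dict:
    """Bot logic is written once, against UnifiedMessage only."""
    msg = adapter.to_unified(payload)
    reply = f"Echo to {msg.sender}: {msg.text}"
    return adapter.from_unified(reply)
```

Adding a 19th platform then means writing one more adapter, not another bot.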

Here’s what that means in practice:

LLM flexibility. AstrBot supports OpenAI, Anthropic Claude, Google Gemini, DeepSeek, Moonshot AI, Zhipu AI, and self-hosted models through Ollama and LM Studio. It also integrates with LLMOps platforms like Dify, Alibaba Cloud Bailian, and Coze, so if you’ve already built agent workflows on those platforms, AstrBot can serve as the delivery layer.

Multimodal support. Beyond text, AstrBot handles image processing, speech-to-text (via Whisper and SenseVoice), and text-to-speech (via OpenAI TTS, Gemini TTS, Edge TTS, and others). You can build voice-enabled bots or image-understanding assistants without bolting on separate services.

Knowledge base and persona. You can configure a custom persona for your bot and attach a knowledge base for RAG (retrieval-augmented generation). The auto context compression feature keeps conversations manageable without losing important context — a practical solution for long-running chat sessions.

Skills system. AstrBot has a Skills framework that lets you encapsulate specific capabilities — web search, code execution, structured data extraction — into reusable modules that the bot can invoke during conversations.
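The Skills idea — named, reusable capabilities the bot can invoke mid-conversation — can be sketched as a simple registry. This is a generic illustration under assumed names, not AstrBot's actual Skills API:

```python
from typing import Callable
import re

# Registry of named skills the bot can invoke during a conversation.
SKILLS: dict[str, Callable[[str], str]] = {}

def skill(name: str):
    """Decorator that registers a function as a reusable, named skill."""
    def decorator(fn: Callable[[str], str]):
        SKILLS[name] = fn
        return fn
    return decorator

@skill("web_search")
def web_search(query: str) -> str:
    # Stub: a real skill would call a search API here.
    return f"results for {query!r}"

@skill("extract_numbers")
def extract_numbers(text: str) -> str:
    # Structured data extraction: pull digits out of free text.
    return ",".join(re.findall(r"\d+", text))

def invoke(name: str, arg: str) -> str:
    """What the framework does when the LLM decides to call a skill."""
    if name not in SKILLS:
        return f"unknown skill: {name}"
    return SKILLS[name](arg)
```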

18+ Platforms, One Bot

The platform coverage is where AstrBot stands out most clearly. Here’s the full list of officially supported integrations:

  • Chinese platforms: QQ (via OneBot v11), WeChat Work (applications and intelligent robots), WeChat Official Accounts, WeChat customer service, Feishu (Lark), DingTalk
  • International platforms: Telegram, Discord, Slack, LINE, WhatsApp (coming soon)
  • Protocol-based: Satori, Misskey
  • Community-maintained: Matrix, KOOK, VoceChat

For teams operating in China, this coverage is hard to match. QQ alone has hundreds of millions of active users. WeChat Work is the default enterprise communication tool for countless Chinese companies. Feishu and DingTalk are the two dominant enterprise collaboration platforms. No other open-source chatbot framework covers all of these with official, maintained adapters.

For international users, the Telegram, Discord, and Slack integrations are solid, and WhatsApp support is on the roadmap. The Satori protocol support also means AstrBot can theoretically connect to any platform that implements the Satori standard.

The Agent Sandbox: Running Code Without the Risk

One of AstrBot’s more interesting technical features is the Agent Sandbox — an isolated execution environment where AI agents can run Python code, execute shell commands, and perform file operations without touching the host system.

The sandbox comes in two implementations: the original “Shipyard” driver and the newer “Shipyard Neo,” which uses container-based isolation with profile-based configuration. Shipyard Neo supports Skills self-iteration, meaning agents can refine their own capabilities during a session.

Why does this matter? When you give an LLM the ability to execute code — which is increasingly what “agentic” means in practice — you need guardrails. AstrBot’s approach uses session-level resource reuse within isolated containers, so each conversation gets its own sandbox that persists across turns but can’t affect other sessions or the host.
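The core guardrail — run untrusted code somewhere it can't hurt you, with a hard time limit — can be illustrated with a subprocess. This is only the general idea, not AstrBot's container-based Shipyard driver; a real sandbox also isolates the filesystem, network, and resource limits:

```python
import subprocess
import sys
import tempfile

def run_sandboxed(code: str, timeout: float = 5.0) -> str:
    """Run untrusted Python in a separate process with a wall-clock timeout.

    Illustrative only: process isolation plus a timeout, in a throwaway
    working directory that is discarded afterwards.
    """
    with tempfile.TemporaryDirectory() as workdir:
        try:
            result = subprocess.run(
                [sys.executable, "-I", "-c", code],  # -I: Python isolated mode
                cwd=workdir,                          # scratch dir, deleted after
                capture_output=True,
                text=True,
                timeout=timeout,
            )
        except subprocess.TimeoutExpired:
            return "error: timed out"
    if result.returncode != 0:
        return f"error: {result.stderr.strip().splitlines()[-1]}"
    return result.stdout
```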

Combined with MCP (Model Context Protocol) support, this creates a framework where agents can not only chat but take actions: search the web, process files, query databases, or call external APIs, all within a controlled environment. AstrBot also provides whitelist management, giving administrators complete control over what the agent can and cannot do.
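A whitelist gate of this kind reduces to a membership check before dispatch. A minimal sketch with invented tool names — not AstrBot's actual whitelist configuration:

```python
# Hypothetical whitelist: the agent may only invoke approved tools.
ALLOWED_TOOLS = {"web_search", "read_file"}

def call_tool(name: str, *args):
    """Gatekeeper the agent must go through for every tool invocation."""
    if name not in ALLOWED_TOOLS:
        raise PermissionError(f"tool {name!r} is not whitelisted")
    # Dispatch to the real tool implementation here.
    return f"{name} called with {args}"
```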

AstrBot vs. the Competition

The open-source chatbot space has several notable players. Here’s how AstrBot compares:

AstrBot vs. OpenClaw. OpenClaw is the 800-pound gorilla in the open-source agent space with 142,000+ GitHub stars. But the two projects serve fundamentally different purposes. OpenClaw is action-first — it’s built for autonomous task execution. AstrBot is conversation-first — it’s built for deploying AI chat assistants across messaging platforms. OpenClaw runs on JavaScript/Node.js and excels at workflows and task automation. AstrBot runs on Python and excels at multi-platform chat delivery. If you need an agent that books flights and manages calendars, OpenClaw is your tool. If you need an AI assistant that works in your company’s Feishu groups, DingTalk channels, and QQ communities simultaneously, AstrBot is the better fit.

AstrBot vs. Dify. This comparison comes up frequently, but they’re actually complementary rather than competitive. Dify is a visual AI workflow builder — think of it as the brain. AstrBot is the delivery layer — think of it as the mouth. You can build sophisticated agent workflows in Dify and then use AstrBot to deploy them across messaging platforms. AstrBot’s native Dify integration makes this straightforward.

AstrBot vs. NanoBot. NanoBot is the lightweight alternative: 4,000 lines of Python, a single pip install, and 8 messaging channels including some Chinese platforms. If AstrBot feels like overkill for your use case, NanoBot offers a simpler path — but with a much smaller plugin ecosystem and fewer platform integrations.

AstrBot vs. LangBot. LangBot is AstrBot’s closest direct competitor, targeting similar use cases with similar platform support. Both support QQ, WeChat, Telegram, and Feishu. AstrBot currently has more GitHub stars and a larger plugin ecosystem (1,000+ vs. LangBot’s smaller collection), plus a simpler setup process according to community comparisons.

1,000+ Plugins and a WebUI That Actually Works

AstrBot’s plugin ecosystem has crossed the 1,000 mark, with a dedicated plugin marketplace for browsing and one-click installation. Plugins extend the bot’s capabilities — from custom commands and mini-games to integrations with specific services and data sources.

The built-in WebUI handles bot configuration, plugin management, conversation monitoring, and platform connection setup. There’s also a ChatUI for direct interaction and testing. For developers who prefer API-level access, AstrBot exposes Webhook and HTTP API endpoints.

Deployment is flexible. The fastest path is the uv one-click install (a single command-line invocation). Docker and Docker Compose are available for production setups. There’s a desktop application for local use, panel-based deployment options (BT-Panel, 1Panel, CasaOS for NAS devices), and even an AUR package for Arch Linux users. The project has also been packaged on PyPI, so pip install astrbot works too.
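For the Docker Compose route, a deployment can be as small as the fragment below. The image name, WebUI port, and data path here are assumptions from memory — check the project's README for the current values before using them:

```yaml
# Sketch of a Docker Compose deployment for AstrBot.
services:
  astrbot:
    image: soulter/astrbot:latest   # assumed image name
    restart: unless-stopped
    ports:
      - "6185:6185"                 # assumed WebUI port
    volumes:
      - ./data:/AstrBot/data        # assumed persistence path
```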

Who’s Using AstrBot and Why

AstrBot’s user base falls into a few distinct categories:

Community managers who need an AI assistant present across multiple platforms where their community lives — a common scenario in Chinese tech communities that span QQ groups, WeChat groups, and Telegram channels simultaneously.

Small businesses that want AI-powered customer service across WeChat Official Accounts, DingTalk, and other platforms without paying for enterprise chatbot solutions. Since AstrBot is free under the AGPL-3.0 license, the only cost is hosting and LLM API usage.

Developers and hobbyists building personal AI assistants with custom personas, knowledge bases, and skills. The plugin system and persona configuration make this accessible without deep technical knowledge.

Enterprise teams using it as middleware to connect their existing Dify or Coze workflows to internal communication platforms like Feishu and WeChat Work.

The project maintains 10 official QQ groups and an active Discord server, with documentation available in English, Chinese, Japanese, French, and Russian — reflecting its increasingly international user base.

FAQ

Is AstrBot free?
Yes. AstrBot is a non-profit open-source project licensed under AGPL-3.0. The software itself is completely free. Your only costs are hosting (if you deploy on a server) and LLM API fees from whichever provider you choose. You can also use free local models via Ollama or LM Studio to eliminate API costs entirely.

What LLM providers does AstrBot support?
AstrBot works with OpenAI (and compatible APIs), Anthropic Claude, Google Gemini, DeepSeek, Moonshot AI, Zhipu AI, and self-hosted models through Ollama and LM Studio. It also connects to LLMOps platforms including Dify, Alibaba Cloud Bailian, and Coze. OneAPI and SiliconFlow are supported as API gateway options as well.

How does AstrBot compare to building separate bots for each platform?
The main advantage is unified management. Instead of maintaining separate bot codebases for QQ, WeChat, Telegram, etc., you configure all platforms from one WebUI, share the same LLM backend, knowledge base, and plugin set across all of them, and manage conversations from a single dashboard. For teams operating across 3+ platforms, this reduces maintenance overhead significantly.

Can AstrBot handle enterprise workloads?
AstrBot is designed for both personal and enterprise use. Docker/Docker Compose deployment, the Agent Sandbox for safe code execution, whitelist-based access control, and integration with enterprise platforms like WeChat Work, Feishu, and DingTalk all point toward enterprise readiness. However, as an open-source project, you’ll need to handle scaling, monitoring, and SLA guarantees yourself — there’s no commercial support tier currently available.

What programming knowledge do I need to use AstrBot?
For basic setup — connecting platforms, configuring LLM providers, installing plugins — the WebUI handles most tasks without code. For custom plugin development or advanced configuration, Python knowledge is needed. The project’s documentation covers both paths, and the 1,000+ existing plugins mean many common use cases are already covered.

