Alex Atallah has impeccable timing. In January 2022, Forbes pegged his stake in OpenSea at $2.2 billion. Six months later, he walked away from the NFT marketplace he co-founded. A few months after that, OpenSea’s daily trading volume collapsed 99% — from $2.7 billion to $9 million. The crypto crowd called him crazy for leaving. Turns out he was just early to the next thing.
That next thing is OpenRouter, and it just became a unicorn.
The Information reported on April 1st that OpenRouter is in talks to raise $120 million at a $1.3 billion post-money valuation, led by CapitalG — Google’s growth equity fund. That would more than double the company’s valuation from its previous round. The kicker: OpenRouter is now running at $50 million or more in annualized revenue, up from $10 million just six months ago. That’s 5x growth in half a year for what is essentially a routing layer sitting between developers and AI models.
Why 400+ Models Need a Traffic Cop
The pitch is deceptively simple. Instead of integrating separate SDKs for OpenAI, Anthropic, Google, DeepSeek, and dozens of other providers, developers hit one OpenAI-compatible API endpoint and get access to 400+ models from 60+ providers. One API key. One request format. Any model.
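That one-endpoint claim is easy to make concrete. A minimal sketch using only the standard library, targeting OpenRouter's documented OpenAI-compatible endpoint (the model slug and API key here are illustrative placeholders):

```python
import json

# OpenRouter exposes a single OpenAI-compatible chat endpoint; the model
# is selected by a string in the request body, not by a different SDK.
OPENROUTER_URL = "https://openrouter.ai/api/v1/chat/completions"

def build_request(model: str, prompt: str, api_key: str) -> tuple[dict, bytes]:
    """Return the headers and JSON body for an OpenAI-style chat request."""
    headers = {
        "Authorization": f"Bearer {api_key}",
        "Content-Type": "application/json",
    }
    body = json.dumps({
        "model": model,  # e.g. "deepseek/deepseek-chat" -- any listed slug works
        "messages": [{"role": "user", "content": prompt}],
    }).encode()
    return headers, body

# Same request shape for any of the 400+ models; only the slug changes.
headers, body = build_request("anthropic/claude-sonnet-4.5", "Hello", "sk-or-...")
```

Swapping providers means changing the `model` string and nothing else, which is the whole value proposition in one function.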
OpenRouter charges roughly 5% on top of inference costs for this convenience. That’s the entire business model. No enterprise software, no platform lock-in, no complicated pricing tiers. Just a toll booth on the busiest highway in AI.
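The toll-booth economics fit in a few lines. A sketch, assuming the roughly 5% markup cited above:

```python
TAKE_RATE = 0.05  # OpenRouter's approximate markup on top of inference cost

def total_charge(provider_cost_usd: float) -> float:
    """What the developer pays: the provider's inference cost plus the routing fee."""
    return round(provider_cost_usd * (1 + TAKE_RATE), 6)

# Every dollar of inference routed through the platform costs the developer
# about $1.05, and the nickel is the entire business model.
assert total_charge(1.00) == 1.05
```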
And that highway is getting busier fast. According to data from the a16z “State of AI” report — an empirical study based on 100 trillion tokens of real-world usage flowing through OpenRouter — the platform now processes over 30 trillion tokens per month and serves 5 million+ developers globally. To put that in perspective, the platform went from 10 trillion tokens per year to 100 trillion in roughly 12 months. That kind of growth doesn’t happen because the product is nice to have. It happens because the product becomes infrastructure.
The reason is structural. We went from a world where GPT-4 was the only serious option to a world where developers routinely switch between DeepSeek V3.2, Claude Sonnet 4.6, Gemini 3 Flash, Qwen3-Coder, and a dozen others depending on the task. The model landscape fractured, and OpenRouter positioned itself as the aggregation layer before anyone else realized aggregation was the game.
The Numbers Behind the Unicorn
Let’s unpack why CapitalG is writing a nine-figure check.
OpenRouter’s annualized inference spend — meaning the total dollar volume flowing through the platform to model providers — crossed $100 million by mid-2025, up from around $19 million at the end of 2024, and it has kept climbing since. Revenue is a roughly 5% cut of that flow, so ARR scales in lockstep with volume. That’s how you get to $50M+ without selling a single enterprise contract.
The funding history tells the story of acceleration. Atallah and co-founder Louis Vichy bootstrapped OpenRouter through 2023 and into early 2024. In June 2025, they announced a combined $40 million across a seed round led by Andreessen Horowitz and a Series A led by Menlo Ventures, with Sequoia participating. That valued the company at roughly $500 million. Now, less than a year later, CapitalG is leading a round that values it at $1.3 billion.
From $500M to $1.3B in under a year. From $10M ARR to $50M+ ARR in six months. Those are the kind of numbers that make VCs respond to cold emails.
The a16z connection goes deeper than just funding. Andreessen Horowitz published their State of AI report as a joint research project with OpenRouter, analyzing over 100 trillion tokens of usage data. That’s not something you do with a random portfolio company. That’s a signal that a16z sees OpenRouter as a fundamental piece of AI infrastructure — important enough to base their flagship AI research on.
What the Token Data Actually Reveals
The usage patterns on OpenRouter are a real-time map of where AI is heading, and some of the trends are surprising.
Open-source models now sit alongside the proprietary leaders at the top of OpenRouter’s charts. DeepSeek V3.2 leads with 1.23 trillion tokens processed, followed by Claude Sonnet 4.6 at 1.04 trillion, Gemini 3 Flash Preview at 997 billion, and Claude Opus 4.6 at 990 billion. The open-source models — including Qwen3-Coder, DeepSeek R2, and MiniMax M2.5 — are eating into proprietary model share at a rate nobody predicted two years ago.
The usage categories tell an equally interesting story. Anthropic’s Claude models are overwhelmingly used for programming and technical work — over 80% of Claude traffic on OpenRouter is code-related. DeepSeek, on the other hand, is dominated by roleplay and creative use cases. Same platform, completely different user bases, routed through the same API.
This fragmentation is exactly why OpenRouter works. No single model dominates every use case. DeepSeek V4 achieves roughly 90% of GPT-5.4’s performance at 1/50th the cost. For many developers that trade-off is obvious — but only if switching models doesn’t mean rewriting the integration. That’s the OpenRouter thesis in one sentence.
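The cost-versus-quality trade-off behind that choice can be made explicit. A sketch where the prices and quality scores are placeholder numbers mirroring the article's "90% of the performance at 1/50th the cost" framing, not real benchmark data:

```python
from dataclasses import dataclass

@dataclass
class Model:
    slug: str
    quality: float       # relative benchmark score; frontier model = 1.0
    usd_per_mtok: float  # blended price per million tokens

def pick(models: list[Model], min_quality: float) -> Model:
    """Return the cheapest model that clears the quality bar."""
    qualifying = [m for m in models if m.quality >= min_quality]
    return min(qualifying, key=lambda m: m.usd_per_mtok)

# Placeholder catalog: a frontier model and an open-weight challenger at
# ~90% of the quality for ~1/50th of the price.
catalog = [
    Model("frontier/model", quality=1.00, usd_per_mtok=10.00),
    Model("open-weight/model", quality=0.90, usd_per_mtok=0.20),
]
assert pick(catalog, min_quality=0.85).slug == "open-weight/model"
assert pick(catalog, min_quality=0.95).slug == "frontier/model"
```

When every model sits behind the same API, this kind of per-task selection is a one-line policy rather than an integration project.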
OpenRouter vs. the Competition — and Why It’s Winning
OpenRouter isn’t the only LLM gateway. LiteLLM is the open-source alternative — a Python proxy you self-host on Docker or Kubernetes, with no usage fees. Portkey positions itself as the enterprise-grade option with semantic caching, guardrails, and team-level cost attribution. There’s also Amazon Bedrock, Azure AI, and the cloud providers’ own model aggregation services.
So why is OpenRouter the one raising at a billion-dollar valuation?
Speed to value. You sign up, get an API key, and start making requests in under a minute. No infrastructure to manage, no DevOps team needed, no procurement process. For individual developers and small teams — which is most of the AI development ecosystem — that frictionless onboarding is everything.
LiteLLM gives you full control but demands you run and maintain infrastructure. That’s perfect for companies with platform engineering teams. It’s a non-starter for a solo developer shipping a side project at 2am. (Worth noting: LiteLLM recently suffered a supply chain security incident in a package with 95 million monthly downloads, which didn’t help its enterprise pitch.)
Portkey is more directly competitive, but it’s focused on production tooling — observability, tracing, budget controls. OpenRouter is focused on access. Different problems, different buyers, and right now the “I just need to call a model” crowd is much larger than the “I need to optimize my LLM traffic at scale” crowd.
The network effect matters too. With 5 million developers already on the platform and 400+ models integrated, OpenRouter has the most comprehensive model catalog in the market. New models often show up on OpenRouter within hours of launch. That responsiveness creates a flywheel: developers come for the selection, providers list on OpenRouter to reach those developers, and that reach attracts more developers.
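One practical payoff of having many providers behind one API is fallback routing: if the first choice is down or rate-limited, retry the same request against the next model. A client-side sketch (the transport here is a stub; in practice `send` would be an HTTP call to the router's endpoint):

```python
def call_with_fallback(models: list[str], send):
    """Try each model slug in order until one call succeeds.

    `send` is any callable that issues the actual request and raises on
    failure (provider outage, rate limit, etc.).
    """
    last_err = None
    for slug in models:
        try:
            return send(slug)
        except Exception as err:
            last_err = err  # remember the failure, try the next provider
    raise last_err

# Stub transport for illustration: the first provider "fails", the second answers.
def fake_send(slug):
    if slug == "provider-a/model":
        raise RuntimeError("503 from provider")
    return f"ok from {slug}"

result = call_with_fallback(["provider-a/model", "provider-b/model"], fake_send)
```

Because every model shares one request format, the fallback list is just an ordered list of strings; with per-provider SDKs, each entry would be a separate integration.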
The Bigger Bet: AI Infrastructure’s Middle Layer
What makes OpenRouter interesting beyond the financials is what it represents about the AI stack. We’re watching the emergence of a middleware layer that didn’t exist two years ago.
Think about what happened with cloud computing. AWS didn’t just sell servers — an entire ecosystem of tools grew up around it. Load balancers, CDNs, monitoring services, API gateways. The same thing is happening with AI inference. Model providers are the compute layer. Applications are the consumption layer. And OpenRouter is betting it can own the routing layer in between.
Alex Atallah saw this pattern before. OpenSea wasn’t the first NFT project, but it became the marketplace layer that connected creators to buyers. OpenRouter isn’t the first LLM API, but it’s becoming the routing layer that connects developers to models. Same playbook, different technology wave.
The $1.3 billion question is whether this layer stays valuable as the market matures. If one model eventually dominates everything — the way Google Search dominated web discovery — there’s no need for a router. But the trend line points in the opposite direction. Models are getting more specialized, not less. Open-source is fragmenting the market, not consolidating it. Every new model release makes OpenRouter slightly more essential.
From a $2.2 billion NFT fortune to a $1.3 billion AI infrastructure company, Atallah keeps ending up in the right place at the right time. But this time it’s not just timing. OpenRouter is sitting on a dataset of 100 trillion tokens of real-world AI usage — the kind of data that lets you understand how the entire industry actually works, not just how one model performs. That might be the most valuable asset of all.