OpenAI just launched a public competition that feels more like a hacker challenge than a corporate event. The premise is deceptively simple: build the best language model you can, but it has to fit — code, weights, and all — into 16 megabytes. That’s smaller than most smartphone photos. And your total training budget? Ten minutes on eight H100 GPUs. No more.
The challenge is called Parameter Golf, a nod to code golf where the goal is to solve problems in the fewest characters possible. Except here, the “fewer” applies to model parameters, and the stakes include $1 million in compute credits and, for standout participants, a job interview at OpenAI.
The Rules: 16MB, 10 Minutes, No Excuses
Parameter Golf launched on March 18, 2026, and runs through April 30. The constraints are strict and clearly defined:
- Total artifact size: 16,000,000 bytes maximum. This includes your training code and compressed model weights combined.
- Training time: Under 10 minutes on 8×H100 GPUs for leaderboard-eligible submissions.
- Evaluation metric: Bits per byte (BPB) on the FineWeb validation set — a tokenizer-agnostic measure, meaning you can use whatever tokenizer you want, and the scoring stays fair.
- Dataset: FineWeb with a 1024-token vocabulary, providing 80 training shards totaling 8 billion tokens.
- No network calls: During evaluation, your model cannot download anything. Everything must be self-contained in that 16MB package.
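Bits per byte is just cross-entropy reported per byte of raw text instead of per token, which is why it doesn't privilege any particular tokenizer. Here's a minimal sketch of the conversion; the function name and the example numbers are illustrative, not from the repo:

```python
import math

def bits_per_byte(total_nats: float, total_bytes: int) -> float:
    """Convert summed cross-entropy (in nats) over a validation slice
    into bits per byte of the underlying raw text."""
    return total_nats / (math.log(2) * total_bytes)

# A model averaging 2.5 nats/token over 1M tokens of text that
# occupies 4.2M raw bytes:
score = bits_per_byte(1_000_000 * 2.5, 4_200_000)
print(f"{score:.4f} BPB")  # -> 0.8587 BPB
```

Because the byte count in the denominator is fixed by the dataset, a different tokenizer only helps by lowering the total nats the model spends, so scores stay comparable across vocabularies.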
To participate, you fork the openai/parameter-golf GitHub repository, optimize your model within the constraints, and submit a pull request with your code, training logs, score, and a short write-up. For a submission to count as a new record, it must beat the current state-of-the-art by at least 0.005 nats with statistical significance (p < 0.01).
The naive baseline — a 9-layer, 512-dimension transformer with tied embeddings and 4 KV heads — scores 1.2244 BPB. An unconstrained 4-hour run by OpenAI researcher Will DePue hits 1.2074 BPB, a gap that suggests real headroom for constrained entries to claw back.
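It's worth doing the arithmetic on why 16MB is tight even for this small baseline. The sketch below fills in details the rules don't specify (8 query heads, a 4× MLP expansion, no biases, norms ignored as negligible), so treat the exact count as illustrative:

```python
# Back-of-the-envelope parameter count for the stated baseline:
# 9 layers, d_model=512, vocab=1024, tied embeddings, 4 KV heads.
d, layers, vocab = 512, 9, 1024
kv_heads, q_heads = 4, 8            # q_heads is an assumption
head_dim = d // q_heads             # 64

attn = d * d                        # W_q
attn += 2 * d * (kv_heads * head_dim)  # W_k, W_v (grouped-query)
attn += d * d                       # W_o
mlp = 2 * d * (4 * d)               # up- and down-projection, 4x MLP
per_layer = attn + mlp
embed = vocab * d                   # tied with the output head: counted once

total = layers * per_layer + embed
print(f"{total / 1e6:.1f}M params")              # -> 26.5M params
print(f"fp16 size: {2 * total / 1e6:.1f} MB")    # -> 53.0 MB
print(f"budget: {16e6 / total:.2f} bytes/param") # -> 0.60 bytes/param
```

Even at fp16, those ~26M parameters would need ~53MB, so a competitive entry at this scale has to store well under a byte per parameter — which is exactly why quantization and compression dominate the strategy space.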
Why This Competition Actually Matters
On the surface, Parameter Golf looks like a fun side project. But there’s a calculated strategy behind it — one that addresses several things OpenAI clearly cares about right now.
Talent scouting in disguise. OpenAI plans to hire a small group of junior researchers in June 2026, targeting students, recent graduates, and Olympiad medalists. Will DePue, who runs a research team at OpenAI, dropped out of college in 2022 after selling a company he’d co-founded in high school. Several people on his team don’t have formal ML education either. Parameter Golf is designed to surface exactly this kind of unconventional talent — people who can squeeze remarkable performance out of extreme constraints, regardless of their pedigree.
Competing for researchers in a brutal market. The Decoder reports that Meta has “repeatedly poached top researchers from OpenAI” with compensation packages reportedly reaching $300 million. Parameter Golf is a lower-cost way to identify and attract promising researchers before the bidding wars start.
The science of compression is genuinely useful. Making models smaller and more efficient isn’t just an academic exercise. Edge deployment, mobile AI, and cost reduction all depend on the same skills this competition tests. The techniques participants develop — quantization-aware training, aggressive parameter tying, depth recurrence, novel tokenizers — have direct applications in production systems.
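To make the storage side of quantization concrete, here is a toy post-training symmetric quantizer in plain Python. It illustrates the size math only, not quantization-aware training itself, and every name in it is invented for this sketch:

```python
import random

def quantize_symmetric(weights, bits=4):
    """Map floats to signed ints in [-(2**(bits-1)-1), 2**(bits-1)-1],
    keeping a single float scale per tensor."""
    qmax = 2 ** (bits - 1) - 1
    scale = max(abs(w) for w in weights) / qmax
    return [round(w / scale) for w in weights], scale

def dequantize(q, scale):
    return [v * scale for v in q]

random.seed(0)
w = [random.gauss(0, 0.02) for _ in range(1024)]
q, scale = quantize_symmetric(w, bits=4)
err = max(abs(a - b) for a, b in zip(w, dequantize(q, scale)))

# Round-to-nearest guarantees error of at most half a quantization step:
assert err <= scale / 2 + 1e-12
# Storage: 1024 weights at 4 bits (512 bytes) plus one 4-byte scale,
# versus 4096 bytes at fp32 -- roughly 8x smaller before entropy coding.
```

Quantization-aware training takes the same idea further by simulating this rounding during training so the model learns weights that survive it.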
From NanoGPT Speedrunning to Parameter Golf
Parameter Golf didn’t come out of nowhere. It’s a direct descendant of the NanoGPT Speedrunning challenge, a community-driven competition popularized by Andrej Karpathy (who, incidentally, has been covered on this site before). In that challenge, participants race to train a 124M-parameter GPT-2 model to a target validation loss as fast as possible on 8×H100s. The community has pushed training from 10 billion tokens down to 2.7 billion — a roughly 3.7× improvement in token efficiency.
Where NanoGPT Speedrunning optimizes for time (how fast can you reach a fixed loss?), Parameter Golf flips the axis: it optimizes for size (how good a model can you build within a fixed parameter budget?). The GitHub repo explicitly frames this as “L(N) optimization — achieving lowest loss given fixed parameter count, unconstrained by data, compute, or architecture.”
This is a meaningful distinction. The AI industry has spent the last few years chasing bigger models with more parameters and more compute. Parameter Golf pushes in the opposite direction, asking: what’s the minimum you actually need?
What Participants Get (Beyond Bragging Rights)
OpenAI, in partnership with RunPod, is offering $1 million in compute credits to help participants train their models. You can request credits through OpenAI’s challenge participant form. For context, an 8×H100 pod on RunPod costs roughly $20/hour, so the credits go a meaningful distance.
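At the quoted rate, it's easy to put numbers on "a meaningful distance" (the $20/hour figure comes from the article; actual RunPod pricing varies by region and availability):

```python
pod_rate = 20.0                    # assumed $/hour for an 8xH100 pod
run_cost = pod_rate * 10 / 60      # one 10-minute leaderboard run
pool_hours = 1_000_000 / pod_rate  # pod-hours the $1M credit pool buys

print(f"${run_cost:.2f} per run")      # -> $3.33 per run
print(f"{pool_hours:,.0f} pod-hours")  # -> 50,000 pod-hours
```

In other words, the shared pool covers on the order of hundreds of thousands of leaderboard-length runs; the scarce resource is ideas, not GPU time.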
The repo also provides multiple entry points to lower the barrier:
- Local development on Apple Silicon: An MLX training script lets you iterate on a Mac before submitting to the H100 leaderboard.
- RunPod templates: Pre-configured environments with all dependencies installed, available for both 1×H100 and 8×H100 setups.
- Discord support: Dedicated channels (#parameter-golf-discussions and #parameter-golf-announcements) on the OpenAI Discord server.
For the top performers, OpenAI is dangling something potentially more valuable than compute credits: job interviews. The company has been explicit that this is a recruiting pipeline, not just a competition. If your approach is creative enough, it may also be featured publicly, which is its own kind of career capital in the ML community.
How Parameter Golf Compares to Other AI Competitions
Parameter Golf occupies an unusual niche in the AI competition landscape. Here’s how it stacks up:
| Competition | Focus | Constraints | Organizer |
|---|---|---|---|
| Parameter Golf | Model compression (min loss in 16MB) | 16MB total, 10 min on 8×H100 | OpenAI |
| NanoGPT Speedrun | Training speed (min time to target loss) | Fixed model size (124M), 8×H100 | Community / Karpathy |
| SWE-bench | Code generation accuracy | No model-size limits | Princeton |
| Kaggle competitions | Varies (prediction tasks) | Varies | Kaggle |
| MLPerf | Training/inference benchmarks | Hardware-specific categories | MLCommons |
Most AI competitions today focus on accuracy at scale — throw more parameters and more data at the problem. Parameter Golf and NanoGPT Speedrun are unusual in that they reward efficiency and cleverness over brute force. And Parameter Golf is the only major competition specifically backed by a frontier AI lab that doubles as a hiring funnel.
The GitHub repository picked up 604 stars and 360 forks within its first day, suggesting strong community interest. The competition encourages a wide range of technical approaches: test-time compute, parameter tying, low-rank training, bitnets, novel compression schemes, and creative uses of the limited vocabulary.
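Of those approaches, low-rank training has the most transparent size math: replacing a dense d×d projection with two thin factors cuts its parameters by a factor of d/(2r). A quick sanity check, with dimensions matching the baseline's d=512 and an arbitrarily chosen rank:

```python
d, r = 512, 64
full = d * d           # dense d x d projection: 262,144 params
low_rank = 2 * d * r   # U (d x r) times V (r x d): 65,536 params
print(full / low_rank) # -> 4.0x fewer parameters at rank 64
```

The catch, of course, is that the factored matrix can only represent rank-r transformations, so the saving has to be traded against expressiveness.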
The Bigger Picture: OpenAI’s Shifting Priorities
Parameter Golf is part of a broader pattern at OpenAI. The company has been increasingly active in open-source and community engagement — from releasing Symphony, their agentic coding framework, to launching GPT-OSS open-weight models under Apache 2.0. After years of criticism for being “open” only in name, these moves suggest a genuine strategic shift.
Parameter Golf fits this narrative perfectly. The competition is fully open-source (MIT license), the leaderboard is public, and winning approaches will be shared with the community. It’s the kind of initiative that builds goodwill while simultaneously serving OpenAI’s business interests — finding researchers who think differently about model efficiency.
Whether this signals a lasting commitment to openness or just a tactical move in the talent wars remains to be seen. But for anyone with GPU access and ideas about model compression, the next six weeks offer a rare chance to get OpenAI’s attention — without needing a Stanford PhD or a $300 million poaching offer from Meta.
FAQ
How much does it cost to participate in OpenAI Parameter Golf?
Nothing upfront. OpenAI and RunPod are providing $1 million in shared compute credits. You can request credits through the challenge participant form. If you have Apple Silicon hardware, you can also develop locally using the provided MLX training script before submitting to the H100 leaderboard.
Who is eligible for the Parameter Golf challenge?
Anyone 18 or older in supported countries can participate. OpenAI is specifically targeting early-career researchers, undergraduate students, recent graduates, and Olympiad medalists, but the competition is open to anyone who meets the age and location requirements.
What programming language and tools are required?
The baseline is written in Python, and submissions must include a training script. The repo provides evaluation scripts, a fixed dataset, and pre-configured RunPod templates. All submissions must be open-source under the MIT license and hosted in a public GitHub repository.
Can I use pre-trained weights or external data?
No. The model must train from scratch within the 10-minute window on the provided FineWeb dataset. No external downloads or network calls are allowed during evaluation. Everything — code and weights — must fit within the 16MB limit.
How does Parameter Golf differ from the NanoGPT Speedrunning challenge?
NanoGPT Speedrunning optimizes for training speed — reaching a fixed validation loss target as quickly as possible with a 124M-parameter model. Parameter Golf optimizes for model quality under a fixed size constraint — achieving the lowest possible loss within 16MB. They test complementary skills: one rewards engineering speed, the other rewards architectural creativity and compression expertise.
