free-claude-code gives Claude Code to everyone
free-claude-code hit 2,388 new stars in 24 hours, landing near the top of GitHub trending today. Total: 5.4K stars, 503 commits, 17 contributors. What it does: route Claude Code API calls to NVIDIA NIM, OpenRouter, DeepSeek, LM Studio, or llama.cpp, with no Anthropic key required.
The architecture is a transparent proxy that intercepts Anthropic-format requests from the Claude Code CLI or VSCode extension, translates them to OpenAI-compatible calls, and streams responses back. Per-model routing means Opus gets one backend, Sonnet another, Haiku a third. Five categories of trivial API calls get handled locally to save quota and latency. Thinking tokens, heuristic tool parsing, smart throttling, and Discord and Telegram bots for remote coding from mobile are all included. Fully local inference is supported via LM Studio and llama.cpp.
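The translation step is easy to picture. Here is a minimal sketch of what such a proxy has to do per request, assuming a routing table keyed on Claude model tiers; the function names, routing entries, and backend model ids below are illustrative, not taken from the repo's actual source:

```python
# Hypothetical sketch of the proxy's core translation step.
# ROUTES and the backend model ids are made-up examples, not the
# repo's real configuration.

# Per-model routing: each Claude tier maps to a different backend.
ROUTES = {
    "claude-3-opus": ("openrouter", "deepseek/deepseek-chat"),
    "claude-3-sonnet": ("nim", "meta/llama-3.1-70b-instruct"),
    "claude-3-haiku": ("lmstudio", "local-model"),
}

def translate_request(anthropic_req: dict) -> dict:
    """Convert an Anthropic Messages API payload into an
    OpenAI-style chat-completions payload for the routed backend."""
    model = anthropic_req["model"]
    # Match on prefix so dated model ids (e.g. ...-20240307) still route.
    backend, target = next(
        (v for k, v in ROUTES.items() if model.startswith(k)),
        ("openrouter", "deepseek/deepseek-chat"),  # default route
    )
    messages = []
    # Anthropic carries the system prompt as a top-level field;
    # OpenAI-compatible APIs expect it as the first message.
    if "system" in anthropic_req:
        messages.append({"role": "system", "content": anthropic_req["system"]})
    messages.extend(anthropic_req["messages"])
    return {
        "backend": backend,
        "model": target,
        "messages": messages,
        "max_tokens": anthropic_req.get("max_tokens", 1024),
        "stream": anthropic_req.get("stream", False),
    }
```

The real proxy also has to translate the streaming response back into Anthropic's server-sent-event format so the CLI never notices the swap, but the request side is essentially this shape-shuffling plus a lookup.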
This is one step past what Anthropic tried to sanction three days ago. On April 21 Anthropic officially moved against OpenClaw CLI, signaling that any client not running through their SDK was out of bounds. free-claude-code is the user-side response, and the star velocity says the community cares more about keeping Claude Code usable on any model than about Anthropic's preferred distribution.
The strategic picture for Anthropic just got complicated. Their interface is popular enough that users will build their own pipes to keep using it even without their models. Whether Anthropic treats this as flattery or infringement will tell us a lot about the next phase of the coding-agent wars. Meanwhile, if you have a local NVIDIA box or a DeepSeek V4 key, you now have a Claude Code front end without a subscription.
https://github.com/Alishahryar1/free-claude-code