April 30, 2026 · Coding · Open Source · Agents

jcode trends on GitHub: a Rust harness that picks fights with Claude Code on memory

jcode hit GitHub Trending today at +411 stars in 24 hours, sitting at 1,648 total. Built by 1jehuang. The pitch is direct enough to be funny: same coding-agent ergonomics as Claude Code, but Rust under the hood, and it doesn't eat 386 MB of RAM to start.

The baseline-memory line in the README is the kind of swing that lands. jcode runs at 27.8 MB. The author claims competitors hit 214 to 386 MB and take 40 to 245 times longer to launch. Whether the multipliers hold under fair comparison is the kind of thing the HN crowd will pull apart by Monday. The point is that someone built it in Rust on the assumption that the existing harness layer was leaving performance on the floor, and people care enough to star it 411 times in a day.
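Anyone wanting to sanity-check a baseline-RSS claim like that can read it straight off the kernel. Below is a minimal, Linux-only sketch, not anything from jcode's codebase: the `rss_kb` helper is illustrative, and for a real comparison you would point it at the PID of a freshly launched agent rather than the current process.

```rust
// Hedged sketch: check resident memory on Linux by reading the VmRSS
// line from /proc/<pid>/status. Reads the current process ("self");
// substitute a real PID to measure a freshly launched agent.
use std::fs;

fn rss_kb(pid: &str) -> Option<u64> {
    let status = fs::read_to_string(format!("/proc/{pid}/status")).ok()?;
    status
        .lines()
        .find(|l| l.starts_with("VmRSS:"))          // e.g. "VmRSS:  27800 kB"
        .and_then(|l| l.split_whitespace().nth(1))  // the numeric field
        .and_then(|v| v.parse().ok())
}

fn main() {
    if let Some(kb) = rss_kb("self") {
        println!("resident set size: {:.1} MB", kb as f64 / 1024.0);
    }
}
```

Measuring at the same moment (immediately after launch, before any model call) for every harness is what "fair comparison" would mean here.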

The architecture choices are where this stops being a Claude Code clone. Memory is a semantic graph of embedded turns: every turn is embedded and queried against cosine-similar past turns, with no explicit recall tool calls needed. Multiple agents can spawn in the same repo with a server-side conflict-detection layer that notifies agent B when agent A edits a file it has read. Twenty-plus LLM providers are wired in, including Claude, OpenAI, Gemini, and local models. Self-modification: agents can rewrite jcode's own source and reload.
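The implicit-recall idea is simple to sketch. Assuming turns are stored with embedding vectors and retrieved by cosine similarity on every new turn, it looks roughly like this; the `Turn` struct and `recall` function are illustrative, not jcode's actual types.

```rust
// Sketch of implicit semantic recall over embedded turns: every new
// turn's embedding is compared against history, no recall tool call.
// (Illustrative types and toy 3-dim embeddings, not jcode internals.)
struct Turn {
    text: String,
    embedding: Vec<f32>, // produced by an embedding model in practice
}

fn cosine_sim(a: &[f32], b: &[f32]) -> f32 {
    let dot: f32 = a.iter().zip(b).map(|(x, y)| x * y).sum();
    let na = a.iter().map(|x| x * x).sum::<f32>().sqrt();
    let nb = b.iter().map(|x| x * x).sum::<f32>().sqrt();
    if na == 0.0 || nb == 0.0 { 0.0 } else { dot / (na * nb) }
}

/// Indices of the k past turns most similar to `query`, best first.
fn recall(history: &[Turn], query: &[f32], k: usize) -> Vec<usize> {
    let mut scored: Vec<(usize, f32)> = history
        .iter()
        .enumerate()
        .map(|(i, t)| (i, cosine_sim(&t.embedding, query)))
        .collect();
    scored.sort_by(|a, b| b.1.partial_cmp(&a.1).unwrap());
    scored.into_iter().take(k).map(|(i, _)| i).collect()
}

fn main() {
    let history = vec![
        Turn { text: "edited src/main.rs".into(), embedding: vec![1.0, 0.0, 0.0] },
        Turn { text: "ran the test suite".into(), embedding: vec![0.0, 1.0, 0.0] },
        Turn { text: "refactored main.rs".into(), embedding: vec![0.9, 0.1, 0.0] },
    ];
    let query = vec![1.0, 0.0, 0.0]; // new turn asking about main.rs
    let top = recall(&history, &query, 2);
    assert_eq!(top, vec![0, 2]); // the two main.rs-related turns rank first
    println!("recalled turns: {:?}", top);
}
```

Running retrieval on every turn instead of behind a tool call trades a little compute per turn for never depending on the model remembering to ask.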

The agent-client-layer thesis just got its fifth concrete entry in eight days. Warp open-sourced the Rust client. Devin shipped Terminal. OpenAI hit Bedrock. Mistral wrapped Vibe Remote Agents. jcode is the one nobody had on the bingo card: a single-author Rust project at 1.7k stars choosing to compete on memory and multi-agent collaboration instead of feature breadth. v0.11.1 shipped April 28. MIT license. https://github.com/1jehuang/jcode
