The TL;DR
On March 31, 2026, Anthropic accidentally shipped the entire source code of Claude Code — 512,000 lines of TypeScript across 1,906 files — to the public npm registry. The cause was a single misconfigured .npmignore file that failed to exclude source map (.map) files from the published package. Those source maps pointed to a publicly accessible .zip file hosted on Anthropic's own Cloudflare R2 bucket, containing the full human-readable source.
Within hours, the code had been downloaded, forked, and mirrored, and was spawning GitHub repositories faster than DMCA notices could be filed. One mirror became the fastest repository in GitHub's history to hit 50,000 stars, doing so in under two hours. Anthropic's reaction: pull the npm package, issue the standard "human error, not a security breach" statement, and start sending DMCA takedowns.
By the numbers: 512,000+ lines of code, 1,906 TypeScript files, 59.8 MB source map, 44 hidden feature flags, $2.5 billion Claude Code ARR, 16 million views on the original discovery tweet.
The Technical Root Cause
This is both the most embarrassing and most instructive part of the story.
When you publish a JavaScript or TypeScript package to npm, your build toolchain can optionally generate source map files (ending in .map). These files exist solely for debugging — they map the minified production bundle back to the original readable source. A source map should never, ever ship to users.
The standard way to exclude them is with an .npmignore file, or a files field in package.json. Claude Code's .npmignore was missing the entry for *.map files. So when the package was built and published, the 59.8 MB source map went along with it.
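Of the two mechanisms, a `files` allowlist in package.json is the safer default: only what is explicitly listed ever ships, so a forgotten exclusion cannot leak anything. A minimal sketch (the package name and paths here are hypothetical, not Claude Code's actual layout):

```shell
# Hypothetical package.json using a "files" allowlist instead of .npmignore.
# Anything not listed -- including *.map -- is simply never packed.
cat > package.json <<'EOF'
{
  "name": "example-cli",
  "version": "1.0.0",
  "main": "dist/main.js",
  "files": ["dist/**/*.js", "bin/"]
}
EOF
# Before publishing, inspect exactly what the tarball would contain:
#   npm pack --dry-run        # the file list should include no .map entries
grep -q '"files"' package.json && echo "allowlist present"
```

Running `npm pack --dry-run` before every publish turns "what did we just ship?" from a post-incident question into a pre-publish check.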
But here is the part that made this a catastrophic exposure rather than a minor embarrassment: the source map did not contain the source code directly. It referenced it, pointing to a URL of a .zip file hosted on Anthropic's own Cloudflare R2 storage bucket — and that bucket was publicly accessible with no authentication required.
npm install @anthropic-ai/claude-code
-> downloads package including main.js.map (59.8 MB)
-> .map file contains URL pointing to src.zip
-> src.zip is on Anthropic R2 bucket, publicly accessible
-> anyone can download and unzip 512,000 lines of TypeScript
Two separate configuration failures, stacked on top of each other. Either one alone would have been fine. Together, they exposed everything.
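The dangerous indirection is easy to spot once you know to look: a source map's `sources` (or `sourceRoot`) field can point anywhere, including an external URL. A sketch using a stand-in map file — the URL and contents here are illustrative, not the actual leaked values:

```shell
# Create a minimal stand-in source map; real maps are emitted by the bundler.
cat > main.js.map <<'EOF'
{"version":3,"sources":["https://r2.example.com/src.zip"],"names":[],"mappings":""}
EOF
# Any http(s) URL in a shipped .map file is a red flag worth auditing:
grep -o 'https\?://[^"]*' main.js.map
```

A map that inlines its source via `sourcesContent` is just as sensitive, but an external URL adds the second failure mode seen here: the referenced storage must also be locked down.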
The Bun Factor
There is a third layer. Anthropic acquired the Bun JavaScript runtime at the end of 2025, and Claude Code is built on it. A known Bun bug (GitHub issue #28001, filed March 11, 2026) reports that source maps are emitted in production builds even when the documentation says they should be suppressed. That bug had been open for 20 days before this happened. Anthropic's own acquired toolchain contributed to exposing Anthropic's own product.
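A cheap defence against this failure mode is to stop trusting the bundler's flags and assert the invariant directly: after every build, fail the pipeline if any `.map` file exists in the publish directory. A sketch (the paths are hypothetical):

```shell
# Stand-in build output; in a real pipeline this comes from the bundler.
mkdir -p dist && touch dist/main.js
# Post-build guard: refuse to continue if any source map escaped,
# regardless of what the build tool's options promise.
if find dist -name '*.map' | grep -q .; then
  echo "source maps found in dist/ -- refusing to publish" >&2
  exit 1
fi
echo "dist/ clean"
```

The point is that the check lives downstream of the tool with the bug, so a regression in the toolchain cannot silently reintroduce the leak.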
The Timeline
- 00:21 UTC, March 31: Malicious axios versions (1.14.1 / 0.30.4) appear on npm with an embedded Remote Access Trojan. Entirely unrelated to Anthropic, but catastrophically bad timing.
- ~04:00 UTC: Claude Code v2.1.88 is pushed to npm. The 59.8 MB source map ships with it. The R2 bucket is live and publicly accessible.
- 04:23 UTC: Chaofan Shou (@Fried_rice), an intern at Solayer Labs, tweets the discovery with a direct download link. 16 million people see it within hours.
- Next 2 hours: GitHub repositories spring up. 41,500+ forks. DMCA requests begin.
- ~08:00 UTC: Anthropic pulls the npm package. Issues the "human error, not a security breach" statement.
- Same day: A Python clean-room rewrite appears, legally DMCA-proof. Decentralised mirrors on Gitlawb go live, promising they "will never be taken down." The code is permanently in the wild.
Security Alert: The axios RAT
Coinciding with the leak, but entirely unrelated to it, was a real supply chain attack on npm. Malicious versions of the widely-used axios HTTP library were published simultaneously:
axios@1.14.1
axios@0.30.4
Both contain an embedded Remote Access Trojan called plain-crypto-js. If you ran npm install or updated Claude Code between 00:21 UTC and 03:29 UTC on March 31, 2026, check your lockfiles immediately:
grep "1.14.1\|0.30.4\|plain-crypto-js" package-lock.json
grep "1.14.1\|0.30.4\|plain-crypto-js" yarn.lock
grep -a "1.14.1\|0.30.4\|plain-crypto-js" bun.lockb
If you find a match: treat the machine as fully compromised, rotate all credentials immediately, and consider a clean OS reinstall.
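One caveat on the greps above: `1.14.1` and `0.30.4` will match any dependency pinned at those versions, not just axios, so a hit deserves a second look (`npm ls axios` reports the resolved version unambiguously). A self-contained sketch of a tighter pattern, run here against a stand-in lockfile fragment:

```shell
# Stand-in single-line lockfile entry (real lockfiles are pretty-printed JSON,
# so in practice prefer `npm ls axios` over line-oriented grep):
printf '%s\n' '"node_modules/axios": { "version": "1.14.1" }' > lock-fragment.txt
# Require "axios" and the bad version on the same line, cutting false positives:
grep -E 'axios.*"version": "(1\.14\.1|0\.30\.4)"' lock-fragment.txt \
  && echo "compromised axios version found"
```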
What Was Actually Exposed
Researchers who dug into the leaked source code found more than just implementation details:
- 44 hidden feature flags — features Anthropic had not publicly announced, including what appeared to be experimental agent swarm capabilities and background daemon processes
- Claude Code's full architecture — including its agent loop implementation, tool-use system, and session management layer
- Internal API integrations — how the desktop app communicates with Anthropic's backend services
- The Tamagotchi — one of the more surreal discoveries: an internal virtual pet feature, apparently used for testing the model's ability to maintain long-running stateful interactions
The uncomfortable question: Was it really an accident? The timing — coinciding with a major product announcement week — led some observers to wonder. Anthropic's statement was definitive: human error, not a strategic leak. The evidence supports that conclusion, but the optics were unfortunate.
The DMCA Takedown Spiral
Anthropic began issuing DMCA takedown notices to GitHub repositories containing the leaked code. By April 1, they had filed over 8,000 takedown requests. This created a further storm: the company was using copyright law to remove code that had been accidentally published to a public registry — code that arguably should never have been public in the first place.
The legal basis is questionable. More practically, the code is already distributed across decentralised mirrors, torrents, and hosts beyond the reach of U.S. courts. At this point the takedown effort is theatrical rather than remedial.
What This Means for Developers
For developers who use Claude Code: your workflows and session data were not exposed. The leak was source code, not user data.
For the broader industry: this is a reminder that build pipeline security is often an afterthought, and that single points of failure in deployment configuration can have outsized consequences. The combination of a misconfigured .npmignore, a publicly accessible R2 bucket, and a known unfixed bug in the acquired toolchain should not add up to a company-defining incident — but it did.
The fix is straightforward in retrospect: add *.map to your .npmignore, authenticate your R2 bucket, and audit your build pipeline for similar issues. Expect a wave of internal security audits in the weeks ahead at every company with significant published npm packages.
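For the first of those fixes, the `.npmignore` additions are a few lines. A sketch — the surrounding entries depend on your repo layout, and a `files` allowlist in package.json remains the stricter option:

```shell
# Append source-map and source-tree exclusions to .npmignore.
cat >> .npmignore <<'EOF'
*.map
*.tsbuildinfo
src/
EOF
grep -q '^\*\.map$' .npmignore && echo "*.map excluded"
```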



