Repomix: Convert Your Entire Repository into Context for Any AI

Ever tried asking Claude, ChatGPT, or Gemini to help you track down a bug that touches four different files, only to spend ten minutes pasting files in one by one before the AI could even start reasoning about the problem?

That’s exactly the problem Repomix solves: it packs your entire repository into a single clean, structured file, ready to be consumed by any AI.

No copy and paste. No lost connections between files. No model losing the thread.


The problem nobody solves for you

AI coding tools are very good within a single context window. They’re bad at reasoning about your complete project, not for lack of ability but for lack of visibility.

Cursor sees the files you have open. Claude.ai sees what you paste. GitHub Copilot sees the current file. None of them automatically understand how your authentication middleware connects to your API routes connects to your context providers. You have to build that picture for them.

Repomix builds it for you.


A single command

npx repomix

That’s it. Run it in your project directory and it generates a repomix-output.xml file with your entire repository in a structured format optimized for LLMs. The file starts with a model-oriented explanation, then organizes your code hierarchically — so the AI doesn’t receive a text dump, but a coherent representation of your project.
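The exact tags vary between Repomix versions, but the output looks roughly like this (a simplified sketch, not the verbatim format):

```xml
<!-- Simplified sketch of repomix-output.xml; actual tag names and
     section order may differ between Repomix versions -->
<file_summary>
  An explanation aimed at the model: what this file is and how to read it.
</file_summary>
<directory_structure>
  src/
    index.ts
    auth/middleware.ts
</directory_structure>
<files>
  <file path="src/index.ts">
    // full file contents here
  </file>
</files>
```

Because every file is labeled with its path and the directory tree comes first, the model gets the project's shape before it gets the code.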

If you’re going to use it frequently, install it globally:

npm install -g repomix
# or with Homebrew (macOS/Linux)
brew install repomix

What makes it really useful

Token counting. Before you paste anything, Repomix shows you exactly how many tokens each file uses and the complete total. No more guessing whether your project will blow up Claude’s context window.

Respects your .gitignore. Automatically excludes what you’ve already told git to ignore — node_modules, build artifacts, environment files. You can add a .repomixignore for additional control.
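A .repomixignore uses the same glob syntax as .gitignore. For example, to drop assets and fixtures that burn tokens without adding signal (illustrative patterns, adjust to your project):

```text
# .repomixignore: same pattern syntax as .gitignore
*.lock
coverage/
tests/fixtures/
docs/assets/
```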

Built-in security. Repomix integrates Secretlint to detect API keys, tokens, or hardcoded credentials before they end up in your AI context. Important when you’re about to paste a file into a third-party interface.

The --compress flag. Uses Tree-sitter to intelligently trim boilerplate and reduce token count by approximately 70%, preserving the structure that matters. For large repositories, this is the difference between fitting in context or not.

repomix --compress
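It combines with other flags; for instance, writing the compressed output to a custom file name (the -o/--output option is assumed here; verify against repomix --help for your version):

```shell
# Compress and write to a custom file name (flag name assumed; check --help)
repomix --compress -o repo-compressed.xml
```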

Remote repositories. You can package any public GitHub repository without cloning it:

repomix --remote owner/repo

Useful for security audits, understanding third-party libraries, or passing a dependency’s source code directly to your AI for deeper analysis.
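For instance, to package Repomix's own repository, both the shorthand and the full URL form work:

```shell
# Shorthand form
repomix --remote yamadashy/repomix

# Full URL form
repomix --remote https://github.com/yamadashy/repomix
```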


Practical workflows

Complex debugging: Paste the complete output into Claude or ChatGPT with: “Here’s my entire repository. My authentication flow fails when users log in with Google — trace the flow and identify the problem.” The AI can now see every file, every dependency, every connection.

Refactoring: “Review this project and propose a migration plan from REST to tRPC.” It has the context to give you a real answer, not a generic one.

Documentation generation: Pass your complete project and ask for an updated README, API documentation, or an architecture overview.

Code review before a PR: Run Repomix, paste into your favorite LLM, ask for a comprehensive review. Every file gets reviewed together, not in isolation.
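The paste step itself can be scripted. A quick sketch, assuming the default output file name:

```shell
# Generate the context file and copy it straight to the clipboard (macOS)
npx repomix && pbcopy < repomix-output.xml

# On Linux, xclip does the same job:
# npx repomix && xclip -selection clipboard < repomix-output.xml
```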


The MCP server

For a more integrated experience, Repomix also works as an MCP server — which means you can use it directly within Claude Code, Claude Desktop, or any MCP-compatible client:

npx repomix --mcp

This allows your AI assistant to query your repository through Repomix interactively, rather than receiving a static file. More dynamic, and useful for long sessions where code changes as you work.
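In Claude Desktop, for example, you would register it in claude_desktop_config.json using the standard mcpServers schema (the server name "repomix" is arbitrary; pick whatever label you like):

```json
{
  "mcpServers": {
    "repomix": {
      "command": "npx",
      "args": ["-y", "repomix", "--mcp"]
    }
  }
}
```

After restarting the client, the assistant can call Repomix's packing tools on demand instead of you pasting a static file.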


Why it matters that it’s model-agnostic

It’s worth saying explicitly: Repomix doesn’t care which AI you’re using. The output works with Claude, ChatGPT, Gemini, DeepSeek, Perplexity, Grok, Llama, or any local model you’re running through LM Studio or Ollama.

It also means you can compare how different models handle the same repository. Run the same prompt against Claude and Gemini and see which one gives better architectural advice for your specific stack.

Repomix was nominated in the “Powered by AI” category of the JSNation Open Source Awards 2025 — a recognition that reflects the traction it’s gained in the community.


Links