CTX: a tool that solves the context management gap when working with LLMs like ChatGPT or Claude. It helps developers organize and automatically collect information from their codebase into structured documents that can be easily shared with AI assistants.
context-hub/generator
# Create LLM-ready contexts in minutes
During development, your codebase constantly evolves. Files are added, modified, and removed. Each time you need to continue working with an LLM, you need to regenerate context to provide updated information about your current codebase state.
CTX is a context management tool that gives developers full control over what AI sees from their codebase. Instead of letting AI tools guess what's relevant, you define exactly what context to provide, making your AI-assisted development more predictable, secure, and efficient.
It helps developers organize contexts and automatically collect information from their codebase into structured documents that can be easily shared with an LLM.
For example, a developer describes what context they need:
```yaml
# context.yaml
documents:
  - description: User Authentication System
    outputPath: auth.md
    sources:
      - type: file
        description: Authentication Controllers
        sourcePaths:
          - src/Auth
        filePattern: "*.php"
      - type: file
        description: Authentication Models
        sourcePaths:
          - src/Models
        filePattern: "*User*.php"
  - description: Another Document
    outputPath: another-document.md
    sources:
      - type: file
        sourcePaths:
          - src/SomeModule
```
This configuration will gather all PHP files from the `src/Auth` directory, plus any PHP files containing "User" in their name from the `src/Models` directory, into a single context file, `.context/auth.md`. This file can then be pasted into a chat session or provided via the built-in MCP server.
Current AI coding tools automatically scan your entire codebase, which creates several issues:
- Security risk: Your sensitive files (env vars, tokens, private code) get uploaded to cloud services
- Context dilution: AI gets overwhelmed with irrelevant code, reducing output quality
- No control: You can't influence what the AI considers when generating responses
- Expensive: Premium tools charge based on how much they scan, not how much you actually need
You know your code better than any AI. CTX puts you in control:
- ✅ Define exactly what context to share - no more, no less
- ✅ Keep sensitive data local - works with local LLMs or carefully curated cloud contexts
- ✅ Generate reusable, shareable contexts - commit configurations to your repo
- ✅ Improve code architecture - designing for AI context windows naturally leads to better modular code
- ✅ Works with any LLM - Claude, ChatGPT, local models, or future tools
Download and install the tool using our installation script:
Linux/macOS:

```bash
curl -sSL https://raw.githubusercontent.com/context-hub/generator/main/download-latest.sh | sh
```

Windows (PowerShell):

```powershell
powershell -c "& ([ScriptBlock]::Create((irm 'https://raw.githubusercontent.com/context-hub/generator/main/download-latest.ps1'))) -AddToPath"
```

This installs the `ctx` command to your system (typically in `/usr/local/bin`).
Want more options? See the complete Installation Guide for alternative installation methods.
- Initialize your project:
```bash
cd your-project
ctx init
```

This generates a `context.yaml` file with a basic configuration and shows your project structure, helping you understand what contexts might be useful.
Check the Command Reference for all available commands and options.
- Create your first context:
```bash
ctx generate
```
- Use with your favorite AI:
- Copy the generated markdown files to your AI chat
- Or use the built-in MCP server with your MCP client (e.g., Claude Desktop, Cursor, Continue, Windsurf)
- Or process locally with open-source models
```yaml
# Quick project overview for new developers
documents:
  - description: "Project Architecture Overview"
    outputPath: "docs/architecture.md"
    sources:
      - type: tree
        sourcePaths: [ "src" ]
        maxDepth: 2
      - type: file
        description: "Core interfaces and main classes"
        sourcePaths: [ "src" ]
        filePattern: "*Interface.php"
```
```yaml
# Context for developing a new feature
documents:
  - description: "User Authentication System"
    outputPath: "contexts/auth-context.md"
    sources:
      - type: file
        sourcePaths: [ "src/Auth", "src/Models" ]
        filePattern: "*.php"
      - type: git_diff
        description: "Recent auth changes"
        commit: "last-week"
```
```yaml
# Generate API documentation
documents:
  - description: "API Documentation"
    outputPath: "docs/api.md"
    sources:
      - type: file
        sourcePaths: [ "src/Controllers" ]
        modifiers: [ "php-signature" ]
        contains: [ "@Route", "@Api" ]
```
- Define exactly which files, directories, or code patterns to include
- Filter by content, file patterns, date ranges, or size
- Apply modifiers to extract only relevant parts (e.g., function signatures)
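Combining these options, a single source entry can narrow a whole directory down to just the signatures of matching files. The sketch below is hypothetical (the document name and paths are made up) and uses only the keys shown elsewhere in this README:

```yaml
documents:
  - description: "Service layer signatures"
    outputPath: "contexts/services.md"
    sources:
      - type: file
        sourcePaths: [ "src/Services" ]    # which directory to scan
        filePattern: "*Service.php"        # filter by file name pattern
        contains: [ "public function" ]    # filter by file content
        modifiers: [ "php-signature" ]     # keep only signatures, not bodies
```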
- Local-first: Generate contexts locally, choose what to share
- No automatic uploads: Unlike tools that scan everything, you control what gets sent
- Works with local models: Use completely offline with Ollama, LM Studio, etc.
- Context configurations are part of your project
- Team members get the same contexts
- Evolve contexts as your codebase changes
- Include git diffs to show recent changes
- Fast: Generate contexts in seconds, not minutes of manual copying
- Flexible: Works with any AI tool or local model
- Shareable: Commit configurations, share with team
- Extensible: Plugin system for custom sources and modifiers
CTX follows a simple pipeline:
```
Configuration → Sources → Filters → Modifiers → Output
```

- Sources: Where to get content (files, GitHub, git diffs, URLs, etc.)
- Filters: What to include/exclude (patterns, content, dates, sizes)
- Modifiers: How to transform content (extract signatures, remove comments)
- Output: Structured markdown ready for AI consumption
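As a concrete illustration of the pipeline, each key in a source entry maps onto one stage. This is a hypothetical configuration (the paths and document name are invented), using only keys that appear elsewhere in this README:

```yaml
documents:
  - description: "Controller overview"      # Output: one structured markdown document
    outputPath: "docs/controllers.md"
    sources:
      - type: file                          # Source: local files
        sourcePaths: [ "src/Controllers" ]
        filePattern: "*Controller.php"      # Filter: include only matching files
        modifiers: [ "php-signature" ]      # Modifier: extract signatures only
```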
For a more seamless experience, you can connect CTX to any MCP-compatible client using the built-in MCP server.
```bash
# Interactive setup: detect OS and install config for your client
ctx mcp:config
```

This command:
- 🔍 Auto-detects your OS (Windows, Linux, macOS, WSL)
- 🧩 Lets you choose your MCP client (e.g., Claude Desktop, Cursor, Continue, Windsurf)
- 🎯 Generates and optionally installs the correct config for your environment
- 📋 Provides copy‑paste ready JSON if you prefer manual setup
- 🧭 Includes setup instructions and troubleshooting tips
Global Registry Mode (recommended for multiple projects/clients):
```json
{
  "mcpServers": {
    "ctx": {
      "command": "ctx",
      "args": [ "server" ]
    }
  }
}
```

If you prefer manual setup, point your MCP client to the CTX server:
```json
{
  "mcpServers": {
    "ctx": {
      "command": "ctx",
      "args": [ "server", "-c", "/path/to/project" ]
    }
  }
}
```

Note: Read more about the MCP server for detailed setup instructions and troubleshooting. Specific config file locations vary by client.
Now you can use your preferred MCP client (including Claude Desktop) to ask questions about your codebase without manually uploading context files.
Define project-specific commands that can be executed through the MCP interface:
```yaml
tools:
  - id: run-tests
    description: "Run project tests with coverage"
    type: run
    commands:
      - cmd: npm
        args: [ "test", "--coverage" ]
```
For complete documentation, including all available features and configuration options, please see the official documentation.
Join hundreds of developers using CTX for professional AI-assisted coding:
What you'll find in our Discord:
- 💡 Share and discover context configurations
- 🛠️ Get help with setup and advanced usage
- 🚀 Showcase your AI development workflows
- 🤝 Connect with like-minded developers
- 📢 First to know about new releases and features
This project is licensed under the MIT License.