High-performance CLI for building quality context windows that make AI assistants actually understand your codebase.
AI coding assistants are only as good as the context you provide. Most tools simply concatenate files, leading to:
- Irrelevant files cluttering the context window
- Missing dependencies that are crucial for understanding
- Token limits wasted on unimportant code
- No understanding of how your code actually connects
context-creator uses tree-sitter to build a dependency graph of your codebase, selecting only the files relevant to your task. It's like repomix, but faster and smarter.
```bash
# Generic context that includes everything
cat src/**/*.ts > context.txt  # 500K tokens of mostly noise

# Intelligent context that follows your code's actual dependencies
context-creator --prompt "How does the authentication work?"
# Returns: auth files + their actual dependencies + related tests = 50K relevant tokens
```
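The selection idea can be pictured as a breadth-first walk over an import graph: start from the files that match your query, then pull in everything they transitively depend on. The sketch below is illustrative only, not context-creator's actual implementation; the `deps` map and file names are made-up assumptions.

```python
from collections import deque

# Hypothetical import graph: file -> files it imports (not the real tool's data model)
deps = {
    "src/auth/login.ts": ["src/auth/session.ts", "src/db/users.ts"],
    "src/auth/session.ts": ["src/db/users.ts"],
    "src/db/users.ts": [],
    "src/ui/banner.ts": [],  # unrelated to the query, never selected
}

def select_relevant(seeds, deps):
    """Collect the seed files plus everything they transitively import."""
    selected, queue = set(seeds), deque(seeds)
    while queue:
        current = queue.popleft()
        for dep in deps.get(current, []):
            if dep not in selected:
                selected.add(dep)
                queue.append(dep)
    return selected

print(sorted(select_relevant(["src/auth/login.ts"], deps)))
# banner.ts is excluded: only files reachable from the auth seed are kept
```

The unrelated `banner.ts` never enters the result, which is the whole point: context size tracks the dependency graph, not the repository size.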
- Dependency-aware: Uses tree-sitter AST parsing to understand imports, not just file names
- Fast: Rust-powered parallel processing handles massive codebases in seconds
- Smart selection: Includes only files connected to your query through the dependency graph
- Multi-language: Semantic analysis for Python, TypeScript, JavaScript, and Rust
- MCP integration: Works as a server for AI assistants to query your codebase programmatically
```bash
npm install -g context-creator-mcp@latest
```
For platform-specific MCP client setup, see Installation Guide.
```bash
# Analyze current directory
context-creator

# Build focused context for a specific task
context-creator --prompt "Find security vulnerabilities in the auth system"

# Trace dependencies of specific files
context-creator --trace-imports --include "**/auth.py"

# Compare changes with dependency context
context-creator diff HEAD~1 HEAD

# Enrich code with OpenTelemetry runtime metrics
context-creator telemetry -t traces.json
```
Add to your MCP client configuration:
```json
{
  "mcpServers": {
    "context-creator": {
      "command": "npx",
      "args": ["-y", "context-creator-mcp@latest"]
    }
  }
}
```
Then in your AI assistant:
```
"Explain how the payment system works"       # AI will use analyze_local to build relevant context
"Find all SQL injection vulnerabilities"     # Searches with full dependency understanding
```
- Tree-sitter AST parsing for true code understanding
- Import tracing and dependency resolution
- Parallel processing with Rayon
- Token budget management
- Git history integration
- MCP server with programmatic access
- OpenTelemetry integration for runtime metrics
- `analyze_local` - Analyze local codebases with dependency awareness
- `analyze_remote` - Analyze Git repositories
- `search` - Text pattern search
- `semantic_search` - AST-based code search
- `file_metadata` - File information
- `diff` - File comparison
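Under the hood, MCP clients invoke these tools with the protocol's standard `tools/call` request. The request shape below follows the MCP specification, but the argument names (`path`, `prompt`) are illustrative assumptions, not necessarily this server's actual schema:

```json
{
  "jsonrpc": "2.0",
  "id": 1,
  "method": "tools/call",
  "params": {
    "name": "analyze_local",
    "arguments": {
      "path": "./src",
      "prompt": "How does the authentication work?"
    }
  }
}
```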
```
node_modules/
target/
*.log
.env
```

```
src/core/**
src/api/**
```
```toml
[defaults]
max_tokens = 200000

[[priorities]]
pattern = "src/core/**"
weight = 100
```
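One way to picture how pattern weights interact with the token budget: rank files by the weight of the first matching pattern, then keep the highest-weight files that still fit under `max_tokens`. This is a hedged sketch of the concept only; the real ranking logic in context-creator may differ, and the file paths and token counts are invented.

```python
from fnmatch import fnmatch

# Illustrative priorities, mirroring the TOML config above
priorities = [("src/core/**", 100), ("src/api/**", 50)]

def weight(path):
    """Weight of the highest-weight pattern matching this path (0 if none)."""
    return max((w for pat, w in priorities if fnmatch(path, pat)), default=0)

def pick(files, max_tokens):
    """Greedily keep the highest-weight files that fit within the budget."""
    chosen, used = [], 0
    for path, tokens in sorted(files, key=lambda f: weight(f[0]), reverse=True):
        if used + tokens <= max_tokens:
            chosen.append(path)
            used += tokens
    return chosen

files = [("src/core/graph.rs", 120_000), ("src/api/http.rs", 60_000),
         ("docs/notes.md", 40_000)]
print(pick(files, 200_000))  # core first, then api; docs dropped once the budget is spent
```

With a 200K budget, the core and api files (180K combined) are kept and the unweighted docs file is dropped, even though it would have fit on its own.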
- Installation Guide - Detailed setup instructions
- Usage Examples - CLI commands and workflows
- Configuration - Advanced configuration
- MCP Server Guide - MCP integration details
- Architecture - Technical implementation
- Node.js >= v18.0.0 (for npm package)
- or Rust >= 1.70.0 (for building from source)
```bash
git clone https://github.com/matiasvillaverde/context-creator
cd context-creator
cargo build --release
```
See CONTRIBUTING.md for guidelines.
MIT