- Installation
- IDE Setup
- Configuration
- Available Tools
- Usage
- Troubleshooting
- Developing This MCP Server
- Support
- License
- Terms of Use
A Model Context Protocol (MCP) server that provides AI-powered access to Chainlink documentation, focusing on CCIP (Cross-Chain Interoperability Protocol) and other Chainlink products.
## Installation

### Prerequisites

- Node.js 18+
- npm or pnpm
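You can quickly confirm your toolchain meets these requirements:

```bash
node --version   # should report v18 or newer
npm --version
```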
## IDE Setup

### Cursor

1. Create the MCP config file. To use this MCP server across multiple projects (globally), the path is `$HOME/.cursor/mcp.json` on macOS or `%USERPROFILE%\.cursor\mcp.json` on Windows. To use it only inside a given project, create `.cursor/mcp.json` in your project root.

2. Add the following configuration:
```json
{
  "mcpServers": {
    "chainlink-mcp-server": {
      "command": "npx",
      "args": ["-y", "@chainlink/mcp-server"],
      "env": {
        "MCP_ANTHROPIC_API_KEY": "sk-your-anthropic-api-key-here"
      }
    }
  }
}
```
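Before restarting Cursor, you can optionally verify that the package resolves by running the same command the config invokes. The server communicates over stdio, so it will start and then wait for a client; exit with Ctrl+C.

```bash
# Downloads the package if needed and starts the server on stdio
npx -y @chainlink/mcp-server
```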
### Claude Desktop

Add to your Claude Desktop configuration file (`claude_desktop_config.json`):
```json
{
  "mcpServers": {
    "chainlink-mcp-server": {
      "command": "npx",
      "args": ["-y", "@chainlink/mcp-server"],
      "env": {
        "MCP_ANTHROPIC_API_KEY": "sk-your-anthropic-api-key-here"
      }
    }
  }
}
```
## Configuration

Based on the AI LLM service you want to use, set one of the following environment variables, either in the terminal session or in the MCP server config JSON:
```bash
# Anthropic Claude (default)
export MCP_ANTHROPIC_API_KEY="sk-your-anthropic-api-key"

# OpenAI GPT
export MCP_OPENAI_API_KEY="sk-your-openai-api-key"

# Google Gemini
export MCP_GEMINI_API_KEY="your-google-api-key"

# Ollama (run locally, free)
export MCP_AI_SERVICE=ollama
export OLLAMA_MODEL=llama3.2:3b
```
### Running Locally with Ollama

For privacy and to avoid API costs, you can run everything locally, though depending on your machine this may increase latency:
1. Install Ollama:

   ```bash
   # macOS
   brew install ollama

   # Linux
   curl -fsSL https://ollama.ai/install.sh | sh
   ```

2. Pull a model:

   ```bash
   ollama pull llama3.2:3b
   ```

3. Start Ollama:

   ```bash
   ollama serve
   ```

4. Configure for local use:

   ```bash
   export MCP_AI_SERVICE=ollama
   export OLLAMA_MODEL=llama3.2:3b
   ```
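Putting the pieces together, a Cursor config that points the server at the local model might look like the sketch below, assuming the `env` block is honored the same way as the API-key variables above:

```json
{
  "mcpServers": {
    "chainlink-mcp-server": {
      "command": "npx",
      "args": ["-y", "@chainlink/mcp-server"],
      "env": {
        "MCP_AI_SERVICE": "ollama",
        "OLLAMA_MODEL": "llama3.2:3b"
      }
    }
  }
}
```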
## Available Tools

A comprehensive Chainlink developer assistant that handles developer queries spanning code examples, configurations, and concepts. It provides access to:
- Fetched API Data: CCIP chain configurations and supported tokens (fallback data source)
- Documentation Search: Semantic search across Chainlink documentation
- Code Examples: Smart Solidity contract examples and implementation patterns
- Configuration Help: Network configurations, contract addresses, and deployment guidance
- Best Practices: Security recommendations and development patterns
Example Queries:
- "How do I send a cross-chain message using CCIP?"
- "What are the supported chains for CCIP?"
- "Show me a complete CCIP contract example"
- "Help me configure CCIP token transfers"
## Usage

Once configured in your IDE, you can ask questions about Chainlink development directly in your chat interface. The MCP server automatically provides relevant documentation, code examples, and configuration data.
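To exercise the server outside an IDE, one option is the generic MCP Inspector (a separate debugging tool, not part of this package; any MCP-compatible client should work):

```bash
# Opens a local web UI connected to the server over stdio
npx @modelcontextprotocol/inspector npx -y @chainlink/mcp-server
```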
## Troubleshooting

Make sure your `mcp.json` file is in the correct location and properly formatted.
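A quick way to catch JSON syntax errors is to run the file through a parser; the path below assumes a global Cursor install on macOS, so adjust it for your setup:

```bash
# Prints the parsed JSON, or a parse error if the file is malformed
python3 -m json.tool "$HOME/.cursor/mcp.json"
```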
Verify your API key is correctly set:

- For Anthropic: key should start with `sk-`
- For OpenAI: key should start with `sk-`
- For Gemini: key should be a valid Google API key
If using Ollama:

```bash
# Check if Ollama is running
ollama list

# Start Ollama if needed
ollama serve

# Test the model
ollama run llama3.2:3b "Hello"
```
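If the CLI commands succeed but the MCP server still cannot reach the model, you can check Ollama's HTTP API directly; it listens on port 11434 by default:

```bash
# Should return a JSON list of locally installed models
curl http://localhost:11434/api/tags
```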
## Developing This MCP Server

See DEVELOPMENT.md for development setup.
## Support

- Issues: Report bugs on GitHub Issues
- Discussions: Ask questions in GitHub Discussions
## License

MIT License - see LICENSE for details.
## Terms of Use

The Chainlink MCP Server (npm package @chainlink/mcp-server) is in the “Early Access” stage of development, which means that it currently has functionality which is under development and may be changed in later versions. There is no guarantee any of the contemplated features of the MCP Server will be implemented as specified. The MCP Server is provided on an “AS IS” and “AS AVAILABLE” basis without any representations, warranties, covenants, or conditions of any kind. Use at your own risk. Users remain fully responsible for reviewing, auditing, and deploying any code or contracts. Do not use the code in this example in a production environment without completing your own audits and application of best practices, including compliance with applicable licenses governing any code used. Neither Chainlink Labs, the Chainlink Foundation, nor Chainlink node operators are responsible for unintended outputs that are generated due to errors in code. Please review the Chainlink Terms of Service which provides important information and disclosures. By using the MCP Server, you expressly acknowledge and agree to accept these terms.