@chainlink/mcp-server

0.0.1-alpha.4 • Public • Published

Chainlink MCP Server

A Model Context Protocol (MCP) server that provides AI-powered access to Chainlink documentation, focusing on CCIP (Cross-Chain Interoperability Protocol) and other Chainlink products.

Installation

Prerequisites

  • Node.js 18+
  • npm or pnpm

IDE Setup

Cursor IDE

  1. Decide where the configuration should live. To use this MCP server across all projects (globally), edit $HOME/.cursor/mcp.json on macOS or %USERPROFILE%\.cursor\mcp.json on Windows. To use it in a single project only, create ./.cursor/mcp.json in that project's root.

  2. Add the following configuration:

{
  "mcpServers": {
    "chainlink-mcp-server": {
      "command": "npx",
      "args": ["-y", "@chainlink/mcp-server"],
      "env": {
        "MCP_ANTHROPIC_API_KEY": "sk-your-anthropic-api-key-here"
      }
    }
  }
}

Claude Desktop

Add to your Claude Desktop configuration:

{
  "mcpServers": {
    "chainlink-mcp-server": {
      "command": "npx",
      "args": ["-y", "@chainlink/mcp-server"],
      "env": {
        "MCP_ANTHROPIC_API_KEY": "sk-your-anthropic-api-key-here"
      }
    }
  }
}

Configuration

Required: AI Service

Set one of the following environment variables for the LLM service you want to use, either in your terminal session or in the env block of the MCP server config JSON:

# Anthropic Claude (default)
export MCP_ANTHROPIC_API_KEY="sk-your-anthropic-api-key"

# OpenAI GPT
export MCP_OPENAI_API_KEY="your-openai-api-key"

# Google Gemini
export MCP_GEMINI_API_KEY="your-google-api-key"

# Ollama (run locally, free)
export MCP_AI_SERVICE=ollama
export OLLAMA_MODEL=llama3.2:3b
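
The selection behavior described above (Anthropic by default, Ollama when MCP_AI_SERVICE is set, otherwise whichever provider key is present) can be sketched as a small helper. This is an illustrative approximation of the documented behavior, not the server's actual code; the real selection logic lives inside @chainlink/mcp-server and may differ:

```python
import os

def resolve_ai_service(env=None):
    """Guess which LLM backend will be used from the env vars above.

    Illustrative only -- the real selection order inside
    @chainlink/mcp-server may differ.
    """
    env = os.environ if env is None else env
    if env.get("MCP_AI_SERVICE") == "ollama":
        # OLLAMA_MODEL falls back to the model suggested in this README
        return ("ollama", env.get("OLLAMA_MODEL", "llama3.2:3b"))
    if env.get("MCP_ANTHROPIC_API_KEY"):
        return ("anthropic", None)
    if env.get("MCP_OPENAI_API_KEY"):
        return ("openai", None)
    if env.get("MCP_GEMINI_API_KEY"):
        return ("gemini", None)
    return ("anthropic", None)  # documented default
```

Running it with no variables set returns the Anthropic default, which is why a missing key typically surfaces as an Anthropic authentication error.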

Optional: Local Ollama Setup

To keep data private and avoid API costs, you can run everything locally; depending on your hardware, this may increase latency:

  1. Install Ollama:

    # macOS
    brew install ollama
    
    # Linux
    curl -fsSL https://ollama.ai/install.sh | sh
  2. Pull a model:

    ollama pull llama3.2:3b
  3. Start Ollama:

    ollama serve
  4. Configure for local use:

    export MCP_AI_SERVICE=ollama
    export OLLAMA_MODEL=llama3.2:3b
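
To script a quick health check after these steps, note that the Ollama daemon listens on port 11434 by default and lists pulled models at its /api/tags endpoint (both standard Ollama behavior). The small helper below is illustrative, not part of this package:

```python
import json
import urllib.request

OLLAMA_URL = "http://localhost:11434"  # Ollama's default local address

def parse_model_names(tags_payload):
    """Extract model names from an Ollama /api/tags response body."""
    return [m["name"] for m in tags_payload.get("models", [])]

def list_local_models(base_url=OLLAMA_URL):
    """Ask the local Ollama daemon which models it has pulled."""
    with urllib.request.urlopen(f"{base_url}/api/tags", timeout=5) as resp:
        return parse_model_names(json.load(resp))
```

If list_local_models() raises a connection error, the daemon is not running (start it with ollama serve); if llama3.2:3b is absent from the result, pull it first.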

Available Tools

chainlink_developer_assistant

Comprehensive Chainlink developer assistant that handles any type of developer query spanning code examples, configurations, and concepts. Provides access to:

  • Fetched API Data: CCIP chain configurations and supported tokens (fallback data source)
  • Documentation Search: Semantic search across Chainlink documentation
  • Code Examples: Solidity smart contract examples and implementation patterns
  • Configuration Help: Network configurations, contract addresses, and deployment guidance
  • Best Practices: Security recommendations and development patterns

Example Queries:

  • "How do I send a cross-chain message using CCIP?"
  • "What are the supported chains for CCIP?"
  • "Show me a complete CCIP contract example"
  • "Help me configure CCIP token transfers"

Usage

Once configured in your IDE, you can ask questions about Chainlink development directly in your chat interface. The MCP server will automatically provide relevant documentation, code examples, and configuration data.

Troubleshooting

"No MCP servers configured" Error

Make sure your mcp.json file is in the correct location and properly formatted.
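
A quick way to rule out a malformed file is to parse it and check for the mcpServers key the configs above use. A minimal sketch (the helper names are hypothetical, not part of this package):

```python
import json
import os

def problems_in_config_text(text):
    """Return a list of problems in mcp.json content (empty list = OK)."""
    try:
        config = json.loads(text)
    except json.JSONDecodeError as e:
        return [f"invalid JSON: {e}"]
    servers = config.get("mcpServers")
    if not isinstance(servers, dict) or not servers:
        return ['missing or empty "mcpServers" object']
    return []

def check_mcp_config(path):
    """Validate an mcp.json file on disk, e.g. ~/.cursor/mcp.json."""
    if not os.path.exists(path):
        return [f"file not found: {path}"]
    with open(path) as f:
        return problems_in_config_text(f.read())
```

An empty result means the file is at least well-formed JSON with an mcpServers entry; a stray trailing comma or a file in the wrong location will show up immediately.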

Authentication Errors

Verify your API key is correctly set:

  • For Anthropic: Key should start with sk-
  • For OpenAI: Key should start with sk-
  • For Gemini: Key should be a valid Google API key
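
The prefix rules above can be checked before launching the server. A hedged sketch (prefix conventions as listed here; providers may change key formats, so this is a shape check only, never a guarantee the key works):

```python
def key_looks_valid(service, key):
    """Cheap sanity check on an API key's shape (no network call)."""
    if not key:
        return False
    if service in ("anthropic", "openai"):
        return key.startswith("sk-")  # prefix rule from this README
    if service == "gemini":
        # Rough heuristic, not an official rule: Google API keys
        # are long opaque strings.
        return len(key) > 20
    return False
```

A key that passes this check can still be revoked or mistyped; the check only catches the common case of pasting the wrong provider's key into the wrong variable.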

Ollama Connection Issues

If using Ollama:

# Check if Ollama is running
ollama list

# Start Ollama if needed
ollama serve

# Test the model
ollama run llama3.2:3b "Hello"

Developing This MCP Server

See DEVELOPMENT.md for development setup.

Support

License

MIT License - see LICENSE for details.

Disclaimer

The Chainlink MCP Server (npm package @chainlink/mcp-server) is in the “Early Access” stage of development, which means that it currently has functionality which is under development and may be changed in later versions. There is no guarantee any of the contemplated features of the MCP Server will be implemented as specified. The MCP Server is provided on an “AS IS” and “AS AVAILABLE” basis without any representations, warranties, covenants, or conditions of any kind. Use at your own risk. Users remain fully responsible for reviewing, auditing, and deploying any code or contracts. Do not use the code in this example in a production environment without completing your own audits and application of best practices, including compliance with applicable licenses governing any code used. Neither Chainlink Labs, the Chainlink Foundation, nor Chainlink node operators are responsible for unintended outputs that are generated due to errors in code. Please review the Chainlink Terms of Service which provides important information and disclosures. By using the MCP Server, you expressly acknowledge and agree to accept these terms.
