A Model Context Protocol (MCP) server that provides AI assistants with direct access to Surfboard Payments' documentation.
This MCP server acts as a bridge between AI assistants and Surfboard Payments' documentation, enabling AI systems to access current, accurate documentation on-demand. This ensures developers receive precise guidance when integrating with Surfboard Payments APIs.
- Documentation Fetching: Retrieve raw documentation files (Markdown, JSON) from Surfboard's servers
- Documentation Index: Access the central documentation index (`llms.txt`)
- Basic Search: Find relevant documentation paths based on simple keyword queries
- Documentation Caching: Store documentation for 3-4 days, reducing network overhead and improving response times
- Multi-runtime Support: Run using Node.js or Bun
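The caching behavior can be pictured as a simple TTL map. The sketch below is purely illustrative (the class and method names are hypothetical, not the server's actual internals); it only shows the expire-after-a-fixed-window semantics described above.

```typescript
// Illustrative sketch of TTL-based documentation caching.
// Entries older than the TTL are evicted on read and treated as misses.
type CacheEntry = { value: string; storedAt: number };

class DocCache {
  private entries = new Map<string, CacheEntry>();

  // Default TTL of 3 days, matching the documented 3-4 day retention window.
  constructor(private ttlMs = 3 * 24 * 60 * 60 * 1000) {}

  set(path: string, value: string, now = Date.now()): void {
    this.entries.set(path, { value, storedAt: now });
  }

  get(path: string, now = Date.now()): string | undefined {
    const entry = this.entries.get(path);
    if (!entry) return undefined;
    if (now - entry.storedAt > this.ttlMs) {
      this.entries.delete(path); // expired: evict and report a miss
      return undefined;
    }
    return entry.value;
  }
}
```

Injecting `now` keeps the expiry logic deterministic and easy to test; a real implementation would simply use the current clock.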
```bash
# Local installation
npm install @surfai/docs-mcp

# Global installation
npm install -g @surfai/docs-mcp
```

```bash
# Start with Node.js
npm run start:node

# Start with Bun
npm run start:bun
```
```bash
# Run from anywhere (Node.js, global installation)
node-surfboard-documentation-ai

# Run from anywhere (Bun, global installation)
bun-surfboard-documentation-ai
```
The MCP server provides the following tools:
`surfboard_fetch_documentation_index`: Fetches the Surfboard documentation index.

Parameters:

- `bypass_cache` (optional): Whether to bypass the cache (default: `false`)

Example:

```json
{
  "name": "surfboard_fetch_documentation_index",
  "arguments": {
    "bypass_cache": false
  }
}
```
`surfboard_fetch_documentation`: Fetches a specific Surfboard documentation file.

Parameters:

- `path` (required): Path to the documentation file
- `bypass_cache` (optional): Whether to bypass the cache (default: `false`)

Example:

```json
{
  "name": "surfboard_fetch_documentation",
  "arguments": {
    "path": "/api/orders.json",
    "bypass_cache": false
  }
}
```
`surfboard_search_documentation`: Searches Surfboard documentation.

Parameters:

- `query` (required): Search query
- `max_results` (optional): Maximum number of results to return (default: `10`)
- `include_snippets` (optional): Whether to include snippets in the results (default: `true`)
- `bypass_cache` (optional): Whether to bypass the cache (default: `false`)

Example:

```json
{
  "name": "surfboard_search_documentation",
  "arguments": {
    "query": "process payment",
    "max_results": 5,
    "include_snippets": true,
    "bypass_cache": false
  }
}
```
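A basic keyword search over an index like `llms.txt` can be sketched as term matching with a simple score. This is an illustrative guess at what "basic search" means here, not the server's actual ranking; the `IndexEntry` shape and `searchDocs` function are invented for the example.

```typescript
// Illustrative keyword search: split the query into lowercase terms,
// score each index entry by how many terms its path or title contains,
// and return the top maxResults matches.
type IndexEntry = { path: string; title: string };

function searchDocs(
  query: string,
  index: IndexEntry[],
  maxResults = 10,
): IndexEntry[] {
  const terms = query.toLowerCase().split(/\s+/).filter(Boolean);
  return index
    .map((entry) => {
      const haystack = `${entry.path} ${entry.title}`.toLowerCase();
      const score = terms.filter((t) => haystack.includes(t)).length;
      return { entry, score };
    })
    .filter((r) => r.score > 0) // drop entries matching no terms
    .sort((a, b) => b.score - a.score) // most matched terms first
    .slice(0, maxResults)
    .map((r) => r.entry);
}
```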
`surfboard_clear_cache`: Clears the documentation cache.

Parameters:

- `path` (optional): Path to clear from cache (if not provided, clears the entire cache)

Example:

```json
{
  "name": "surfboard_clear_cache",
  "arguments": {
    "path": "/api/orders.json"
  }
}
```
This MCP server is designed to be used with AI assistants that support the Model Context Protocol. It provides tools for accessing Surfboard Payments documentation directly from the AI assistant.
When registering this MCP server with your AI assistant platform and using a local installation:
```json
{
  "command": "node",
  "args": ["node_modules/@surfai/docs-mcp/node.index.js"],
  "env": {}
}
```
Or for Bun:
```json
{
  "command": "bun",
  "args": ["node_modules/@surfai/docs-mcp/bun.index.js"],
  "env": {}
}
```
When registering this MCP server with your AI assistant platform and using a global installation:
```json
{
  "command": "node-surfboard-documentation-ai",
  "args": [],
  "env": {},
  "autoApprove": []
}
```
Or for Bun:
```json
{
  "command": "bun-surfboard-documentation-ai",
  "args": [],
  "env": {},
  "autoApprove": []
}
```
If you're having trouble with the commands above, locate your global `node_modules` directory first:

```bash
$ npm root -g
/Users/{{USER}}/.nvm/versions/node/{{VERSION}}/lib/node_modules  # path to the global node_modules directory
```
Then, register the MCP server using the absolute path:

```json
{
  "command": "node",
  "args": [
    "/Users/{{USER}}/.nvm/versions/node/{{VERSION}}/lib/node_modules/@surfai/docs-mcp/node.index.js"
  ],
  "env": {},
  "disabled": false,
  "autoApprove": []
}
```