
MCP Server

The 1inch MCP Server connects your AI coding assistant to the entire 1inch developer ecosystem: documentation, API references, and production-ready SDK examples, all through the Model Context Protocol (MCP).

Instead of switching between browser tabs, searching docs, and copy-pasting code snippets, you describe what you're building and your AI assistant pulls the right information directly into your workflow.

The server is hosted at https://api.1inch.com/mcp/protocol

What you can do

Once connected, you work through natural conversation. Your AI assistant calls the right tools behind the scenes.

Search the entire 1inch documentation

Ask any question about 1inch APIs, supported chains, authentication, error codes, or integration patterns. Your AI searches across all documentation, API references, and SDK guides to find the answer — no manual browsing required.

Browse available SDK examples

Discover production-ready code examples for common integration scenarios. Your AI lists what's available so you can pick the example closest to your use case before diving into code.

Get implementation code

Pull complete, working code directly into your conversation — in TypeScript, Go, or Python. Examples cover provider setup, token approvals, swap execution, error handling, and environment configuration across EVM chains and Solana. Your AI can then adapt the code to your specific project and language.

The examples are provided in TypeScript — but your AI assistant can translate them into any language. Just ask for "the swap example in Java" or "convert this to Rust" and it will adapt the code for you.

Example prompts

The following examples show the kinds of questions and requests that work well with the 1inch MCP Server. Use them as starting points and adapt to your specific use case.

Starting a new integration

"I'm building a swap feature on Base. Show me a complete TypeScript example for swapping ERC-20 tokens using the 1inch API."

"I need to integrate 1inch swaps into my Go backend service. Show me how to use the Go SDK."

"How do I call the 1inch swap API from Python?"

The assistant retrieves the relevant SDK example with full source code — including wallet setup, token approval, swap execution, and error handling — in the language you need, ready for you to adapt.

Understanding APIs and parameters

"What authentication method does the 1inch API use? Show me the header format."

"Which endpoint should I use for cross-chain swaps — classic swap or Fusion+? What's the difference?"

"What parameters does the swap endpoint accept, and which ones are required?"

"What chains and chain IDs does 1inch support?"

The assistant searches the API reference and returns the specific details you need, with links to the full documentation for deeper reading.
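As a concrete illustration of the authentication pattern, the sketch below builds (but does not send) a Bearer-authenticated quote request in Python. The endpoint path, version, and parameter names shown are assumptions for illustration only — verify them against the API reference, which your assistant can fetch for you.

```python
# Hedged sketch of an authenticated 1inch REST request in Python.
# The endpoint path "swap/v6.0/{chain}/quote" and the parameter names
# src/dst/amount are assumptions for illustration; the pattern shown
# is a Bearer token in the Authorization header.
from urllib.parse import urlencode
from urllib.request import Request

API_KEY = "YOUR_API_KEY"  # issued via the 1inch developer portal

def build_quote_request(chain_id: int, src: str, dst: str, amount: str) -> Request:
    """Construct (but do not send) an authenticated quote request."""
    base = f"https://api.1inch.dev/swap/v6.0/{chain_id}/quote"  # assumed path
    params = urlencode({"src": src, "dst": dst, "amount": amount})
    return Request(
        f"{base}?{params}",
        headers={
            "Authorization": f"Bearer {API_KEY}",  # assumed header format
            "Accept": "application/json",
        },
    )

req = build_quote_request(
    8453,  # Base chain ID
    "0xEeeeeEeeeEeEeeEeEeEeeEEEeeeeEeeeeeeeEEeE",  # common native-token sentinel
    "0x833589fCD6eDb6E08f4c7C32D4f71b54bdA02913",  # example destination token
    "1000000000000000000",  # 1 ETH in wei
)
print(req.full_url)
```

Sending the request with `urllib.request.urlopen(req)` (or any HTTP client) would return the quote as JSON.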

Getting code for specific scenarios

"Show me how to swap native ETH to USDC on Base, including gas estimation."

"Get the Solana swap implementation with proper error handling."

"How do I check and set token allowance before executing a swap in Go?"

"Show me how to set up a viem provider configured for 1inch on Arbitrum."

"I'm writing a Solidity contract that interacts with the 1inch aggregation router. What's the interface I need?"

"Give me a Python script that fetches a swap quote from the 1inch API."

The assistant retrieves the matching documentation and example files in your language, and highlights the relevant sections for your scenario.

Working with intent-based and cross-chain swaps

"Walk me through the Fusion intent swap lifecycle — from order creation to execution."

"Show me a working Fusion+ cross-chain swap example between Ethereum and Base."

"How does the orderbook API work for limit orders?"

The assistant combines documentation search with code examples to give you both the conceptual understanding and the implementation.

Debugging and troubleshooting

"My swap transaction is returning a 400 error. What are the common causes?"

"I'm getting CORS errors when calling the 1inch API from my frontend. How do I fix this?"

"The approval transaction succeeded but the swap still fails with 'insufficient allowance'. What should I check?"

The assistant searches the FAQ and API documentation for known issues and solutions specific to your error.

Prompting tips

To get the most out of the MCP Server:

  • Start with what you're building, not what you're looking for. Instead of "show me the swap endpoint docs", try "I need to add a token swap feature to my React app on Base". The assistant will find the right combination of docs and code examples.
  • Ask for examples first. When you need working code, ask for an SDK example before asking for documentation. The examples are tested, complete implementations that give the AI better context to help you.
  • Mention your chain and language. Saying "TypeScript on Base", "Go on Ethereum", "Python swap quote", or "Solidity interface for the aggregation router" helps the assistant pick the most relevant example immediately.
  • Iterate on the code. Once the assistant retrieves an example, ask follow-up questions: "Now adapt this for USDT instead of USDC" or "Add retry logic for failed transactions."
  • Ask about trade-offs. Questions like "Should I use classic swap or Fusion for my use case?" get you a tailored recommendation backed by the documentation.
  • Combine topics in one conversation. The assistant maintains context, so you can start with "show me the swap example", then follow up with "now add token approval" and "what error codes should I handle?" — all in the same session.

Supported clients

The following clients have dedicated setup instructions below:

  • Claude Desktop
  • Claude Code
  • OpenAI Codex
  • Gemini CLI
  • Cursor
  • Windsurf
  • VS Code (Copilot)
  • JetBrains IDEs

The following clients are also MCP-compatible and can connect using the generic setup below:

  • Continue — open-source AI assistant for VS Code and JetBrains
  • Zed — AI-native code editor
  • Amazon Q Developer — AWS AI coding assistant

MCP is an open standard with rapidly growing adoption. Any MCP-compatible client can connect to the 1inch MCP Server.

Setup

Claude Code

Claude Code supports HTTP MCP servers natively:

Bash
claude mcp add \
  --transport http \
  --scope user \
  1inch-business \
  https://api.1inch.com/mcp/protocol

OpenAI Codex

Codex supports MCP servers in both the CLI and the VS Code extension:

Bash
codex --mcp-server "https://api.1inch.com/mcp/protocol"

For the VS Code extension, add the server in the Codex MCP settings panel with the same URL.

Gemini CLI

Gemini CLI supports HTTP MCP servers natively:

Bash
gemini mcp add \
  --transport http \
  1inch-business \
  https://api.1inch.com/mcp/protocol

Claude Desktop

Claude Desktop communicates via stdio, so you use supergateway to bridge to the HTTP server. This requires Node.js 18+.

Add the following to your configuration file:

  • macOS: ~/Library/Application Support/Claude/claude_desktop_config.json
  • Windows: %APPDATA%\Claude\claude_desktop_config.json
JSON
{
  "mcpServers": {
    "1inch-business": {
      "command": "/absolute/path/to/node/bin/npx",
      "args": [
        "-y",
        "supergateway",
        "--streamableHttp",
        "https://api.1inch.com/mcp/protocol",
        "--outputTransport",
        "stdio"
      ],
      "env": {
        "PATH": "/absolute/path/to/node/bin:/usr/bin:/bin"
      }
    }
  }
}

Replace /absolute/path/to/node/bin with the path to your Node.js installation (e.g., /usr/local/bin or the output of dirname $(which node)).

The command and PATH fields require absolute paths because AI clients do not inherit your shell's PATH. Using npx -y ensures the latest version of supergateway is downloaded automatically.

Cursor

Create or edit .cursor/mcp.json in your project root:

JSON
{
  "mcpServers": {
    "1inch-business": {
      "url": "https://api.1inch.com/mcp/protocol"
    }
  }
}

Windsurf

Create or edit ~/.codeium/windsurf/mcp_config.json:

JSON
{
  "mcpServers": {
    "1inch-business": {
      "serverUrl": "https://api.1inch.com/mcp/protocol"
    }
  }
}

VS Code (Copilot)

Create or edit .vscode/mcp.json in your project root:

JSON
{
  "servers": {
    "1inch-business": {
      "type": "http",
      "url": "https://api.1inch.com/mcp/protocol"
    }
  }
}

VS Code also supports configuring MCP servers through settings.json under the "mcp" key. See the VS Code MCP documentation for additional options.

JetBrains IDEs

IntelliJ IDEA, WebStorm, PyCharm, and other JetBrains IDEs support MCP through the AI Assistant (2025.1+).

  1. Open Settings → Tools → AI Assistant → Model Context Protocol (MCP)
  2. Click Add, select HTTP, and paste the following JSON snippet:
JSON
{
  "mcpServers": {
    "1inch-business": {
      "url": "https://api.1inch.com/mcp/protocol"
    }
  }
}

The 1inch MCP tools will be available in the AI Assistant chat within your IDE.

Other MCP clients

Most MCP clients support HTTP transport and can connect directly to the server URL:

  • Server URL: https://api.1inch.com/mcp/protocol

For clients that only support stdio transport, use supergateway to bridge to the HTTP server (requires Node.js 18+):

Bash
npx -y supergateway \
  --streamableHttp https://api.1inch.com/mcp/protocol \
  --outputTransport stdio

Configure your client to launch this as a subprocess and communicate over stdin/stdout.
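For clients you are wiring up yourself, the exchange over stdin/stdout can be sketched as follows. This assumes the standard MCP stdio framing of one JSON-RPC message per line; the protocolVersion string is an assumption — consult the MCP specification for the current value.

```python
# Sketch of what a stdio MCP client sends once the supergateway bridge
# is running as a subprocess. Assumes standard MCP stdio framing
# (one JSON-RPC message per line); protocolVersion is an assumed value.
import json

def frame_message(msg: dict) -> bytes:
    """Serialize a JSON-RPC message for the stdio transport."""
    return (json.dumps(msg) + "\n").encode("utf-8")

initialize = {
    "jsonrpc": "2.0",
    "id": 1,
    "method": "initialize",
    "params": {
        "protocolVersion": "2025-03-26",  # assumed; check the MCP spec
        "capabilities": {},
        "clientInfo": {"name": "example-client", "version": "0.1.0"},
    },
}

# A client would launch the bridge and write this to its stdin, e.g.:
#   proc = subprocess.Popen(
#       ["npx", "-y", "supergateway",
#        "--streamableHttp", "https://api.1inch.com/mcp/protocol",
#        "--outputTransport", "stdio"],
#       stdin=subprocess.PIPE, stdout=subprocess.PIPE)
#   proc.stdin.write(frame_message(initialize))
print(frame_message(initialize).decode().strip())
```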

Troubleshooting

  • Connection refused: Verify the command path is absolute, and check that https://api.1inch.com/mcp/protocol is reachable from your network.
  • Server not appearing: Restart your AI client after modifying the configuration file, and check that the JSON is valid (no trailing commas).
  • npx not found: Ensure Node.js 18+ is installed, and use an absolute path to npx in the command field (e.g., /usr/local/bin/npx).
  • Too many sessions (429): Each AI client window opens a separate session. Close windows you're not using, or wait 15 minutes for idle sessions to expire. This limit is specific to MCP sessions and does not affect your API usage.
  • Tools not loading: Check the AI client's MCP logs for connection errors. In Cursor, use Developer: Open MCP Log from the command palette.
  • Stale results: Documentation indexes are updated periodically. If recently published content is missing, try again in a few minutes.
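Because invalid JSON (such as a trailing comma) is a common cause of a server not appearing, a quick local check can save a restart cycle. This is a minimal sketch; the helper name check_config is illustrative:

```python
# Sanity-check an MCP config file before restarting your AI client.
# Strict JSON parsing catches trailing commas and other syntax slips.
import json
import sys

def check_config(path: str) -> None:
    """Exit with an error message if the config file is not valid JSON."""
    try:
        with open(path) as f:
            config = json.load(f)
    except json.JSONDecodeError as e:
        sys.exit(f"{path}: invalid JSON at line {e.lineno}: {e.msg}")
    # Both key styles appear in the configs above: "mcpServers" and "servers".
    servers = config.get("mcpServers") or config.get("servers") or {}
    print(f"{path}: OK ({len(servers)} server(s) configured)")
```

Run it against the config file for your client, e.g. `check_config(".cursor/mcp.json")`.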

Use of the 1inch MCP Server is subject to the 1inch Business Portal Terms of Service and the applicable Software Legal Notice. Users are responsible for ensuring compliance with applicable regulations.
