How to Use MCP with Cursor: Complete Setup Guide (2026)
Want to connect external tools and data sources to Cursor IDE? The Model Context Protocol (MCP) makes it possible. In this comprehensive guide, you'll learn exactly how to set up MCP with Cursor — including configuring AI Memory so Cursor can search your entire conversation history across ChatGPT, Claude, DeepSeek, and Gemini while you code.
TL;DR — Quick Cursor MCP Setup
- What you need: Cursor IDE installed (v0.45+)
- Config file: `.cursor/mcp.json` in your project root
- Setup time: Under 2 minutes
- AI Memory URL: `https://aimemory.pro/api/mcp`
- Result: Cursor can search all your past AI conversations
What is Cursor IDE?
Cursor is an AI-powered code editor built on top of VS Code that has taken the developer world by storm. It integrates large language models directly into your coding workflow, offering AI chat, intelligent code generation, multi-file editing, and — critically — full MCP (Model Context Protocol) support.
Unlike traditional code editors with bolt-on AI extensions, Cursor was designed from the ground up as an AI-first development environment. It understands your entire codebase, can refactor across multiple files, and now supports MCP servers that extend its capabilities even further.
Key Features of Cursor
- MCP Server Support — Connect to external tools and data sources via the Model Context Protocol
- AI Chat — Context-aware conversations about your code with Claude, GPT-4, and more
- Code Generation — Natural language to code with full project context
- Multi-File Editing — AI-powered refactoring across your entire codebase
- VS Code Compatible — Supports all VS Code extensions, themes, and keybindings
- Tab Autocomplete — Intelligent inline code suggestions as you type
- Terminal Integration — AI assistance directly in your terminal
Cursor vs VS Code vs Other AI Editors
| Feature | Cursor | VS Code | Windsurf |
|---|---|---|---|
| MCP Support | ✅ Full support | ⚠️ Via extensions only | ✅ Full support |
| Built-in AI Chat | ✅ Native | ⚠️ Copilot Chat | ✅ Native |
| VS Code Extensions | ✅ Full compatibility | ✅ Native | ✅ Full compatibility |
| Multi-Model Support | ✅ Claude, GPT, Gemini | ⚠️ Limited | ✅ Multiple models |
| Price | Free + Pro $20/mo | Free | Free + Pro $15/mo |
What is MCP (Model Context Protocol)?
The Model Context Protocol (MCP) is an open standard created by Anthropic that enables AI assistants and coding tools to connect to external tools and data sources through a universal interface. Think of it as USB for AI — before USB, every device had its own proprietary connector. Similarly, before MCP, every AI tool integration required custom code.
How MCP Works in Cursor
In the MCP architecture, Cursor acts as the MCP client, and external services (like AI Memory, GitHub, databases, or APIs) act as MCP servers. Communication happens over a standardized protocol that supports:
- Tools — Functions Cursor can call (e.g., search_memory, query_database)
- Resources — Data Cursor can read from the server
- Prompts — Pre-built prompt templates for common development tasks
- Sampling — Server-initiated AI completions for advanced workflows
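Under the hood, client and server exchange JSON-RPC 2.0 messages. As a rough sketch of a tool invocation (the `search_memory` tool name and its arguments here are illustrative, not a fixed API), Cursor might send the server something like:

```json
{
  "jsonrpc": "2.0",
  "id": 42,
  "method": "tools/call",
  "params": {
    "name": "search_memory",
    "arguments": { "query": "React hooks" }
  }
}
```

The server replies with a result message containing the tool's output, which Cursor feeds back into the model's context. You rarely see these messages yourself — Cursor handles the protocol for you.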
Why MCP Matters for Developers Using Cursor
MCP transforms Cursor from a smart code editor into a fully connected development environment. Instead of being limited to your local codebase, Cursor can now:
- Search your past conversations across ChatGPT, Claude, DeepSeek, and more
- Access external databases and APIs without leaving the editor
- Interact with CI/CD pipelines and deployment tools
- Read documentation from external knowledge bases
- Connect to project management tools like Jira, Linear, and GitHub Issues
As of 2026, over 113 MCP-compatible clients and servers are available in the ecosystem, with Cursor being one of the most popular developer-focused clients.
Prerequisites for Cursor MCP Setup
Before you begin setting up an MCP server in Cursor, make sure you have:
- Cursor IDE installed — Download from cursor.sh. Version 0.45 or later is required for MCP support.
- A Cursor account — Free tier works for basic MCP; Pro plan recommended for heavy usage and access to premium models.
- A project directory — MCP configuration is project-scoped, so you need an open project in Cursor.
- MCP server URL(s) — The URL(s) of the MCP server(s) you want to connect (e.g., `https://aimemory.pro/api/mcp`).
⚠️ Important: Cursor Version Required
MCP support was introduced in Cursor v0.45. Make sure your Cursor is up to date by going to Help → About Cursor and checking the version number. If you're on an older version, update Cursor before proceeding.
Step-by-Step: Cursor MCP Server Setup
Follow these steps to configure any MCP server in Cursor. The setup process takes under 2 minutes.
Step 1: Create the .cursor Directory and mcp.json File
Cursor uses a `.cursor/mcp.json` file in your project root for MCP configuration. Create this file if it doesn't exist:
```shell
# Navigate to your project root
cd /path/to/your/project

# Create the .cursor directory if it doesn't exist
mkdir -p .cursor

# Create the mcp.json file
touch .cursor/mcp.json
```
You can also create this file through Cursor's settings UI by navigating to Cursor Settings → MCP Servers → Add New MCP Server.
Step 2: Add MCP Server Configuration
Open `.cursor/mcp.json` and add your MCP server configuration. Here's the basic structure:
```json
{
  "mcpServers": {
    "server-name": {
      "url": "https://your-mcp-server-url.com/mcp",
      "transport": "http"
    }
  }
}
```

For local stdio-based servers, the configuration looks different:
```json
{
  "mcpServers": {
    "filesystem": {
      "command": "npx",
      "args": ["-y", "@modelcontextprotocol/server-filesystem", "/path/to/dir"]
    }
  }
}
```

Step 3: Configure AI Memory MCP Server (Recommended)
The AI Memory MCP server is one of the most powerful additions to your Cursor MCP setup. It lets Cursor search through all your saved AI conversations from ChatGPT, Claude, DeepSeek, Gemini, and other platforms.
Add this to your `.cursor/mcp.json`:
```json
{
  "mcpServers": {
    "ai-memory": {
      "url": "https://aimemory.pro/api/mcp",
      "transport": "http"
    }
  }
}
```

Step 4: Save and Verify the Connection
After saving the .cursor/mcp.json file, Cursor will automatically detect the configuration and attempt to connect to the MCP server. You can verify the connection by:
- Opening Cursor Settings → MCP Servers
- Looking for a green status indicator next to your server
- Opening the AI chat (Cmd+L or Ctrl+L) and testing a tool call
If the server shows a green status, start a new conversation in Cursor's AI chat and test it. For AI Memory, try typing: “Search my memory for conversations about React hooks”.
Step 5: Start Using MCP Tools in Cursor
Once connected, Cursor's AI assistant can automatically invoke MCP tools during conversations. You don't need to memorize tool names — simply describe what you want in natural language, and Cursor's AI will determine which MCP tool to call.
- “Search my AI Memory for the Docker networking issue I debugged last week”
- “Find context about the API authentication flow I discussed with Claude”
- “Save this code review discussion to my memory”
- “List my recent ChatGPT conversations about TypeScript”
Complete Cursor mcp.json Configuration Examples
Here are comprehensive mcp.json configurations for different use cases. Copy and customize these for your own setup.
Basic: AI Memory Only
```json
{
  "mcpServers": {
    "ai-memory": {
      "url": "https://aimemory.pro/api/mcp",
      "transport": "http"
    }
  }
}
```

Multi-Server: AI Memory + GitHub + Filesystem
```json
{
  "mcpServers": {
    "ai-memory": {
      "url": "https://aimemory.pro/api/mcp",
      "transport": "http"
    },
    "github": {
      "command": "npx",
      "args": ["-y", "@modelcontextprotocol/server-github"],
      "env": {
        "GITHUB_PERSONAL_ACCESS_TOKEN": "ghp_your_token_here"
      }
    },
    "filesystem": {
      "command": "npx",
      "args": [
        "-y",
        "@modelcontextprotocol/server-filesystem",
        "/Users/you/projects"
      ]
    }
  }
}
```

Full Stack: AI Memory + Postgres + Slack
```json
{
  "mcpServers": {
    "ai-memory": {
      "url": "https://aimemory.pro/api/mcp",
      "transport": "http"
    },
    "postgres": {
      "command": "npx",
      "args": [
        "-y",
        "@modelcontextprotocol/server-postgres",
        "postgresql://localhost:5432/mydb"
      ]
    },
    "slack": {
      "command": "npx",
      "args": ["-y", "@modelcontextprotocol/server-slack"],
      "env": {
        "SLACK_BOT_TOKEN": "xoxb-your-token",
        "SLACK_TEAM_ID": "T01234567"
      }
    }
  }
}
```

Understanding Transport Types
| Transport | Config Fields | Best For | Example |
|---|---|---|---|
| HTTP | `url`, `transport` | Cloud services, remote APIs | AI Memory, remote databases |
| stdio | `command`, `args`, `env` | Local tools, file system access | Filesystem, GitHub, Postgres |
AI Memory MCP Server: Deep Dive
AI Memory is one of the most powerful MCP servers you can connect to Cursor. It gives your AI coding assistant the ability to search through all your saved conversations from ChatGPT, Claude, DeepSeek, Gemini, and other AI platforms.
What AI Memory Provides via MCP
Once connected to Cursor, AI Memory exposes four powerful tools:
| Tool | Description | Example Use |
|---|---|---|
| `search_memory` | Full-text search across all saved conversations | “Find my discussion about Docker networking” |
| `add_memory` | Save new conversations or notes | “Save this debugging session about CORS” |
| `get_context` | Retrieve relevant context snippets for a topic | “Get context about the database migration” |
| `list_memories` | Browse recent conversations with filtering | “Show my latest ChatGPT conversations about APIs” |
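To make the table concrete, here is a sketch of what a successful `search_memory` response could look like on the wire. The exact result payload is up to the server, and the conversation snippet shown is hypothetical:

```json
{
  "jsonrpc": "2.0",
  "id": 7,
  "result": {
    "content": [
      {
        "type": "text",
        "text": "Match: 'Docker networking deep dive' (Claude, Jan 2026) — bridge vs host networking, port-mapping fix"
      }
    ]
  }
}
```

Cursor takes the returned text and injects it into the model's context, so the AI can answer with details from your past conversation.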
Why Developers Love AI Memory + Cursor
As a developer, your AI conversations contain valuable context — architecture decisions, debugging solutions, code patterns, and design discussions. Without AI Memory, this context is locked inside individual chat platforms. With AI Memory connected to Cursor via MCP:
- Never re-explain your architecture — Cursor can search your past AI conversations for the full context of your project decisions.
- Debug faster — Find solutions you've already discovered in ChatGPT or Claude without leaving your editor.
- Cross-platform knowledge — Access conversations from ChatGPT, Claude, DeepSeek, and Gemini all from one place.
- Contextual code generation — Cursor's AI can reference your past discussions to generate code that aligns with your existing patterns.
Testing the AI Memory Connection in Cursor
After adding AI Memory to your .cursor/mcp.json, open Cursor's AI chat (Cmd+L on macOS, Ctrl+L on Windows/Linux) and try these prompts:
- “Search my AI Memory for conversations about database optimization”
- “What did I discuss about this project's architecture last month?”
- “Find my notes about the authentication flow implementation”
- “Save this conversation about API rate limiting to my memory”
Adding Multiple MCP Servers to Cursor
Cursor supports multiple MCP servers simultaneously. Simply add each server as a separate entry in the "mcpServers" object. This lets you build a comprehensive development environment powered by multiple data sources and tools.
Each server runs independently, so if one server goes down, the others continue to work. Cursor's AI will intelligently route requests to the appropriate server based on context.
Recommended MCP Server Stack for Developers
- AI Memory — Search all your past AI conversations
- GitHub — Create issues, manage PRs, search repositories
- Filesystem — Enhanced file operations beyond Cursor's built-in support
- Database (Postgres/MySQL) — Query and explore your databases directly
- Slack/Discord — Stay connected with your team without leaving the editor
Troubleshooting: Cursor MCP Issues
Encountering problems with your Cursor MCP setup? Here are the most common issues and how to fix them.
Problem: MCP Server Not Appearing in Cursor
Symptoms: The server doesn't show up in Cursor Settings → MCP Servers.
Solutions:
- Verify the `.cursor/mcp.json` file exists in your project root (not a subdirectory)
- Validate the JSON syntax — use jsonlint.com to check
- Ensure `"mcpServers"` is at the top level of the JSON object
- Try closing and reopening your project in Cursor
- Check that you're running Cursor v0.45 or later
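If you prefer not to paste your config into a website, you can validate the JSON locally. One option is Python's built-in `json.tool` module (this assumes `python3` is on your PATH; the `/tmp` path below is just for the demo, so point the command at your real `.cursor/mcp.json` instead):

```shell
# Write a sample config, then check it with Python's strict JSON parser.
# Replace /tmp/mcp-demo.json with your project's .cursor/mcp.json path.
cat > /tmp/mcp-demo.json <<'EOF'
{
  "mcpServers": {
    "ai-memory": {
      "url": "https://aimemory.pro/api/mcp",
      "transport": "http"
    }
  }
}
EOF

# Exits non-zero and prints an error with line/column if the JSON is malformed.
python3 -m json.tool /tmp/mcp-demo.json > /dev/null && echo "valid JSON"
```

Any strict JSON validator works equally well; the point is to catch syntax errors before Cursor silently ignores the file.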
Problem: Server Shows Red / Connection Failed
Symptoms: The server appears but shows a red or error status.
Solutions:
- Verify the server URL is correct and accessible from your browser
- Check your internet connection and firewall settings
- For HTTP servers, ensure the URL includes `https://`
- Try reloading the Cursor window (Cmd+Shift+P → “Reload Window”)
- Check if the MCP server itself is experiencing downtime
Problem: JSON Parse Error
Symptoms: Cursor fails to load MCP configuration or shows a parse error.
Solutions:
- Check for trailing commas (not allowed in JSON)
- Ensure all strings use double quotes, not single quotes
- Verify matching curly braces `{ }` and square brackets `[ ]`
- Use Cursor's built-in JSON validation (it highlights errors automatically)
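Trailing commas are the most common culprit. A quick way to see how a strict parser reacts, again using `python3 -m json.tool` as the validator:

```shell
# The trailing comma after "http" makes this invalid JSON,
# so json.tool exits non-zero and the || branch runs.
echo '{ "mcpServers": { "ai-memory": { "transport": "http", } } }' \
  | python3 -m json.tool > /dev/null 2>&1 \
  && echo "parsed OK" \
  || echo "invalid JSON (trailing comma)"
# → invalid JSON (trailing comma)
```

Deleting the comma after `"http"` makes the same snippet parse cleanly.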
Problem: Tools Not Working in AI Chat
Symptoms: Green status but Cursor's AI says it can't use the tools.
Solutions:
- Start a new conversation after connecting the server
- Ensure you're using an MCP-capable model (Claude 3.5 Sonnet or later recommended)
- Check if the server requires authentication (API key, OAuth, etc.)
- Verify the server is exposing tools correctly (check server documentation)
- Make sure the MCP server toggle is enabled in Cursor Settings → MCP Servers
Problem: Performance Issues / Slow Responses
Symptoms: Cursor is slow to respond when MCP tools are active.
Solutions:
- Reduce the number of active MCP servers if you have many configured
- Check your network latency to remote MCP servers
- For local stdio servers, ensure your system has enough resources
- Disable MCP servers you're not actively using via the toggle in settings
Pro Tips for Cursor MCP Setup
Get the most out of your Cursor MCP configuration with these expert tips:
1. Use Project-Specific Configurations
Since Cursor's MCP config is project-scoped (in .cursor/mcp.json), you can have different MCP servers for different projects. A frontend project might connect to a design system server, while a backend project connects to a database server.
2. Add .cursor/mcp.json to .gitignore for Sensitive Data
If your mcp.json contains API keys or tokens, add it to your .gitignore file to avoid committing sensitive credentials:
```
# .gitignore
.cursor/mcp.json
```
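To confirm the rule works, `git check-ignore` reports whether a path is ignored. A quick sketch in a throwaway repo (the `/tmp` path is illustrative; run the check from your real project root instead):

```shell
# Set up a throwaway repo with the ignore rule and a config file.
rm -rf /tmp/mcp-ignore-demo
mkdir -p /tmp/mcp-ignore-demo/.cursor
cd /tmp/mcp-ignore-demo
git init -q
echo '.cursor/mcp.json' > .gitignore
echo '{}' > .cursor/mcp.json

# check-ignore prints the path and exits 0 when the file is ignored.
git check-ignore .cursor/mcp.json && echo "ignored: will not be committed"
```

If the file was already committed before you added the rule, you also need `git rm --cached .cursor/mcp.json` to stop tracking it.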
3. Start with AI Memory as Your First MCP Server
If you're new to MCP, start with AI Memory as your first and only server. It requires no API keys or local setup — just add the URL and start searching your conversations. Once comfortable, add more servers.
4. Name Servers Descriptively
Use clear, descriptive names in your config. Instead of "server1", use "ai-memory" or "github-tools". This helps Cursor's AI understand what each server does and when to use it.
5. Combine MCP with Cursor's Built-in Features
MCP tools complement Cursor's existing capabilities. Use Cmd+K for inline code generation, Cmd+L for AI chat with MCP tools, and Cmd+I for the Composer multi-file editing mode — all enhanced by your connected MCP servers.
6. Leverage AI Memory for Onboarding
When joining a new project or team, search your AI Memory for past discussions about the codebase. This is faster than reading documentation and gives you context that docs often miss.
Security Considerations for Cursor MCP
When setting up MCP servers in Cursor, keep these security best practices in mind:
- Only connect trusted servers — MCP servers can execute tools and access data. Only connect to servers from reputable sources.
- Use HTTPS URLs — Always use `https://` for remote MCP servers to ensure encrypted communication.
- Protect API keys — Store sensitive tokens in environment variables rather than directly in `mcp.json`, or add the file to `.gitignore`.
- Review tool permissions — Understand what each MCP server's tools can do before enabling them, especially write-capable tools.
- Keep Cursor updated — Regularly update to the latest version for security patches and MCP improvements.
🔒 AI Memory Security
AI Memory's MCP server is designed with privacy as a priority. All data is stored securely, the MCP endpoint only accesses your own conversations, authentication is handled via API keys, and there is no third-party data sharing. You can also self-host for complete control.
Frequently Asked Questions: MCP with Cursor
Is Cursor free?
Yes, Cursor offers a free tier with limited AI completions and basic MCP support. The Pro plan ($20/month) provides unlimited completions, access to premium models, and enhanced MCP capabilities.
Can I use MCP servers in the Cursor free plan?
Yes, MCP server support is available on all Cursor plans, including free. However, the free plan has limited AI model access and usage quotas that may affect how frequently you can use MCP tools.
How many MCP servers can I add to Cursor?
There's no hard limit on the number of MCP servers you can configure in Cursor. However, adding too many may impact performance. We recommend 3-5 servers for the optimal balance of capability and speed.
Do I need to restart Cursor after changing mcp.json?
Cursor automatically detects changes to `.cursor/mcp.json` in most cases. If changes aren't picked up, use Cmd+Shift+P (or Ctrl+Shift+P) and run “Reload Window” to force a refresh.
Can I use both local and remote MCP servers in Cursor?
Absolutely! Cursor supports both HTTP-based remote MCP servers (like AI Memory) and stdio-based local servers (like filesystem or GitHub) in the same configuration. Mix and match as needed.
Does Cursor MCP work with all AI models?
MCP tool calling works best with Claude 3.5 Sonnet and later models. GPT-4 and Gemini models also support MCP tools in Cursor, but Claude models tend to have the most reliable MCP integration due to Anthropic's direct involvement in the protocol.
Get Started with Cursor MCP Today
Setting up MCP with Cursor takes under 2 minutes and transforms how you develop software. Start by connecting the AI Memory MCP server to give Cursor access to your entire conversation history across all AI platforms.
Ready to supercharge your Cursor MCP setup? Sign up for AI Memory and add the MCP server URL https://aimemory.pro/api/mcp to your Cursor configuration today. Never lose context between AI conversations again.
For more MCP setup guides, check out our Claude Desktop MCP setup guide and complete MCP server setup guide covering Claude Desktop, Cursor, Windsurf, Cline, and more.