How to Set Up MCP Server: Complete Guide for AI Memory
The Model Context Protocol (MCP) is transforming how AI assistants interact with external tools and data sources. In this comprehensive guide, you'll learn exactly how to set up an MCP server — specifically AI Memory's MCP server — with every major AI client including Claude Desktop, Cursor, Windsurf, and Cline. By the end, your AI assistants will be able to search your entire conversation history across ChatGPT, Claude, DeepSeek, and Gemini.
TL;DR — Quick MCP Server Setup
- What is MCP? An open standard for connecting AI tools to external data sources
- Why it matters: Lets your AI search past conversations across all platforms
- Setup time: Under 2 minutes for any MCP client
- Best for: Developers, researchers, and anyone who uses multiple AI tools
- AI Memory URL: https://aimemory.pro/api/mcp
What is MCP (Model Context Protocol)?
The Model Context Protocol, or MCP, is an open standard created by Anthropic that enables AI assistants to connect to external tools and data sources through a universal interface. Launched in late 2024, MCP has rapidly become the dominant standard for AI tool integration.
Think of MCP like USB for AI. Before USB, every device had its own proprietary connector. Similarly, before MCP, every AI tool integration required custom code. MCP provides a single, standardized way for any AI assistant to connect to any data source or tool.
Why MCP Matters for Your AI Workflow
As of 2026, over 113 MCP clients support the protocol. This means the MCP server setup you do once will work across dozens of AI tools. Here's why this is a game-changer:
- Universal compatibility — One server works with Claude Desktop, Cursor, Windsurf, Cline, and 100+ other tools
- Standardized interface — No custom integrations needed for each AI client
- Growing ecosystem — New MCP servers and clients are added weekly
- Open standard — No vendor lock-in; switch clients without losing integrations
- Bidirectional — AI tools can both read from and write to MCP servers
How MCP Works
MCP uses a client-server architecture. The MCP client (your AI tool) connects to an MCP server (a data source or tool). Communication happens over a standardized protocol that supports:
- Tools — Functions the AI can call (like searching, adding data)
- Resources — Data the AI can read
- Prompts — Pre-built prompt templates
- Sampling — Server-initiated AI completions
For AI Memory, the MCP server exposes tools that let your AI assistant search through your saved conversations, add new memories, and retrieve context — all through the standard MCP protocol.
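Concretely, MCP messages are JSON-RPC 2.0. The sketch below builds a client-side tool-call request; `tools/call` is the method name from the MCP specification, while the `search_memory` tool name and its arguments are illustrative here.

```python
import json

# Minimal sketch of the JSON-RPC 2.0 envelope MCP uses for a tool call.
# "tools/call" comes from the MCP spec; the tool name and arguments
# shown here are illustrative, not a fixed schema.
def make_tool_call(request_id: int, tool_name: str, arguments: dict) -> str:
    """Build an MCP 'tools/call' request as a serialized JSON-RPC 2.0 message."""
    message = {
        "jsonrpc": "2.0",
        "id": request_id,
        "method": "tools/call",
        "params": {"name": tool_name, "arguments": arguments},
    }
    return json.dumps(message)

# Example: ask the server to run a search tool with a query and result limit.
request_body = make_tool_call(1, "search_memory", {"query": "React hooks", "limit": 5})
```

Every MCP client builds messages of this shape for you, so this is only useful for understanding the wire format or for debugging.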
MCP Client Comparison: Which Tool Should You Use?
Not sure which MCP client to use? Here's a comprehensive comparison of the most popular MCP-compatible tools and how they work with AI Memory's MCP server:
| MCP Client | Type | Transport | Setup Difficulty | Best For |
|---|---|---|---|---|
| Claude Desktop | Desktop App | HTTP / stdio | ⭐⭐ Easy | General AI conversations, research |
| Cursor | Code Editor | HTTP / stdio | ⭐ Easiest | Coding with AI assistance |
| Windsurf | Code Editor | HTTP / stdio | ⭐⭐ Easy | AI-powered development |
| Cline | VS Code Extension | HTTP / stdio | ⭐⭐ Easy | VS Code users, autonomous coding |
| Continue | VS Code / JetBrains | HTTP / stdio | ⭐⭐⭐ Medium | Multi-IDE support |
| Zed | Code Editor | HTTP | ⭐⭐ Easy | High-performance editing |
| Sourcegraph Cody | IDE Extension | HTTP | ⭐⭐⭐ Medium | Code intelligence, large codebases |
| OpenAI ChatGPT | Web / Desktop | HTTP | ⭐⭐ Easy | General conversations with tools |
All of these clients support the HTTP transport method, which is what AI Memory's MCP server uses. This means setup is straightforward regardless of which tool you choose. Below, we provide step-by-step instructions for each major client.
Step-by-Step: Claude Desktop MCP Server Setup
Claude Desktop has the most mature MCP support, as the protocol was created by Anthropic. Setting up the Model Context Protocol with Claude Desktop takes under 2 minutes.
Step 1: Locate Your Configuration File
Find the Claude Desktop configuration file on your system:
- macOS: ~/Library/Application Support/Claude/claude_desktop_config.json
- Windows: %APPDATA%\Claude\claude_desktop_config.json
- Linux: ~/.config/Claude/claude_desktop_config.json
Step 2: Add the AI Memory MCP Server Configuration
Open the configuration file in any text editor and add the AI Memory MCP server:
```json
{
  "mcpServers": {
    "ai-memory": {
      "url": "https://aimemory.pro/api/mcp",
      "transport": "http"
    }
  }
}
```

If you already have other MCP servers configured, simply add the "ai-memory" entry inside the "mcpServers" object alongside your existing servers.
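If you prefer to script this edit, here is a small Python sketch that merges the ai-memory entry into an existing config string without disturbing other servers (it assumes the standard `mcpServers` layout shown above):

```python
import json

# The ai-memory server entry from this guide.
AI_MEMORY_ENTRY = {"url": "https://aimemory.pro/api/mcp", "transport": "http"}

def add_ai_memory(config_text: str) -> str:
    """Return the config JSON with an ai-memory server entry added,
    preserving any servers that are already configured."""
    config = json.loads(config_text) if config_text.strip() else {}
    servers = config.setdefault("mcpServers", {})
    servers["ai-memory"] = dict(AI_MEMORY_ENTRY)
    return json.dumps(config, indent=2)
```

Run it against the contents of your `claude_desktop_config.json` and write the result back; it also handles an empty or missing config by creating the `mcpServers` object from scratch.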
Step 3: Restart Claude Desktop
Save the configuration file and completely quit Claude Desktop (not just close the window). Reopen it and navigate to Settings → MCP Servers. You should see "AI Memory" listed with a green status indicator showing the connection is active.
Step 4: Test the Connection
Start a new conversation with Claude and try these prompts:
- “Search my AI Memory for conversations about React hooks”
- “List my recent memories from ChatGPT”
- “Get context about the database migration I discussed last week”
Claude will automatically call the appropriate MCP tools and return results from your AI Memory database.
Step-by-Step: Cursor MCP Server Setup
Cursor has excellent MCP support with the easiest setup process. If you're a developer who uses Cursor as your primary editor, connecting AI Memory gives your coding assistant access to your entire conversation history.
Step 1: Open MCP Settings
In Cursor, navigate to Settings → Features → MCP Servers. You can also access this via the command palette (Cmd+Shift+P / Ctrl+Shift+P) by searching for “MCP”.
Step 2: Add a New MCP Server
Click “Add New MCP Server” and enter the following details:
- Name: AI Memory
- Type: HTTP
- URL: https://aimemory.pro/api/mcp
Alternatively, you can edit the .cursor/mcp.json file in your project directory:
```json
{
  "mcpServers": {
    "ai-memory": {
      "url": "https://aimemory.pro/api/mcp",
      "transport": "http"
    }
  }
}
```

Step 3: Verify the Connection
Cursor will automatically verify the connection when you save. A green checkmark indicates success. You can now use AI Memory tools in Cursor's AI chat and composer.
Using AI Memory in Cursor
Once connected, you can reference past conversations in your coding workflow:
- “Search my memory for the API design discussion I had with ChatGPT”
- “Find my notes about the authentication refactor”
- “What did I discuss about WebSocket vs SSE last month?”
Step-by-Step: Windsurf MCP Server Setup
Windsurf (formerly Codeium) supports MCP servers natively. Here's how to set up the AI Memory MCP server in Windsurf:
Step 1: Open Windsurf Settings
Go to Windsurf → Settings → Cascade → MCP Servers. Click the option to add a new MCP server.
Step 2: Configure the Server
You can configure the MCP server through the UI or by editing the configuration file directly. The Windsurf MCP config file is located at:
- macOS: ~/.codeium/windsurf/mcp_config.json
- Windows: %USERPROFILE%\.codeium\windsurf\mcp_config.json
Add the AI Memory server configuration:
```json
{
  "mcpServers": {
    "ai-memory": {
      "serverUrl": "https://aimemory.pro/api/mcp"
    }
  }
}
```

Step 3: Restart and Verify
Restart Windsurf and open the Cascade panel. You should see AI Memory listed as an available MCP server. Try asking Cascade to search your memory to verify everything is working.
Step-by-Step: Cline MCP Server Setup
Cline is a popular VS Code extension for autonomous AI coding. It has full MCP support, making it easy to connect AI Memory.
Step 1: Open Cline Settings
In VS Code, open the Cline extension panel. Click the MCP Servers icon (looks like a plug) in the Cline sidebar, or use the command palette to search for “Cline: MCP Servers”.
Step 2: Edit the MCP Configuration
Click “Edit Configuration” to open the MCP settings file. Add the AI Memory server:
```json
{
  "mcpServers": {
    "ai-memory": {
      "url": "https://aimemory.pro/api/mcp",
      "transport": "http"
    }
  }
}
```

Step 3: Verify and Test
Save the configuration and Cline will automatically attempt to connect. Once connected, Cline can search your AI Memory when helping you code. You can explicitly ask it to look up past discussions or let it automatically retrieve context when relevant.
Setting Up AI Memory with Other MCP Clients
The MCP server setup is similar across all clients. For any MCP-compatible tool, you need:
- Server URL: https://aimemory.pro/api/mcp
- Transport: HTTP (also called “Streamable HTTP”)
- Name: Any name you prefer (e.g., “ai-memory”)
Continue (VS Code / JetBrains)
Edit your Continue configuration file at ~/.continue/config.json and add the MCP server under the "mcpServers" key. Continue supports both HTTP and stdio transports.
Zed Editor
In Zed, open Settings and navigate to the MCP section. Add a new HTTP server with the AI Memory URL. Zed will connect automatically and make the tools available in its AI assistant panel.
Custom / Programmatic Clients
If you're building a custom MCP client, connect to https://aimemory.pro/api/mcp using the MCP SDK. The server supports the Streamable HTTP transport, which works over standard HTTP requests with optional Server-Sent Events for streaming responses.
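As a rough illustration of what such a client sends, the sketch below builds the `initialize` handshake and a follow-up `tools/call` request using only the standard library. The Bearer auth header and the exact `protocolVersion` string are assumptions, so check the MCP documentation for the current values; real clients should also handle Server-Sent Event responses.

```python
import json
import urllib.request

MCP_URL = "https://aimemory.pro/api/mcp"

def mcp_request(method: str, params: dict, request_id: int = 1) -> dict:
    """Build one JSON-RPC 2.0 message for the MCP Streamable HTTP transport."""
    return {"jsonrpc": "2.0", "id": request_id, "method": method, "params": params}

def post_message(message: dict, api_key: str) -> bytes:
    """POST a single MCP message; streaming (SSE) responses are not handled here."""
    req = urllib.request.Request(
        MCP_URL,
        data=json.dumps(message).encode(),
        headers={
            "Content-Type": "application/json",
            "Accept": "application/json, text/event-stream",
            "Authorization": f"Bearer {api_key}",  # auth scheme is an assumption
        },
    )
    with urllib.request.urlopen(req) as resp:
        return resp.read()

# The MCP session starts with an 'initialize' request, then tool calls follow.
init = mcp_request("initialize", {
    "protocolVersion": "2025-03-26",  # illustrative spec revision
    "capabilities": {},
    "clientInfo": {"name": "example-client", "version": "0.1"},
})
search = mcp_request("tools/call", {
    "name": "search_memory",
    "arguments": {"query": "OAuth2 PKCE", "limit": 5},
}, request_id=2)
```

In practice you would use the official MCP SDK for your language rather than hand-rolling the transport; the sketch only shows the message shapes involved.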
AI Memory MCP Server: Tool Reference
AI Memory's MCP server exposes four powerful tools that give your AI assistants access to your entire conversation history. Here's a detailed look at each tool and how to use them:
1. search_memory
Full-text search across all your saved conversations. This is the most commonly used tool. It performs a semantic and keyword search across your ChatGPT, Claude, DeepSeek, and Gemini conversations.
Parameters:
- query (required) — The search query text
- platform (optional) — Filter by platform: “chatgpt”, “claude”, “deepseek”, “gemini”
- limit (optional) — Maximum number of results to return (default: 10)
Example usage in conversation:
“Search my AI Memory for conversations about Kubernetes deployment strategies”
2. add_memory
Save new conversations or notes directly from your AI assistant. This is useful for preserving important insights or decisions made during a conversation so you can reference them later.
Parameters:
- content (required) — The conversation content or note to save
- title (optional) — A title for the memory
- platform (optional) — Source platform tag
- tags (optional) — Array of tags for categorization
Example usage in conversation:
“Save this conversation about the new API architecture to my AI Memory”
3. get_context
Retrieve relevant context snippets for a specific topic. This tool is designed for AI assistants that need background information before answering a question. It returns the most relevant excerpts from your conversation history.
Parameters:
- topic (required) — The topic to find context for
- max_snippets (optional) — Maximum number of context snippets (default: 5)
Example usage in conversation:
“Get context about my previous discussions on microservices vs monolith”
4. list_memories
Browse your recent conversations with filtering and pagination. Use this to see what you've saved recently or to browse conversations from a specific platform.
Parameters:
- platform (optional) — Filter by platform
- limit (optional) — Number of results per page (default: 20)
- offset (optional) — Pagination offset
Example usage in conversation:
“List my 10 most recent memories from Claude”
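To make the four tools concrete, here is a small Python sketch that restates their parameters and checks a proposed call against them. The schema table is transcribed from the descriptions above, not fetched from the live server, so treat it as an approximation.

```python
# Required and optional parameters for each tool, as described in this guide.
TOOL_SCHEMAS = {
    "search_memory": {"required": ["query"], "optional": ["platform", "limit"]},
    "add_memory": {"required": ["content"], "optional": ["title", "platform", "tags"]},
    "get_context": {"required": ["topic"], "optional": ["max_snippets"]},
    "list_memories": {"required": [], "optional": ["platform", "limit", "offset"]},
}

def validate_arguments(tool: str, arguments: dict) -> list:
    """Return a list of problems with a proposed tool call (empty if it looks OK)."""
    schema = TOOL_SCHEMAS[tool]
    allowed = set(schema["required"]) | set(schema["optional"])
    problems = [f"missing required '{p}'" for p in schema["required"] if p not in arguments]
    problems += [f"unknown parameter '{p}'" for p in arguments if p not in allowed]
    return problems
```

MCP clients normally discover the authoritative schemas at runtime via the `tools/list` method, so a check like this is mainly useful for offline sanity tests.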
Real-World Use Cases
Here are practical scenarios where setting up an MCP server with AI Memory transforms your workflow:
Use Case 1: Developer Searching Past Code Discussions
The Problem:
Sarah is a full-stack developer who uses ChatGPT for architecture discussions and Cursor for coding. Three months ago, she had a detailed conversation with ChatGPT about implementing OAuth2 with PKCE flow. Now she's implementing it in Cursor and can't remember the specifics.
With MCP Server:
Sarah simply asks Cursor: “Search my AI Memory for the OAuth2 PKCE implementation discussion I had with ChatGPT.” Cursor finds the conversation, extracts the relevant code snippets and design decisions, and incorporates them directly into the current coding session. No context switching, no re-explaining, no lost time.
Use Case 2: Researcher Finding Cross-Platform Insights
The Problem:
Dr. Chen is a researcher who discusses papers across multiple AI tools — ChatGPT for brainstorming, Claude for detailed analysis, and DeepSeek for code generation. His insights are scattered across all three platforms.
With MCP Server:
Dr. Chen connects AI Memory's MCP server to Claude Desktop. Now he can ask: “Search all my memories for discussions about transformer attention mechanisms.” Claude finds relevant discussions from ChatGPT, Claude, and DeepSeek — all in one search. The AI can synthesize insights across all his past conversations, giving him a comprehensive view of his research discussions.
Use Case 3: Team Knowledge Sharing
The Problem:
A development team has multiple members using different AI tools. Knowledge is siloed — the frontend developer's ChatGPT discussions about component architecture are inaccessible to the backend developer using Claude.
With MCP Server:
Team members upload their conversations to a shared AI Memory instance. When any team member connects via MCP, they can search across the entire team's knowledge base. A simple query like “What has the team discussed about the payment processing system?” returns insights from everyone's conversations, breaking down knowledge silos.
Use Case 4: Personal AI Knowledge Base
The Problem:
Alex uses AI assistants daily for learning new technologies, debugging, and brainstorming. Over time, hundreds of valuable conversations are buried in different platforms with no way to search across them.
With MCP Server:
Alex exports all conversations into AI Memory and connects via MCP to Claude Desktop. Now conversations become a searchable knowledge base. When starting a new project, Alex asks: “What have I learned about Docker networking in past conversations?” The AI retrieves relevant discussions, helping Alex build on past knowledge instead of starting from scratch.
MCP Server Setup Troubleshooting
Having trouble with your Model Context Protocol setup? Here are common issues and solutions:
Server Not Showing Up
- Verify the URL is exactly https://aimemory.pro/api/mcp
- Ensure the transport is set to “http” (or “Streamable HTTP”)
- Restart your MCP client after changing the configuration
- Check that your configuration file is valid JSON
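A quick way to rule out the last item is a small validation script. This sketch assumes the `mcpServers` config layout used throughout this guide:

```python
import json

def check_config(path: str) -> str:
    """Report whether an MCP config file parses and contains an ai-memory entry."""
    try:
        with open(path) as f:
            config = json.load(f)
    except FileNotFoundError:
        return "file not found"
    except json.JSONDecodeError as e:
        return f"invalid JSON: {e}"
    servers = config.get("mcpServers", {})
    if "ai-memory" not in servers:
        return "parsed OK, but no 'ai-memory' entry under mcpServers"
    return "looks good"
```

Point it at your client's config path (for Claude Desktop on macOS, that is the `claude_desktop_config.json` file listed earlier) and it will tell you whether a stray comma or missing brace is the culprit.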
Connection Errors
- Check your internet connection
- Verify there are no firewall rules blocking outbound HTTPS connections
- Try accessing https://aimemory.pro in your browser to confirm the site is reachable
- Check your AI Memory account status and API key if authentication is required
Tools Not Working
- Make sure you have uploaded conversations to AI Memory before searching
- Try a simple search query first to test the connection
- Check the MCP server logs in your client for detailed error messages
- Verify your AI Memory API key has the necessary permissions
Performance Tips
- Use specific search queries rather than very broad terms
- Filter by platform when you know which source you're looking for
- Use the limit parameter to control result count
- Keep your AI Memory database organized with tags for faster retrieval
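The limit and offset parameters combine naturally into pagination. The sketch below wraps any list_memories-style callable; the `fetch` function is a stand-in for however your client actually invokes the tool, not a real API.

```python
def paginate(fetch, limit=20):
    """Yield every item from a list_memories-style endpoint, page by page.

    `fetch(limit=..., offset=...)` is assumed to return a list that is
    empty once the offset runs past the end of the data.
    """
    offset = 0
    while True:
        page = fetch(limit=limit, offset=offset)
        if not page:
            return  # an empty page signals the end of the results
        yield from page
        offset += limit
```

Keeping `limit` modest (the default of 20 is fine) avoids pulling large result sets into the AI's context window in one shot.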
Privacy and Security
Security is a top priority for AI Memory's MCP server implementation. Here's how your data is protected:
- Local storage by default — Your conversation data is stored locally. The MCP server reads from your local database, not a remote server.
- API key authentication — All MCP requests are authenticated using your personal API key. No one else can access your data.
- Self-hosting option — For maximum control, you can self-host the AI Memory MCP server on your own infrastructure.
- No third-party sharing — Your conversation data is never shared with third parties or used for model training.
- Encrypted transport — All MCP communication happens over HTTPS, ensuring your data is encrypted in transit.
Getting Started with AI Memory MCP Server
Ready to give your AI assistants a memory upgrade? Here's how to get started in under 5 minutes:
- Upload your conversations — Visit aimemory.pro and upload your ChatGPT, Claude, DeepSeek, or Gemini conversation exports. AI Memory supports all major export formats.
- Connect via MCP — Follow the setup guide above for your preferred AI tool (Claude Desktop, Cursor, Windsurf, Cline, or any other MCP client).
- Start searching — Ask your AI assistant to search your memory. Try queries like “What have I discussed about [topic]?” or “Find my notes about [project].”
- Save new insights — Use the add_memory tool to save important conversations or notes directly from your AI assistant for future reference.
Pro Tip: Enable Auto-Context
Some MCP clients support automatic tool invocation, where the AI proactively searches your memory without being asked. This means your AI assistant automatically pulls relevant past discussions into new conversations — creating a truly persistent memory across all your AI interactions.
Frequently Asked Questions
What is the difference between MCP and API integration?
While traditional API integrations require custom code for each tool, MCP provides a universal standard. One MCP server works with any MCP client — no custom integration code needed. This means AI Memory's MCP server works out of the box with 113+ clients.
Do I need a paid AI Memory account to use the MCP server?
AI Memory offers a free tier that includes MCP server access. For larger conversation libraries and advanced features, paid plans are available. Visit our pricing page for details.
Can I use AI Memory MCP server with multiple clients simultaneously?
Yes! You can connect AI Memory's MCP server to Claude Desktop, Cursor, Windsurf, and Cline all at the same time. Each client maintains its own connection and can independently search and add memories.
How fast is the search through MCP?
Search speed depends on the size of your conversation library. For most users with hundreds of conversations, searches return results in under 1 second. The MCP server uses optimized indexing for fast full-text and semantic search.
Does the MCP server work offline?
If you self-host the AI Memory MCP server locally, it can work without an internet connection. The hosted version at aimemory.pro requires an internet connection but provides a hassle-free experience with no setup required.
For the full API reference and additional setup guides, see the MCP documentation page. For a comparison of AI Memory with other memory management tools, check out our AI Memory Management Tools 2026 guide.