MCP Server Quickstart
Connect AI Memory to Claude Desktop, Cursor, and 113+ MCP clients in under 2 minutes. Your AI assistant gets instant access to your conversation history.
1. What is MCP?
The Model Context Protocol (MCP) is an open standard created by Anthropic that lets AI assistants connect to external data sources and tools. Think of it as "USB-C for AI": a universal connector that works with any MCP-compatible AI client.
AI Memory's MCP Server implements this protocol, giving your AI assistant the ability to search, save, and manage your conversation memories. Once connected, you can ask Claude or Cursor things like:
- "Search my memory for discussions about React hooks"
- "What did I talk about with Claude regarding database optimization last week?"
- "Save this conversation summary to my memory with tag 'project-ideas'"
- "List my recent memories about machine learning"
2. Remote vs Local Mode
AI Memory supports two modes. Choose the one that fits your workflow:
| Feature | Remote Mode (Recommended) | Local Mode |
|---|---|---|
| Transport | HTTP | stdio |
| Endpoint | https://aimemory.pro/api/mcp | python3 server.py |
| Setup | Just add URL to config | Install Python + dependencies |
| Requires server running | No (hosted) | Yes (local process) |
| Offline support | No | Yes |
| Data location | aimemory.pro server | Your local machine |
| Best for | Quick setup, getting started | Privacy, offline, self-hosting |
3. Claude Desktop Setup
Claude Desktop
Anthropic's desktop app with native MCP support
Remote Mode (Recommended)
Edit your Claude Desktop config file:
macOS: ~/Library/Application Support/Claude/claude_desktop_config.json
Windows: %APPDATA%\Claude\claude_desktop_config.json
Linux: ~/.config/claude-desktop/claude_desktop_config.json
```json
{
  "mcpServers": {
    "ai-memory": {
      "url": "https://aimemory.pro/api/mcp",
      "transport": "http"
    }
  }
}
```

Local Mode (stdio)
If you prefer to run the server locally:
```shell
pip install fastmcp
```
```json
{
  "mcpServers": {
    "ai-memory": {
      "command": "python3",
      "args": ["/path/to/mcp-server/server.py"],
      "env": {
        "AIMEMORY_DB": "/path/to/aimemory.db"
      }
    }
  }
}
```

✅ Done! Restart Claude Desktop. You should see "ai-memory" in the MCP servers list. Try asking: "Search my memory for discussions about Python"
4. Cursor Setup
Cursor IDE
AI-first code editor with MCP support
Create or edit .cursor/mcp.json in your project root (or ~/.cursor/mcp.json for global config):
```json
{
  "mcpServers": {
    "ai-memory": {
      "url": "https://aimemory.pro/api/mcp",
      "transport": "http"
    }
  }
}
```

💡 Tip: In Cursor, you can use the MCP tools in the AI chat panel. The tools appear automatically after configuration. Use @ai-memory to reference the server.
5. Windsurf Setup
Windsurf (Codeium)
AI-powered IDE with MCP integration
Open Windsurf settings and navigate to the MCP Servers section. Add a new server:
```json
{
  "mcpServers": {
    "ai-memory": {
      "serverUrl": "https://aimemory.pro/api/mcp",
      "transport": "http"
    }
  }
}
```

6. VS Code Setup
VS Code
With Continue extension or MCP extension
Using the Continue extension, add to your .continue/config.json:
```json
{
  "mcpServers": [
    {
      "name": "ai-memory",
      "serverUrl": "https://aimemory.pro/api/mcp"
    }
  ]
}
```

Or using the MCP for VS Code extension, add the server URL directly in the extension settings.
7. Cline Setup
Cline
Autonomous coding agent with MCP support
Open Cline settings and add the MCP server configuration:
```json
{
  "mcpServers": {
    "ai-memory": {
      "url": "https://aimemory.pro/api/mcp",
      "transport": "http"
    }
  }
}
```

8. Available Tools
The AI Memory MCP Server exposes 5 tools that your AI assistant can use:
- `search_memories`: Full-text search across all your saved conversations. Supports FTS5 syntax (AND, OR, phrase matching, proximity search).
  Parameters: `query` (required), `limit` (default: 10)
- `save_memory`: Save a new memory or conversation snippet to your knowledge base. Supports tags and source tracking.
  Parameters: `content` (required), `tags` (optional), `source` (optional)
- `list_memories`: Browse recent memories with optional tag filtering and pagination. Returns newest first.
  Parameters: `limit` (default: 20), `tag` (optional)
- `update_memory`: Update an existing memory's content and/or tags.
  Parameters: `memory_id` (required), `content` (optional), `tags` (optional)
- `delete_memory`: Permanently delete a memory by its ID.
  Parameters: `memory_id` (required)
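Since `search_memories` accepts FTS5 syntax, it helps to see what those queries look like. The sketch below is illustrative only: it builds a throwaway in-memory SQLite FTS5 table (the actual AI Memory schema and table names are assumptions) just to demonstrate the quoting and boolean operators you can pass in the `query` parameter.

```python
import sqlite3

# Illustrative only: a toy FTS5 table to demonstrate the query syntax
# that search_memories accepts. The real schema will differ.
con = sqlite3.connect(":memory:")
con.execute("CREATE VIRTUAL TABLE memories USING fts5(content)")
con.executemany("INSERT INTO memories VALUES (?)", [
    ("Discussed React hooks and useEffect cleanup",),
    ("Database optimization: added a covering index",),
    ("React performance optimization with memoization",),
])

# Phrase terms combined with AND, as supported by FTS5
rows = con.execute(
    "SELECT content FROM memories WHERE memories MATCH ?",
    ('"React" AND "optimization"',),
).fetchall()
print(rows)  # only the row containing both terms matches
```

The same string (`"React" AND "optimization"`) is what you would pass as the `query` argument; `OR` and `NEAR(a b, N)` proximity queries follow the same pattern.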
9. Usage Examples
Once connected, you can naturally ask your AI assistant to use your memory. Here are some examples:
🔍 Searching memories
"Search my memory for discussions about React performance optimization"
→ Calls search_memories with query "React performance optimization"
📝 Saving a conversation
"Save this conversation summary to my memory with the tag 'project-ideas'"
→ Calls save_memory with the summary content and tag
📋 Listing recent memories
"Show me my 10 most recent memories about machine learning"
→ Calls list_memories with tag "machine-learning" and limit 10
🧠 Getting context
"Based on my previous conversations, what approach did we decide on for the database migration?"
→ Calls search_memories with "database migration" and synthesizes the answer
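Under the hood, each of these natural-language requests becomes a JSON-RPC `tools/call` message from the MCP client to the server. A minimal sketch of that message, following the MCP specification's `params` shape (the `id` value and argument values here are arbitrary examples):

```python
import json

# Sketch of the JSON-RPC request an MCP client sends when the
# assistant decides to invoke the search_memories tool.
request = {
    "jsonrpc": "2.0",
    "id": 1,
    "method": "tools/call",
    "params": {
        "name": "search_memories",
        "arguments": {"query": "React performance optimization", "limit": 10},
    },
}
print(json.dumps(request, indent=2))
```

You never write this payload yourself; the client (Claude Desktop, Cursor, etc.) generates it from your prompt, but seeing the shape makes the tool list above concrete.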
10. Troubleshooting
MCP server not appearing in Claude Desktop
Make sure the config file is valid JSON (no trailing commas, double quotes around keys). Restart Claude Desktop after editing the config. Check that the file path is correct for your OS.
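A quick way to catch JSON syntax errors before restarting is to run the config through Python's `json` module. The sketch below validates the remote-mode config inline; to check your actual file, read it with `open()` instead of the embedded string:

```python
import json

# Paste your config here (or read the file) to verify it parses.
# A trailing comma or single quotes will raise json.JSONDecodeError.
config_text = """
{
  "mcpServers": {
    "ai-memory": {
      "url": "https://aimemory.pro/api/mcp",
      "transport": "http"
    }
  }
}
"""
config = json.loads(config_text)
assert "ai-memory" in config["mcpServers"]
print("config OK")
```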
Connection timeout
The remote endpoint at aimemory.pro/api/mcp may take a few seconds to cold-start on first request. If it persists, check your internet connection and try again.
No results from search
Make sure you've uploaded conversations to AI Memory first. The MCP server searches your uploaded data — if no conversations are imported, searches will return empty results.
Local mode: "fastmcp not found"
Install the dependency: `pip install fastmcp`. If using a virtual environment, make sure it's activated before running the server.
Cursor not showing MCP tools
Make sure the .cursor/mcp.json file is in your project root (not home directory for project-level config). Restart Cursor after adding the configuration.
Ready to give your AI a memory?
Upload your conversations first, then connect via MCP.