AI Memory MCP Server
Connect your AI assistants to your conversation history via the Model Context Protocol. Let Claude, Cursor, and other AI tools search your past conversations.
What is MCP?
The Model Context Protocol (MCP) is an open standard that lets AI assistants connect to external data sources and tools. With AI Memory's MCP Server, your AI assistant can search through your entire conversation history — across ChatGPT, Claude, DeepSeek, and Gemini.
🔍 search_memory
Full-text search across all your saved conversations with platform filtering.
📝 add_memory
Save new conversations or memory snippets to your knowledge base.
🧠 get_context
Retrieve relevant context from past conversations for a given topic.
📋 list_memories
Browse recent conversations with optional platform filtering and pagination.
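Each tool is invoked through the standard MCP `tools/call` method. As a minimal sketch, here is how a client might build those request payloads in Python — note that the argument names shown for `get_context` and `list_memories` (`topic`, `platform`, `limit`) are assumptions for illustration; the server's `tools/list` response is the authoritative schema:

```python
import json

def tools_call(tool_name, arguments, request_id=1):
    """Build a JSON-RPC 2.0 envelope for an MCP tools/call request."""
    return {
        "jsonrpc": "2.0",
        "id": request_id,
        "method": "tools/call",
        "params": {"name": tool_name, "arguments": arguments},
    }

# Hypothetical argument names -- verify against the server's tools/list output.
ctx = tools_call("get_context", {"topic": "vector databases"}, request_id=1)
recent = tools_call("list_memories", {"platform": "claude", "limit": 10}, request_id=2)

print(json.dumps(ctx, indent=2))
```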
Setup Guide: Claude Desktop
Add the following to your Claude Desktop MCP configuration file:
Config file location:
macOS: ~/Library/Application Support/Claude/claude_desktop_config.json
Windows: %APPDATA%\Claude\claude_desktop_config.json
{
  "mcpServers": {
    "ai-memory": {
      "url": "https://aimemory.pro/api/mcp",
      "transport": "http"
    }
  }
}
Restart Claude Desktop after saving the config. You can now ask Claude to search your conversation history!
Setup Guide: Cursor
In Cursor, go to Settings → MCP → Add New MCP Server and configure:
Name: AI Memory
Type: HTTP
URL: https://aimemory.pro/api/mcp
Once connected, Cursor can search your AI conversation history when writing code or answering questions.
Other MCP Clients
AI Memory works with any MCP-compatible client. Use these connection details:
Endpoint: https://aimemory.pro/api/mcp
Protocol: MCP 2024-11-05 (JSON-RPC 2.0)
Transport: Streamable HTTP (POST)
Authentication: None required (public endpoint)
Compatible with: Windsurf, Cline, Continue, Zed, and 100+ other MCP clients.
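Using those connection details, a bare-bones client needs nothing beyond an HTTP POST with a JSON-RPC body. A sketch using only the Python standard library (it assumes the endpoint returns plain JSON; a full client would follow the MCP streamable-HTTP spec, which may also stream responses):

```python
import json
import urllib.request

ENDPOINT = "https://aimemory.pro/api/mcp"

def build_request(method, params=None, request_id=1):
    """Assemble a JSON-RPC 2.0 envelope (MCP protocol revision 2024-11-05)."""
    payload = {"jsonrpc": "2.0", "id": request_id, "method": method}
    if params is not None:
        payload["params"] = params
    return payload

def mcp_call(method, params=None, request_id=1, timeout=10):
    """POST one request to the public endpoint and return the parsed reply."""
    data = json.dumps(build_request(method, params, request_id)).encode("utf-8")
    req = urllib.request.Request(
        ENDPOINT,
        data=data,
        headers={"Content-Type": "application/json"},  # no auth header needed
        method="POST",
    )
    with urllib.request.urlopen(req, timeout=timeout) as resp:
        return json.loads(resp.read().decode("utf-8"))

# Build the handshake request locally (no network needed for this part).
init = build_request("initialize")
print(json.dumps(init))

# With network access you could then run, for example:
#   mcp_call("tools/call",
#            {"name": "search_memory", "arguments": {"query": "rust", "limit": 3}},
#            request_id=2)
```

The network call is kept in a comment so the snippet runs offline; swap it in once you can reach the endpoint.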
API Reference
Initialize Connection
POST /api/mcp
Content-Type: application/json
{
  "jsonrpc": "2.0",
  "id": 1,
  "method": "initialize"
}
Search Conversations
POST /api/mcp
Content-Type: application/json
{
  "jsonrpc": "2.0",
  "id": 2,
  "method": "tools/call",
  "params": {
    "name": "search_memory",
    "arguments": {
      "query": "machine learning best practices",
      "platform": "chatgpt",
      "limit": 5
    }
  }
}
Add Memory
POST /api/mcp
Content-Type: application/json
{
  "jsonrpc": "2.0",
  "id": 3,
  "method": "tools/call",
  "params": {
    "name": "add_memory",
    "arguments": {
      "title": "Key insights on RAG architecture",
      "content": "Today I learned that...",
      "platform": "manual",
      "tags": ["rag", "architecture", "insights"]
    }
  }
}
Frequently Asked Questions
Is the MCP Server free?
Yes, the MCP Server is available on the free plan. All conversation search and retrieval features are free forever.
Is my data sent to a cloud server?
The MCP endpoint at aimemory.pro accesses conversations stored in your account. If you use the web upload feature, your data is stored server-side in an encrypted database. You can also run AI Memory locally for 100% offline usage.
Can I self-host the MCP Server?
Yes! AI Memory is open source. Clone the repository and run it locally. The MCP Server endpoint is available at /api/mcp on any deployment.
Which platforms are supported?
AI Memory supports ChatGPT, Claude, DeepSeek, and Gemini. You can import conversations from all platforms and search them through the MCP interface.
Ready to connect your AI to your memory?
Upload your conversations first, then connect via MCP.