pip install aimemory-mcp-server

Give your AI persistent memory.

An open-source MCP server that connects Claude Desktop, Cursor, ChatGPT, and 113+ AI clients to your entire conversation history. One install. Works everywhere.

113+
MCP Clients
6
Tools Included
10s
Install Time
100%
Free & Open Source

Install in 10 seconds

One command. No configuration files. No accounts.

$ pip install aimemory-mcp-server
$ aimemory-mcp-server

✓ Python 3.8+   ✓ Works on macOS, Linux, Windows   ✓ No API keys needed

6 powerful memory tools

Everything your AI assistant needs to remember, search, and manage your conversations.

🔍

search_memories

Full-text search across all your saved conversations with FTS5 syntax support. Filter by platform, date, or tags.

💾

save_memory

Save new conversations, insights, or memory snippets with automatic tagging and source attribution.

📋

list_memories

Browse your memory library with tag filtering, date ranges, and pagination. See what your AI remembers.

✏️

update_memory

Edit existing memories — update content, add tags, correct details. Keep your knowledge base accurate.

🗑️

delete_memory

Remove outdated or irrelevant memories permanently. Full control over your data.

🧠

get_context

Retrieve the most relevant context from past conversations for any topic. Perfect for continuing old threads.
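As an illustration of the FTS5 syntax that search_memories accepts, here is a minimal sketch of the kind of full-text query the server's SQLite backend can run. The table and column names below are illustrative, not the server's actual schema:

```python
import sqlite3

# In-memory database standing in for the server's local SQLite store.
# Table and column names are illustrative, not the real schema.
db = sqlite3.connect(":memory:")
db.execute("CREATE VIRTUAL TABLE memories USING fts5(platform, content)")
db.executemany(
    "INSERT INTO memories VALUES (?, ?)",
    [
        ("chatgpt", "React performance tips: memoize expensive components"),
        ("claude", "Postgres indexing strategies for large tables"),
        ("claude", "React hooks and state management patterns"),
    ],
)

# FTS5 syntax supports boolean operators and prefix matching:
# 'react AND perform*' matches rows containing "react" plus any
# token starting with "perform" (e.g. "performance").
rows = db.execute(
    "SELECT platform, content FROM memories WHERE memories MATCH ?",
    ("react AND perform*",),
).fetchall()
print(rows)  # only the first row matches both terms
```

Phrase queries (`"react performance"`), NEAR, and column filters follow the same FTS5 grammar.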

Works with your favorite AI tools

One server, every client. Same config everywhere.

🟣

Claude Desktop

Most Popular

Add to claude_desktop_config.json:

{
  "mcpServers": {
    "ai-memory": {
      "command": "aimemory-mcp-server"
    }
  }
}

Restart Claude Desktop. Ask: "Search my memory for React performance tips"

Cursor

Go to Settings → MCP → Add New MCP Server:

Name: AI Memory
Type: stdio
Command: aimemory-mcp-server
🌊

Windsurf

Add to your Windsurf MCP configuration:

{
  "mcpServers": {
    "ai-memory": {
      "command": "aimemory-mcp-server"
    }
  }
}
💙

VS Code + Cline / Continue

In your MCP settings, add:

{
  "mcpServers": {
    "ai-memory": {
      "command": "aimemory-mcp-server"
    }
  }
}

Also works with: Zed, Augment, Aider, and 100+ more MCP clients

View full documentation →

Two ways to use it

Choose local or cloud — both give your AI persistent memory.

Recommended

🏠 Local Server

Runs entirely on your machine. Your data never leaves your computer.

100% offline — no internet required
SQLite database on your disk
stdio transport — zero config
Full CRUD: search, save, update, delete
pip install aimemory-mcp-server
Cloud

☁️ Hosted Server

Use our hosted endpoint. Upload conversations via the web UI, then search from any MCP client.

No installation needed
Access from any device
HTTP transport (Streamable)
Session-based security
Endpoint: aimemory.pro/api/mcp
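Both transports speak the same JSON-RPC 2.0 protocol defined by the MCP specification. As a sketch, this is roughly the tools/call message a client sends to invoke search_memories after the initialize handshake; the "query" argument name is an assumption about this server's tool schema:

```python
import json

# A JSON-RPC 2.0 "tools/call" request, per the MCP specification.
# The "query" argument name is an assumption, not the documented schema.
request = {
    "jsonrpc": "2.0",
    "id": 1,
    "method": "tools/call",
    "params": {
        "name": "search_memories",
        "arguments": {"query": "react AND perform*"},
    },
}

# With the hosted transport, a client POSTs this body to
# aimemory.pro/api/mcp within an established session; with the
# local server, it is written to the process's stdin instead.
print(json.dumps(request, indent=2))
```

Either way, the client code is identical, which is why the same server works across every MCP client.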

Why MCP changes everything

The Model Context Protocol is the USB-C of AI — one standard, infinite connections.

🔌

Universal Standard

One server works with Claude, Cursor, ChatGPT, and 113+ clients. No custom integrations per platform.

🔗

Cross-Platform Memory

Your AI remembers conversations from ChatGPT, Claude, DeepSeek, and Gemini — all searchable from one place.

🚀

First-Mover Advantage

Cross-platform memory is still an industry gap. AI Memory is building the "SMTP of AI memory", and you can use it today.

How AI Memory MCP compares

The only MCP server with cross-platform conversation search.

Feature | AI Memory MCP | Mem0 | Custom MCP
Setup Time | 10 seconds | 30+ minutes | Hours
Cross-Platform Search | ✅ 4 platforms | ❌ Single source | ⚠️ Manual
MCP Standard | ✅ Native | ❌ REST API only | ✅ DIY
Offline Support | ✅ Full | ❌ Cloud only | ⚠️ Depends
Pricing | Free | Free tier + $24M VC | Dev time
Consumer-Friendly | ✅ Zero config | ❌ Developer-only | ❌ Code required

Frequently asked questions

What is an MCP server for AI memory?

An MCP (Model Context Protocol) server for AI memory is a tool that gives AI assistants like Claude Desktop and Cursor persistent memory by connecting them to your conversation history. AI Memory's MCP server lets any MCP-compatible AI search, save, and retrieve memories across all your AI conversations.

How do I install the AI Memory MCP server?

Install with one command: pip install aimemory-mcp-server. Then add the configuration to your MCP client (Claude Desktop, Cursor, etc.) and restart. The server runs locally on your machine with full offline support.

Which AI tools support MCP servers?

Over 113 AI clients support MCP servers, including Claude Desktop, Cursor, Windsurf, VS Code (with Cline/Continue), Zed, and many more. The Model Context Protocol is an open standard supported by the entire AI ecosystem.

Is the AI Memory MCP server free?

Yes, the AI Memory MCP server is completely free and open-source. You can install it via pip, run it locally, and connect it to any MCP client at no cost. There are no usage limits or hidden fees.

Does the MCP server work offline?

Yes, the standalone MCP server runs entirely on your local machine. Your conversation data stays in a local SQLite database — no cloud connection required. You can also use the hosted version at aimemory.pro/api/mcp for cloud-based access.

What is the difference between AI Memory MCP and Mem0?

AI Memory MCP server is a free, open-source tool focused on managing and searching existing AI conversations. Mem0 is a B2B API platform ($24M funded) for building custom memory layers in applications. AI Memory is consumer-friendly with zero setup, while Mem0 requires developer integration.

Give your AI persistent memory today

One pip install. Works with Claude, Cursor, ChatGPT, and 113+ tools.