Looking for a complete MCP server installation guide? This tutorial walks you through every step — from checking prerequisites to configuring AI Memory's MCP server with Claude Desktop, Cursor, VS Code, and Windsurf. By the end, your AI assistants will be able to search, add, and retrieve context from all your saved conversations through the Model Context Protocol.
TL;DR — Quick MCP Server Installation
- Prerequisites: Python 3.10+, pip installed
- Install: `pip install aimemory-mcp-server`
- Run: `aimemory-mcp-server`
- Configure your AI client (Claude Desktop, Cursor, VS Code, Windsurf)
- Done! Your AI can now search your conversation history
Prerequisites
Before you begin this MCP server installation guide, make sure you have the following ready:
System Requirements
- ✓ Python 3.10 or higher — The server uses modern Python features introduced in 3.10
- ✓ pip — Python's package manager (comes bundled with Python)
- ✓ An MCP-compatible client — Claude Desktop, Cursor, VS Code, Windsurf, etc.
- ✓ Operating System — macOS, Windows 10+, or Linux
Check Your Python Version
Open your terminal and run the following command to verify your Python version:
```shell
python --version
```

You should see `Python 3.10.x` or higher. If you have Python 3.9 or lower, upgrade first. On macOS with Homebrew: `brew install python@3.12`. On Ubuntu/Debian: `sudo apt install python3.12`. On Windows, download the latest installer from python.org.
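As an extra sanity check, this one-liner (a minimal sketch using Python's built-in `sys` module; substitute plain `python` if `python3` is not on your PATH) tells you directly whether the interpreter meets the 3.10 requirement:

```shell
# Prints True only if the interpreter is Python 3.10 or newer
python3 -c 'import sys; print(sys.version_info >= (3, 10))'
```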
Step 1 — Install aimemory-mcp-server
With Python ready, install the AI Memory MCP server package from PyPI with a single command:
```shell
pip install aimemory-mcp-server
```

This downloads and installs the aimemory-mcp-server package along with all its dependencies. The package provides:
- An MCP server exposing tools: `search_memory`, `add_memory`, `get_context`, and `list_memories`
- SQLite-based local storage for conversations
- Stdio transport (default) and HTTP/SSE transport for remote access
- Cross-platform support (macOS, Windows, Linux)
Verify the Installation
After installation, verify the server is available:
```shell
aimemory-mcp-server --help
```

You should see usage information printed to the terminal. If you get a "command not found" error, ensure your Python scripts directory is in your system PATH. On most systems this happens automatically.
Optional: Install in a Virtual Environment
To keep your system Python clean, you can create a virtual environment first:
```shell
python -m venv ~/.aimemory-venv
source ~/.aimemory-venv/bin/activate   # macOS/Linux
# or: ~/.aimemory-venv/Scripts/activate   # Windows
pip install aimemory-mcp-server
```

If you use a venv, note the full path to the `aimemory-mcp-server` binary inside the venv's `bin/` directory — you'll need it when configuring your MCP client.
Step 2 — Configure Claude Desktop
Claude Desktop has native MCP support. To add the AI Memory MCP server:
- Locate your Claude Desktop configuration file:
  - macOS: `~/Library/Application Support/Claude/claude_desktop_config.json`
  - Windows: `%APPDATA%\Claude\claude_desktop_config.json`
- Open the file in a text editor and add the MCP server configuration:
```json
{
  "mcpServers": {
    "ai-memory": {
      "command": "aimemory-mcp-server",
      "args": []
    }
  }
}
```

- Save the file and restart Claude Desktop completely.
- You should see the AI Memory MCP server listed in Claude's MCP servers section.
💡 Tip: If you installed in a virtual environment, replace `"aimemory-mcp-server"` with the full path to the binary, e.g., `/Users/you/.aimemory-venv/bin/aimemory-mcp-server`.
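For example, a venv-based config on macOS might look like the following (the username and venv location here are placeholders for illustration; use the path from your own install):

```json
{
  "mcpServers": {
    "ai-memory": {
      "command": "/Users/you/.aimemory-venv/bin/aimemory-mcp-server",
      "args": []
    }
  }
}
```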
Step 3 — Configure Cursor
Cursor supports MCP servers through its settings. Here's how to set it up:
- Open Cursor and go to Settings → Cursor Settings → MCP
- Click "Add new global MCP server"
- Paste the following configuration:
```json
{
  "mcpServers": {
    "ai-memory": {
      "command": "aimemory-mcp-server",
      "args": []
    }
  }
}
```

Alternatively, you can edit the MCP configuration file directly at `~/.cursor/mcp.json` (global) or `.cursor/mcp.json` (per-project). After saving, Cursor will automatically detect and connect to the MCP server.
Step 4 — Configure VS Code (with Continue or Copilot)
VS Code supports MCP through extensions like Continue and GitHub Copilot. To configure AI Memory's MCP server in VS Code:
- Install the Continue extension from the VS Code marketplace (or use Copilot Chat if you have MCP support enabled)
- Open your MCP configuration. For Continue, edit `~/.continue/config.json` or use the Continue settings UI
- Add the MCP server entry:
```json
{
  "mcpServers": [
    {
      "name": "ai-memory",
      "command": "aimemory-mcp-server",
      "args": []
    }
  ]
}
```

For GitHub Copilot, add the MCP server to your workspace `.vscode/settings.json`:
```json
{
  "github.copilot.chat.mcp.servers": {
    "ai-memory": {
      "command": "aimemory-mcp-server",
      "args": []
    }
  }
}
```

Restart VS Code after saving. The AI Memory tools will be available in your AI chat sessions.
Step 5 — Configure Windsurf
Windsurf (by Codeium) supports MCP servers natively. To add AI Memory:
- Open Windsurf and go to Settings → Cascade → MCP Servers
- Click "Add Server"
- Enter the following configuration:
```json
{
  "mcpServers": {
    "ai-memory": {
      "command": "aimemory-mcp-server",
      "args": []
    }
  }
}
```

You can also manually edit the Windsurf MCP configuration file at `~/.windsurf/mcp.json`. After saving, restart Windsurf and the server will be available in Cascade sessions.
HTTP/SSE Transport for Remote Access
By default, the MCP server uses stdio transport — it runs as a local subprocess of your AI client. This is the simplest and most secure option for individual use. However, if you need to share the MCP server across a team, run it on a remote machine, or access it from multiple devices, you can switch to HTTP/SSE transport.
Starting the Server in HTTP Mode
Set the AIMEMORY_TRANSPORT environment variable to http and start the server:
```shell
AIMEMORY_TRANSPORT=http AIMEMORY_PORT=8080 aimemory-mcp-server
```

The server will start listening on http://localhost:8080. You can change the port with the `AIMEMORY_PORT` variable.
Connecting Remote Clients
Remote MCP clients can connect via the HTTP Streamable or SSE endpoint. In your client configuration, use the url field instead of command:
```json
{
  "mcpServers": {
    "ai-memory-remote": {
      "url": "http://your-server:8080/mcp"
    }
  }
}
```

⚠️ Security Note: When exposing the MCP server over HTTP, make sure to run it behind a reverse proxy with TLS (HTTPS) and authentication. Never expose the server directly to the public internet without encryption.
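As a sketch of what "reverse proxy with TLS and authentication" can look like, here is an nginx config fragment. The hostname, certificate paths, and basic-auth file are placeholders for illustration, and this is a starting point rather than a hardened production setup:

```nginx
server {
    listen 443 ssl;
    server_name memory.example.com;

    ssl_certificate     /etc/letsencrypt/live/memory.example.com/fullchain.pem;
    ssl_certificate_key /etc/letsencrypt/live/memory.example.com/privkey.pem;

    location /mcp {
        # Minimal access control via HTTP basic auth
        auth_basic           "AI Memory MCP";
        auth_basic_user_file /etc/nginx/.htpasswd;

        # Forward to the locally running MCP server
        proxy_pass http://127.0.0.1:8080;
        proxy_http_version 1.1;
        proxy_set_header Host $host;
        # SSE/streaming responses need proxy buffering disabled
        proxy_buffering off;
    }
}
```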
Environment Variables Reference
The AI Memory MCP server is configured through environment variables. Here is the complete reference:
| Variable | Default | Description |
|---|---|---|
| AIMEMORY_DB | ~/.aimemory/aimemory.db | Path to the SQLite database file. Set a custom path to control where conversation data is stored. |
| AIMEMORY_TRANSPORT | stdio | Transport mode. Use stdio for local subprocess mode or http for HTTP/SSE remote access. |
| AIMEMORY_PORT | 8080 | Port for HTTP transport. Only used when AIMEMORY_TRANSPORT=http. |
Examples
Custom database location:
```shell
AIMEMORY_DB=/data/aimemory/mydb.db aimemory-mcp-server
```

Run as HTTP server on a custom port:

```shell
AIMEMORY_TRANSPORT=http AIMEMORY_PORT=9090 aimemory-mcp-server
```

You can also set these variables in a .env file or export them in your shell profile (~/.bashrc, ~/.zshrc, etc.) for persistent configuration.
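For the shell-profile approach, the lines you would add to ~/.bashrc or ~/.zshrc look like this (the values shown are illustrative; adjust the database path to taste):

```shell
# Persistent AI Memory configuration; add these lines to ~/.bashrc or ~/.zshrc
export AIMEMORY_DB="$HOME/.aimemory/aimemory.db"
export AIMEMORY_TRANSPORT=stdio
export AIMEMORY_PORT=8080
```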
Troubleshooting Common Issues
Encountering problems? Here are the most common issues and their solutions when following this MCP server installation guide:
❌ "aimemory-mcp-server: command not found"
This means the installed binary is not in your system PATH. Solutions:
- Check if pip's bin directory is in your PATH: `echo $PATH`
- Reinstall with `pip install --user aimemory-mcp-server` and add `~/.local/bin` to PATH
- If using a venv, use the full path: `~/.aimemory-venv/bin/aimemory-mcp-server`
- On Windows, check `%APPDATA%\Python\Python3x\Scripts\`
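If you are unsure where `pip install --user` put the script, Python's `site` module can tell you; a quick sketch (substitute plain `python` if `python3` is not on your PATH):

```shell
# Print the per-user base directory; console scripts from `pip install --user`
# land under <user-base>/bin on macOS/Linux (or Scripts\ on Windows)
python3 -m site --user-base
```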
❌ MCP server not appearing in Claude Desktop
Verify your claude_desktop_config.json is valid JSON (use jsonlint.com). Make sure you completely quit and restarted Claude Desktop — just closing the window may not be enough. Check Claude's developer console for error messages (Help → Toggle Developer Tools).
❌ "ModuleNotFoundError" or dependency errors
Upgrade pip and try again: pip install --upgrade pip then pip install --upgrade aimemory-mcp-server. If conflicts persist, use a virtual environment to isolate the installation.
❌ "Connection refused" in HTTP mode
Make sure the server is running and the port is correct. Check for firewall rules blocking the port. Verify with curl http://localhost:8080/mcp. If running on a remote server, ensure the port is open in your security group / firewall.
❌ Database locked errors
SQLite only allows one writer at a time. If multiple MCP server instances try to write to the same database simultaneously, you'll see lock errors. Solution: use a single server instance in HTTP mode and connect all clients to it remotely.
❌ JSON configuration syntax error
JSON is strict — trailing commas, missing quotes, or wrong brackets will break the config. Always validate your JSON before saving. Common pitfalls: using single quotes instead of double quotes, adding a trailing comma after the last item, or forgetting to escape backslashes in Windows paths.
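If you prefer the command line to a website, Python's built-in `json.tool` module works as an offline validator; the config snippet piped in below is just an example payload:

```shell
# Valid JSON is pretty-printed; invalid JSON produces an error message
# pointing at the offending line and column
echo '{"mcpServers": {"ai-memory": {"command": "aimemory-mcp-server", "args": []}}}' \
  | python3 -m json.tool
```

Running it against a file works too: `python3 -m json.tool path/to/claude_desktop_config.json`.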
Frequently Asked Questions
What Python version do I need for MCP server installation?
You need Python 3.10 or higher. The server uses modern Python features like match-case statements and type union syntax that were introduced in Python 3.10. Check your version with python --version.
How do I install the AI Memory MCP server?
Run pip install aimemory-mcp-server. This installs the server package and all dependencies. After installation, start it with aimemory-mcp-server and configure it in your MCP client of choice.
Can I use the MCP server with multiple AI clients simultaneously?
Yes! In stdio mode, each client launches its own server subprocess. In HTTP mode, a single server instance can serve multiple remote clients at the same time — ideal for teams.
Is the MCP server data stored locally?
Yes. By default, all conversation data is stored in a local SQLite database at ~/.aimemory/aimemory.db. You can customize the location with the AIMEMORY_DB environment variable. No data is sent to external servers.
Can I use the MCP server remotely via HTTP or SSE?
Absolutely. Set AIMEMORY_TRANSPORT=http to enable HTTP/SSE transport. The server will listen on the port specified by AIMEMORY_PORT (default: 8080). Remote clients connect via `{"url": "http://your-server:8080/mcp"}` in their configuration.
What tools does the AI Memory MCP server expose?
Four core tools: search_memory for full-text search across saved conversations, add_memory for saving new conversations or notes, get_context for retrieving relevant context snippets for a topic, and list_memories for browsing recent conversations with platform filtering and pagination.
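Under the hood, MCP clients invoke these tools with JSON-RPC 2.0 `tools/call` requests, per the Model Context Protocol spec. A sketch of what a `search_memory` call might look like on the wire (the exact argument names, such as `query` and `limit`, are assumptions here, not confirmed schema):

```json
{
  "jsonrpc": "2.0",
  "id": 1,
  "method": "tools/call",
  "params": {
    "name": "search_memory",
    "arguments": {
      "query": "vector database comparison",
      "limit": 5
    }
  }
}
```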
Why is my MCP server not showing up in Claude Desktop or Cursor?
Common reasons: 1) The configuration JSON has a syntax error — validate it with a JSON linter. 2) You did not restart the client after editing the config. 3) The binary is not in your system PATH. 4) Python version is below 3.10. 5) On Windows, use forward slashes or escaped backslashes in file paths. Check the client's MCP logs for detailed error messages.
Next Steps
Congratulations — your MCP server installation is complete! Here's what to do next:
- Import existing conversations — Use the AI Memory browser extension to export your ChatGPT, Claude, and DeepSeek conversations, then import them into the MCP server database
- Try searching — Ask your AI assistant to "search my memory" for a topic you've discussed before
- Set up remote access — Switch to HTTP transport if you want to share the server across devices or team members
- Explore more MCP servers — Browse the growing ecosystem of MCP servers for databases, APIs, file systems, and more