Windsurf IDE (formerly Codeium) has become a favorite among developers for its fast, AI-powered code completion. But one thing Windsurf lacks out of the box is persistent memory: the ability to remember context, preferences, and past solutions across conversations. In this guide, we'll show you how to add memory to Windsurf using the Model Context Protocol (MCP).

🎯 What you'll learn:

  • How to install the AI Memory MCP server
  • How to configure MCP in Windsurf IDE
  • How to use memory tools in your coding workflow
  • Tips for organizing your AI memories

Why Windsurf Needs Memory

If you've used Windsurf for more than a few days, you've probably experienced this frustration: You explain your tech stack, coding preferences, and project architecture to Windsurf. The AI gives great help. But next session? It's all forgotten. You start from zero again.

This isn't Windsurf's fault — most AI coding assistants don't have cross-conversation memory. They treat each session as a blank slate. But with MCP, you can change that.

What is MCP?

The Model Context Protocol (MCP) is an open standard that lets AI assistants connect to external tools and data sources. Think of it as a universal plugin system — one protocol that works with Claude Desktop, Cursor, Windsurf, and 113+ other AI clients.

MCP servers provide tools that AI assistants can call. For memory, we use the AI Memory MCP server — a free, open-source server that provides 7 memory tools including search, save, update, and delete.
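Under the hood, MCP tool calls travel as JSON-RPC 2.0 messages over the configured transport. A sketch of what a save_memory call might look like on the wire, per the MCP tools/call request shape (the argument names here are illustrative, not the server's actual schema):

```json
{
  "jsonrpc": "2.0",
  "id": 1,
  "method": "tools/call",
  "params": {
    "name": "save_memory",
    "arguments": {
      "content": "Prefer TypeScript with strict mode; Tailwind for styling",
      "tags": ["preferences", "typescript"]
    }
  }
}
```

You never write these messages yourself; Windsurf generates them whenever the AI decides a tool call is needed.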

Step 1: Install the AI Memory MCP Server

First, install the AI Memory MCP server on your machine. You'll need Python 3.10 or higher.

# Install via pip

pip install aimemory-mcp-server

This installs the aimemory-mcp-server command, which you'll use in Windsurf's MCP configuration.

Verify the installation:

aimemory-mcp-server --help

Step 2: Configure MCP in Windsurf

Now add the AI Memory server to Windsurf's MCP configuration:

  1. Open Windsurf IDE
  2. Go to Settings (gear icon or Cmd/Ctrl + ,)
  3. Navigate to MCP or Model Context Protocol
  4. Click Add New MCP Server

Configure the server with these settings:

Name: AI Memory

Command: aimemory-mcp-server

Transport: stdio

Env (optional):

{
  "AIMEMORY_DB": "/path/to/your/aimemory.db"
}

The AIMEMORY_DB environment variable lets you specify where memories are stored. If omitted, it defaults to ~/.aimemory/memories.db.
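If your Windsurf build manages MCP servers through a JSON config file rather than the settings UI, the equivalent entry would look something like this (the mcpServers key follows the common MCP client convention; the exact file location and schema vary by version, so treat this as a sketch):

```json
{
  "mcpServers": {
    "ai-memory": {
      "command": "aimemory-mcp-server",
      "args": [],
      "env": {
        "AIMEMORY_DB": "/path/to/your/aimemory.db"
      }
    }
  }
}
```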

Step 3: Restart Windsurf

After saving the MCP configuration, restart Windsurf to load the new server. You should see "AI Memory" in your MCP servers list.

To verify the connection, open Windsurf's MCP panel and check that AI Memory shows Connected status with 7 tools available.

Step 4: Using Memory in Windsurf

Now you can use memory tools directly in your Windsurf conversations. Here are some examples:

Save Coding Preferences

"Save this to memory: I prefer TypeScript with strict mode enabled, and I use Tailwind CSS for all styling. My testing framework is Vitest."

Windsurf will call the save_memory tool and store these preferences. Next time you start a new session, you can say:

"Search my memory for coding preferences"

Remember Project Context

When you've explained your project architecture, save it:

"Save memory: This project uses Next.js 15 with App Router, Prisma ORM with PostgreSQL, and authentication via Clerk. The main entities are User, Project, and Task."

Find Past Solutions

When you encounter a problem you've solved before:

"Search my memories for PostgreSQL connection pool issues"

7 Memory Tools Available

The AI Memory MCP server provides 7 tools that Windsurf can use:

Tool               Purpose
search_memories    Full-text search with FTS5 syntax
save_memory        Store new memories with tags
list_memories      Browse memories with filtering
get_memory         Retrieve a specific memory by ID
update_memory      Edit existing memory content
delete_memory      Remove outdated memories
memory_stats       View total count and tag distribution
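search_memories accepts FTS5 query syntax, which supports prefix matching, boolean operators, and phrases. FTS5 ships with SQLite itself, so you can get a feel for the syntax in a few lines of Python; the table layout below is illustrative, not the server's actual schema:

```python
import sqlite3

con = sqlite3.connect(":memory:")
# A toy memories table; the real server manages its own schema.
con.execute("CREATE VIRTUAL TABLE memories USING fts5(content, tags)")
con.executemany(
    "INSERT INTO memories (content, tags) VALUES (?, ?)",
    [
        ("Prefer TypeScript with strict mode; Tailwind for styling",
         "preferences typescript"),
        ("Fixed PostgreSQL connection pool exhaustion by capping pool size",
         "postgres debugging"),
    ],
)
# FTS5: 'postgres*' is a prefix match, AND combines terms.
rows = con.execute(
    "SELECT content FROM memories WHERE memories MATCH ?",
    ("postgres* AND pool",),
).fetchall()
print(rows)
```

The same query syntax works in your natural-language prompts: specific, multi-term searches narrow results far better than a single keyword.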

Best Practices for Windsurf Memory

✅ Do:

  • Use tags when saving memories (e.g., "typescript", "preferences", "project-context")
  • Be specific when searching — "React performance optimization" works better than "react"
  • Save important decisions and why you made them
  • Update memories when your preferences or tech stack changes

❌ Don't:

  • Save sensitive data like API keys or passwords
  • Save every single conversation — focus on reusable insights
  • Forget to clean up outdated memories periodically

Import Existing AI Conversations

Already have conversations in ChatGPT, Claude, or other AI tools? You can import them into AI Memory:

  1. Export conversations from ChatGPT (Settings → Data Controls → Export)
  2. Upload to aimemory.pro
  3. The web tool parses and indexes all your conversations
  4. Windsurf can now search across all imported conversations via MCP

Troubleshooting

Server not connecting?

Make sure aimemory-mcp-server is in your PATH. Run which aimemory-mcp-server to verify.

Tools not showing?

Check Windsurf's MCP logs for errors. Ensure the transport is set to "stdio" not "http".

Python version issues?

AI Memory MCP requires Python 3.10+. Check with python3 --version.
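The PATH and Python-version checks above can be scripted. A minimal sketch using only the standard library (the command name is the one installed by pip; nothing here touches the server itself):

```python
import shutil
import sys

def python_ok(min_version=(3, 10)):
    """AI Memory MCP requires Python 3.10 or higher."""
    return sys.version_info >= min_version

def server_on_path(cmd="aimemory-mcp-server"):
    """shutil.which mirrors the `which` check from the steps above."""
    return shutil.which(cmd) is not None

if not python_ok():
    print("Upgrade Python: AI Memory MCP requires 3.10+")
if not server_on_path():
    print("aimemory-mcp-server not on PATH; check where pip installed it")
```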

Summary

Adding memory to Windsurf via MCP transforms it from a stateless code assistant into an AI that remembers your context, preferences, and past solutions. The setup takes just 5 minutes:

  1. Install: pip install aimemory-mcp-server
  2. Configure: Add to Windsurf MCP settings
  3. Restart: Reload Windsurf
  4. Use: Save and search memories in conversations

Best of all, it's completely free — no cloud costs, no API keys, no subscriptions. Your memories stay on your machine.

Ready to give Windsurf memory?

Install the AI Memory MCP server and transform your coding workflow.

pip install aimemory-mcp-server
View full MCP documentation →

Ready to organize your AI conversations?

Import your ChatGPT, Claude, and DeepSeek conversations into AI Memory. Search everything instantly.

Try AI Memory Free →
