OpenClaw Memory System

A structured memory system for OpenClaw agents with semantic search, automatic evolution, and multi-container access.

Features

  • 🧠 Structured Storage - PostgreSQL-backed with rich metadata
  • 🔍 Semantic Search - Full-text search with flexible filtering
  • 📊 Analytics - Memory statistics and usage patterns
  • 🔄 Evolution - Automatic merging, archiving, and forgetting
  • 🌐 Multi-Container Access - REST API for all OpenClaw containers
  • 📝 Audit Trail - Complete mutation and access logging

Quick Start

# Create .env file
cp .env.example .env

# Edit .env with your database credentials
# nano .env

# Start the service
docker-compose up -d

# Check health
curl http://localhost:3000/health

Manual Docker Build

# Build image
docker build -t openclaw-memory:latest .

# Run container
docker run -d \
  --name openclaw-memory \
  -p 3000:3000 \
  -e DB_HOST=postgres \
  -e DB_PORT=5432 \
  -e DB_NAME=openclaw \
  -e DB_USER=postgres \
  -e DB_PASSWORD=your_password \
  openclaw-memory:latest

Local Development

# Install dependencies
npm install

# Run database migration
npm run migrate

# Start development server
npm run dev

Database Setup

The service requires PostgreSQL with the pgvector extension. If using an existing database:

CREATE EXTENSION IF NOT EXISTS vector;

Then run the migration:

npm run migrate

API Endpoints

Create Memory

curl -X POST http://localhost:3000/api/memories \
  -H "Content-Type: application/json" \
  -d '{
    "content": "The user prefers concise responses without filler words",
    "type": "preference",
    "category": "personal",
    "priority": 2,
    "tags": ["communication", "style"],
    "source_session": "session-123"
  }'

Get Memory

curl http://localhost:3000/api/memories/{id}

Update Memory

curl -X PUT http://localhost:3000/api/memories/{id} \
  -H "Content-Type: application/json" \
  -d '{
    "priority": 1,
    "tags": ["communication", "style", "important"]
  }'

Search Memories

curl "http://localhost:3000/api/memories/search?query=preference&type=preference&limit=10"
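
In agent code, the same query string can be assembled safely with URLSearchParams instead of manual concatenation. A minimal sketch; the helper name and parameter set are illustrative, mirroring the curl example above:

```typescript
// Build a search URL for the memory API; parameter names mirror the curl example.
function buildSearchUrl(
  base: string,
  params: { query: string; type?: string; limit?: number }
): string {
  const qs = new URLSearchParams();
  qs.set('query', params.query);
  if (params.type) qs.set('type', params.type);
  if (params.limit !== undefined) qs.set('limit', String(params.limit));
  return `${base}/api/memories/search?${qs.toString()}`;
}

// Usage: await fetch(buildSearchUrl('http://localhost:3000',
//   { query: 'preference', type: 'preference', limit: 10 }));
```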

Get Context for Session

curl "http://localhost:3000/api/memories/context?session=session-123&limit=5"

Get Related Memories

curl "http://localhost:3000/api/memories/{id}/related?limit=10"

Merge Memories

curl -X POST http://localhost:3000/api/memories/{id}/merge \
  -H "Content-Type: application/json" \
  -d '{
    "target_id": "target-memory-id",
    "reason": "Similar content about user preferences"
  }'

Archive Memory

curl -X POST http://localhost:3000/api/memories/{id}/archive \
  -H "Content-Type: application/json" \
  -d '{
    "reason": "Outdated information"
  }'

Get Analytics

curl http://localhost:3000/api/memories/analytics

Memory Types

Type         Description
event        Something that happened
insight      Learning or discovery
pattern      Recurring behavior or theme
preference   User preference
decision     Decision made

Memory Categories

Category     Description
work         Work-related
personal     Personal
technical    Technical
social       Social

Priority Levels

Priority     Description
1            Critical (highest)
2            High
3            Medium (default)
4            Low
5            Trivial (lowest)
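
The enumerations above can be mirrored as TypeScript types for compile-time checking in client code. A sketch; these names are illustrative and not exported by the service:

```typescript
// Valid values, mirroring the tables above.
type MemoryType = 'event' | 'insight' | 'pattern' | 'preference' | 'decision';
type MemoryCategory = 'work' | 'personal' | 'technical' | 'social';

const MEMORY_TYPES: MemoryType[] = ['event', 'insight', 'pattern', 'preference', 'decision'];

// Priority runs 1 (critical) through 5 (trivial); 3 is the default.
const DEFAULT_PRIORITY = 3;

// Runtime guards for values arriving from user input or config.
function isMemoryType(s: string): s is MemoryType {
  return (MEMORY_TYPES as readonly string[]).includes(s);
}

function isValidPriority(p: number): boolean {
  return Number.isInteger(p) && p >= 1 && p <= 5;
}
```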

Integration with OpenClaw

Environment Variables

Set these in your OpenClaw containers:

MEMORY_API_URL=http://openclaw-memory:3000
MEMORY_API_ENABLED=true
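
Containers can gate all memory calls on these variables so agents degrade gracefully when the service is absent. A sketch; the helper is hypothetical, but the env keys match the variables above:

```typescript
// Returns true only when the memory integration is switched on and a URL is set.
function memoryApiEnabled(env: Record<string, string | undefined>): boolean {
  return env.MEMORY_API_ENABLED === 'true' && Boolean(env.MEMORY_API_URL);
}

// In a container: if (memoryApiEnabled(process.env)) { /* call the memory API */ }
```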

Example: Create Memory from Agent

// In your agent code
const response = await fetch(`${process.env.MEMORY_API_URL}/api/memories`, {
    method: 'POST',
    headers: { 'Content-Type': 'application/json' },
    body: JSON.stringify({
        content: "User corrected my response about X",
        type: "insight",
        category: "work",
        priority: 2,
        source_session: session.id
    })
});

Example: Get Context Before Session

// Load context memories before starting a new session
const response = await fetch(
    `${process.env.MEMORY_API_URL}/api/memories/context?session=${session.id}&limit=5`
);
const { data } = await response.json();

// Inject into system prompt
const systemPrompt = `
## Relevant Memories
${data.map(m => `- ${m.summary || m.content}`).join('\n')}
`;

Schema

See src/db/schema.sql for the complete database schema including:

  • memories - Main memories table
  • memory_accesses - Access log
  • memory_mutations - Mutation log

Migration from Files

The memory system can import data from:

  • MEMORY.md - Long-term memory
  • memory/YYYY-MM-DD.md - Daily memory files
  • .learnings/*.md - Learning records

A migration script will be provided in future versions.
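
Until that script lands, a custom importer mainly needs to recover the date embedded in the daily memory filenames. A minimal sketch; the function is hypothetical and not part of this repo:

```typescript
// Pull the ISO date out of a daily memory path like "memory/2024-05-17.md".
// Returns null for files without a date suffix (e.g. MEMORY.md).
function parseDailyMemoryDate(path: string): string | null {
  const m = /(\d{4}-\d{2}-\d{2})\.md$/.exec(path);
  return m ? m[1] : null;
}
```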

Development

# Install dependencies
npm install

# Run in development mode
npm run dev

# Build for production
npm run build

# Run database migration
npm run migrate

License

MIT

Author

daotong
