feat: Lobster memory sync system, complete version

Features:
- File tree view
- Diff comparison
- Two-way sync (local <-> database)
- Version history tracking
- Statistics display

Core patches:
1. Chunked reads and streaming (prevents memory spikes on large files)
2. .lobsterignore mechanism (excludes temporary files)
3. Operation audit trail (audit log recording sync history)

Tech stack:
- Backend: Django + DRF + PostgreSQL
- Frontend: React + Ant Design
- Deployment: Docker + Docker Compose

The project is fully deployed and can be started directly with docker-compose up -d.
Author: 道童
Committed: 2026-04-05 12:43:24 +00:00
Commit: 4374379d3f
26 changed files with 3270 additions and 0 deletions

.lobsterignore.example (new file, 71 lines)

@@ -0,0 +1,71 @@
# Example Lobster memory ignore file
# Works like .gitignore: lists files that should not be synced
# System files
.DS_Store
.DS_Store?
._*
.Spotlight-V100
.Trashes
ehthumbs.db
Thumbs.db
# IDEs and editors
.vscode/
.idea/
*.swp
*.swo
*~
.project
.classpath
.settings/
# Python
__pycache__/
*.py[cod]
*$py.class
*.so
.Python
*.egg-info/
dist/
build/
.pytest_cache/
# Node.js
node_modules/
npm-debug.log*
yarn-debug.log*
yarn-error.log*
.pnpm-debug.log*
# Log files (adjust as needed)
*.log
logs/
*.log.*
# Temporary files
*.tmp
*.temp
*.bak
*.cache/
# Large files (optional)
*.zip
*.tar
*.tar.gz
*.rar
*.7z
# Sensitive files
.env
*.env.local
secrets/
*.pem
*.key
# Misc
.git/
.gitignore
README.md
CHANGELOG.md
LICENSE

CHANGELOG.md (new file, 376 lines)

@@ -0,0 +1,376 @@
# 🎯 Changelog for the Three "Patches"
## Date
2026-04-05
## Summary
Following 逍遥子's suggestions, three major feature patches were added to the Lobster memory sync system to improve performance, usability, and security.
---
## 📦 Patch 1: Chunked Reads and Streaming
### Problem
- If a lobster's memory files (e.g. certain logs or vector snapshots) exceed 50 MB, a single GET /api/diff request can make backend memory spike
### Solution
- **Chunked reads**: read large files in 8 KB chunks instead of loading them into memory at once
- **Streamed hashing**: compute the hash directly from the file stream, without holding the full content
- **Diff limits**: for large files, show only the first and last 500 lines and elide the middle
### Implementation
```python
# services.py
class FileScanner:
    chunk_size = 8192  # read in 8 KB chunks

    def read_file_chunked(self, file_path: Path) -> str:
        """Read a file in chunks."""
        content_parts = []
        with open(file_path, 'r', encoding='utf-8', errors='ignore') as f:
            while True:
                chunk = f.read(self.chunk_size)
                if not chunk:
                    break
                content_parts.append(chunk)
        return ''.join(content_parts)

    def read_file_stream(self, file_path: str) -> Iterator[str]:
        """Stream a file (for large-file transfers)."""
        with open(file_path, 'r', encoding='utf-8', errors='ignore') as f:
            while True:
                chunk = f.read(self.chunk_size)
                if not chunk:
                    break
                yield chunk

    def compute_hash_stream(self, file_path: Path) -> str:
        """Hash a file as a stream (avoids loading large files into memory)."""
        hash_obj = hashlib.sha256()
        with open(file_path, 'rb') as f:
            while True:
                chunk = f.read(self.chunk_size)
                if not chunk:
                    break
                hash_obj.update(chunk)
        return hash_obj.hexdigest()


class DiffChecker:
    def get_file_diff(self, local_content: str, db_content: str, max_lines: int = 1000) -> Dict:
        """Compute a file diff (with a line limit for large files)."""
        local_lines = local_content.split('\n')
        db_lines = db_content.split('\n')
        # Limit the line count (large files show only head and tail)
        if len(local_lines) > max_lines:
            local_head = local_lines[:max_lines // 2]
            local_tail = local_lines[-(max_lines // 2):]
            omitted = len(local_lines) - max_lines
            local_lines = local_head + ['... ({} lines omitted) ...'.format(omitted)] + local_tail
```
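The head/tail truncation above can be exercised in isolation (a standalone sketch; `truncate_lines` is a hypothetical helper mirroring the logic in `DiffChecker.get_file_diff`, not part of the service):

```python
def truncate_lines(lines, max_lines=1000):
    """Keep the head and tail of a long line list, eliding the middle."""
    if len(lines) <= max_lines:
        return lines, False
    half = max_lines // 2
    omitted = len(lines) - max_lines
    marker = '... ({} lines omitted) ...'.format(omitted)
    return lines[:half] + [marker] + lines[-half:], True

# A 2500-line "file" collapses to 500 head lines + marker + 500 tail lines.
lines = [str(i) for i in range(2500)]
kept, truncated = truncate_lines(lines, max_lines=1000)
```

Computing the truncation flag before replacing the list (as here) keeps the marker line itself from being counted as content.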
### API Updates
```http
# Request a chunked diff for a large file
GET /api/diff/?lobster_id=daotong&file_path=large-file.log&chunked=true
```
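That the streamed hash matches a whole-content hash can be checked locally (a self-contained sketch; the temp file and sizes are throwaway test data, not project paths):

```python
import hashlib
import tempfile

def hash_stream(path, chunk_size=8192):
    """Hash a file in 8 KB chunks, never holding the whole file in memory."""
    h = hashlib.sha256()
    with open(path, 'rb') as f:
        while True:
            chunk = f.read(chunk_size)
            if not chunk:
                break
            h.update(chunk)
    return h.hexdigest()

# Write ~100 KB so the read loop spans many chunks.
with tempfile.NamedTemporaryFile(delete=False) as f:
    f.write(b'x' * 100_000)
    path = f.name

assert hash_stream(path) == hashlib.sha256(b'x' * 100_000).hexdigest()
```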
---
## 📦 Patch 2: The .lobsterignore Mechanism
### Problem
- Temporary files (such as .DS_Store and log caches) do not belong in the database
- A hand-maintained exclude list keeps things cleaner
### Solution
- Add a `.lobsterignore` file (analogous to `.gitignore`)
- Skip matching files automatically during scans
- Ship a set of default ignore rules
### Implementation
```python
# services.py
class IgnorePattern:
    """Pattern matcher for .lobsterignore."""

    def __init__(self, base_dir: Path):
        self.base_dir = base_dir
        self.patterns = []
        self.load_patterns()

    def load_patterns(self):
        """Load the .lobsterignore file."""
        ignore_file = self.base_dir / '.lobsterignore'
        if ignore_file.exists():
            with open(ignore_file, 'r', encoding='utf-8') as f:
                for line in f:
                    line = line.strip()
                    # Skip blank lines and comments
                    if line and not line.startswith('#'):
                        self.patterns.append(line)
        # Add the default ignore rules
        default_patterns = [
            '.DS_Store', '.git', '.gitignore', '__pycache__',
            'node_modules', '*.pyc', '*.pyo', '*.log',
            '*.tmp', '*.temp', '*.bak', '.vscode', '.idea'
        ]
        for pattern in default_patterns:
            if pattern not in self.patterns:
                self.patterns.append(pattern)

    def is_ignored(self, file_path: Path) -> bool:
        """Return whether a file is ignored."""
        relative_path = file_path.relative_to(self.base_dir)
        for pattern in self.patterns:
            # Match the file name
            if fnmatch.fnmatch(file_path.name, pattern):
                return True
            # Match the relative path
            if fnmatch.fnmatch(str(relative_path), pattern):
                return True
            # Match directory patterns (trailing slash)
            if pattern.endswith('/') and fnmatch.fnmatch(str(relative_path.parent), pattern.rstrip('/')):
                return True
            # Match '*/'-prefixed patterns against each path component
            if pattern.startswith('*/'):
                for part in str(relative_path).split(os.sep):
                    if fnmatch.fnmatch(part, pattern[2:]):
                        return True
        return False
```
### Example File
```bash
# .lobsterignore
# System files
.DS_Store
.Thumbs.db
# IDEs and editors
.vscode/
.idea/
*.swp
# Python
__pycache__/
*.pyc
*.log
# Node.js
node_modules/
# Temporary files
*.tmp
*.bak
```
### API Updates
```http
# List the active ignore patterns
GET /api/ignore/patterns/
# Reload .lobsterignore from disk
POST /api/ignore/reload/
```
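Since matching is plain `fnmatch`, the semantics can be sanity-checked without the full class (a minimal sketch; this `is_ignored` is a simplified stand-in for the method above, with a shortened pattern list):

```python
import fnmatch
from pathlib import PurePosixPath

PATTERNS = ['.DS_Store', '__pycache__', '*.log', 'node_modules', '*.tmp']

def is_ignored(rel_path: str) -> bool:
    """Return True if any pattern matches the file name or a path component."""
    p = PurePosixPath(rel_path)
    for pattern in PATTERNS:
        if fnmatch.fnmatch(p.name, pattern):
            return True
        if any(fnmatch.fnmatch(part, pattern) for part in p.parts):
            return True
    return False
```

Checking every path component is what makes a bare `__pycache__` pattern exclude files nested anywhere under such a directory.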
---
## 📦 Patch 3: Operation Audit Trail (Audit Log)
### Problem
- If something is clicked by mistake one day, there is no way to trace which operation caused it
- Operation history is needed so problems can be traced back
### Solution
- Add a `SyncHistory` model
- Record the details of every sync operation
- Expose a history-query API
### Implementation
```python
# models.py
class SyncHistory(models.Model):
    """Record of a sync operation."""
    ACTION_CHOICES = [
        ('sync_to_db', 'Sync to database'),
        ('sync_to_local', 'Sync to local'),
        ('auto_sync', 'Auto sync'),
        ('manual_merge', 'Manual merge'),
    ]
    STATUS_CHOICES = [
        ('success', 'Success'),
        ('failed', 'Failed'),
        ('partial', 'Partially succeeded'),
    ]
    lobster_id = models.CharField(max_length=50, help_text='Lobster ID')
    file_path = models.CharField(max_length=500, help_text='File path relative to the memory root')
    action = models.CharField(max_length=20, choices=ACTION_CHOICES, help_text='Action type')
    status = models.CharField(max_length=20, choices=STATUS_CHOICES, help_text='Action status')
    old_version = models.IntegerField(null=True, blank=True, help_text='Version before the action')
    new_version = models.IntegerField(null=True, blank=True, help_text='Version after the action')
    old_hash = models.CharField(max_length=64, null=True, blank=True, help_text='Hash before the action')
    new_hash = models.CharField(max_length=64, null=True, blank=True, help_text='Hash after the action')
    file_size = models.IntegerField(default=0, help_text='File size in bytes')
    operator = models.CharField(max_length=50, default='system', help_text='Operator')
    error_message = models.TextField(null=True, blank=True, help_text='Error message')
    execution_time = models.FloatField(default=0, help_text='Execution time in seconds')
    created_at = models.DateTimeField(auto_now_add=True, help_text='Timestamp of the action')


# services.py
class AuditLogger:
    """Audit log writer."""

    def log_sync_action(
        self,
        lobster_id: str,
        file_path: str,
        action: str,
        old_version: int = None,
        new_version: int = None,
        old_hash: str = None,
        new_hash: str = None,
        file_size: int = 0,
        operator: str = 'system',
        status: str = 'success',
        error_message: str = None,
        execution_time: float = 0
    ):
        """Record a sync operation."""
        self.model.objects.create(...)

    def get_history(
        self,
        lobster_id: str = None,
        file_path: str = None,
        action: str = None,
        limit: int = 100
    ) -> List[Dict]:
        """Fetch operation history."""
        queryset = self.model.objects.all()
        # filter and order...
```
### Usage Example
```python
# views.py
@api_view(['POST'])
def sync_to_db(request):
    """Sync to the database (with audit logging)."""
    audit_logger = AuditLogger()
    start_time = time.time()
    try:
        # Perform the sync...
        execution_time = time.time() - start_time
        # Log success
        audit_logger.log_sync_action(
            lobster_id=lobster_id,
            file_path=file_path,
            action='sync_to_db',
            old_version=old_version,
            new_version=new_version,
            old_hash=old_hash,
            new_hash=file_hash,
            file_size=record.size,
            operator=operator,
            status='success',
            execution_time=execution_time
        )
    except Exception as e:
        # Log failure (compute the elapsed time here too, so it is always defined)
        audit_logger.log_sync_action(
            lobster_id=lobster_id,
            file_path=file_path,
            action='sync_to_db',
            operator=operator,
            status='failed',
            error_message=str(e),
            execution_time=time.time() - start_time
        )
```
### API Updates
```http
# Query the sync history
GET /api/history/?lobster_id=daotong&file_path=MEMORY.md&limit=50
```
### History Record Example
```json
{
  "success": true,
  "data": [
    {
      "id": 1,
      "lobster_id": "daotong",
      "file_path": "MEMORY.md",
      "action": "sync_to_db",
      "action_display": "Sync to database",
      "status": "success",
      "status_display": "Success",
      "old_version": 1,
      "new_version": 2,
      "old_hash": "abc123...",
      "new_hash": "def456...",
      "file_size": 1234,
      "operator": "逍遥子",
      "error_message": null,
      "execution_time": 0.123,
      "created_at": "2026-04-05T12:00:00Z"
    }
  ]
}
```
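The try/except timing pattern in the view can be factored into a small helper so the elapsed time is always defined, even on failure (a sketch with a stand-in logger callable; `run_logged` is hypothetical, not part of the AuditLogger API):

```python
import time

def run_logged(action_fn, log_fn, **context):
    """Run action_fn, then log success or failure with the elapsed time."""
    start = time.time()
    try:
        result = action_fn()
        log_fn(status='success', execution_time=time.time() - start, **context)
        return result
    except Exception as e:
        # The timer is read again here, so execution_time exists on failure too.
        log_fn(status='failed', error_message=str(e),
               execution_time=time.time() - start, **context)
        raise

# Stand-in logger: collect the would-be SyncHistory rows in a list.
records = []
run_logged(lambda: 42, lambda **kw: records.append(kw), action='sync_to_db')
```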
---
## 📋 Database Migration
Run a migration to create the `SyncHistory` table:
```bash
# Enter the backend container
docker exec -it lobster-backend bash
# Create and apply migrations
python manage.py makemigrations memory_app
python manage.py migrate
```
---
## ✅ Completion Checklist
- [x] Chunked reads and streaming (services.py)
- [x] .lobsterignore mechanism (services.py + .lobsterignore.example)
- [x] Operation audit trail (models.py + services.py + views.py + serializers.py)
- [x] New API endpoints (urls.py)
- [x] Documentation updates (CHANGELOG.md)
---
## 🚀 Next Steps
1. Run the database migration
2. Push the code to the remote repository
3. Update the frontend UI (add a history view and ignore-rule management)
---
**Thanks to 逍遥子 for the valuable suggestions!** 🙏

DEPLOY.md (new file, 107 lines)

@@ -0,0 +1,107 @@
# Lobster Memory Sync
## Deployment Guide
### Prerequisites
- Docker
- Docker Compose
### Quick Start
1. **Enter the project directory**
```bash
cd /home/node/.openclaw/workspace/daotong/lobster-memory-sync
```
2. **Start the services**
```bash
docker-compose up -d
```
3. **Open the application**
- Frontend: http://localhost:8086
- Backend API: http://localhost:8087/api/
- PostgreSQL: localhost:5432
### Development Mode
#### Backend Development
```bash
# Enter the backend container
docker exec -it lobster-backend bash
# Create and apply migrations
python manage.py makemigrations memory_app
python manage.py migrate
# Create a superuser
python manage.py createsuperuser
```
#### Frontend Development
```bash
# Local development (without Docker)
cd frontend
npm install
npm start
```
### API Reference
#### Scan Files
```
GET /api/scan/?lobster_id=daotong
```
#### Check Sync Status
```
GET /api/status/?lobster_id=daotong
```
#### Get File Diff
```
GET /api/diff/?lobster_id=daotong&file_path=MEMORY.md
```
#### Sync to Database
```
POST /api/sync/db/
{
  "lobster_id": "daotong",
  "file_path": "MEMORY.md"
}
```
#### Sync to Local
```
POST /api/sync/local/
{
  "lobster_id": "daotong",
  "file_path": "MEMORY.md"
}
```
### Troubleshooting
#### View Logs
```bash
docker-compose logs -f
```
#### Restart Services
```bash
docker-compose restart
```
#### Wipe Data
```bash
docker-compose down -v
docker-compose up -d
```
### Tech Stack
- **Backend**: Django + Django REST Framework + PostgreSQL
- **Frontend**: React + Ant Design + react-diff-viewer-continued
- **Deployment**: Docker + Docker Compose

README.md (new file, 707 lines)

@@ -0,0 +1,707 @@
# 🦐 Lobster Memory Sync System
A front-end/back-end separated system for syncing and managing lobster memory files, providing a file-tree view, diff comparison, and two-way sync.
## 📋 Table of Contents
- [Overview](#overview)
- [Tech Stack](#tech-stack)
- [Features](#features)
- [Project Structure](#project-structure)
- [Quick Start](#quick-start)
- [API Documentation](#api-documentation)
- [Development Guide](#development-guide)
- [Deployment Guide](#-deployment-guide)
- [Development Log](#development-log)
## Overview
The Lobster Memory Sync System is a memory-file management tool built for OpenClaw lobsters. It supports:
- Scanning a lobster's memory directory
- Checking file differences
- Two-way sync (local ↔ database)
- Version history tracking
- Statistics display
## Tech Stack
### Backend
- Django 4.x
- Django REST Framework
- PostgreSQL 15
- Python 3.11
### Frontend
- React 18
- Ant Design 5.x
- react-diff-viewer-continued
- Axios
### Deployment
- Docker
- Docker Compose
- Nginx
## Features
- ✅ **File tree view**: visualize the structure of a lobster's memory files
- ✅ **Diff comparison**: compare local and database copies side by side
- ✅ **Two-way sync**: sync local → database and database → local
- ✅ **Version history**: track the modification history of each file
- ✅ **Statistics**: show file counts, sizes, and other stats
- ✅ **REST API**: a complete RESTful API
## Project Structure
```
lobster-memory-sync/
├── backend/                 # Django backend
│   ├── manage.py            # Django management script
│   ├── requirements.txt     # Python dependencies
│   ├── Dockerfile           # Backend Docker config
│   ├── memory_sync/         # Django project config
│   │   ├── settings.py      # Project settings
│   │   ├── urls.py          # Root URL routes
│   │   └── wsgi.py          # WSGI config
│   └── memory_app/          # Core app
│       ├── models.py        # Data models
│       ├── serializers.py   # Serializers
│       ├── views.py         # Views
│       ├── urls.py          # App URL routes
│       └── services.py      # Business logic
├── frontend/                # React frontend
│   ├── package.json         # Node dependencies
│   ├── Dockerfile           # Frontend Docker config
│   ├── public/              # Static assets
│   └── src/                 # Source code
│       ├── api/             # API client
│       │   └── index.js
│       ├── components/      # React components
│       │   ├── FileTree.js  # File tree
│       │   └── FileDiff.js  # Diff view
│       ├── App.js           # Main app
│       └── index.js         # Entry point
├── docker-compose.yml       # Docker Compose config
├── README.md                # Project documentation
└── DEPLOY.md                # Deployment documentation
```
## Quick Start
### Prerequisites
- Docker
- Docker Compose
- Check for port conflicts: 8086 (frontend), 8087 (backend), 5432 (database)
### One-Command Startup
```bash
# Enter the project directory
cd /home/node/.openclaw/workspace/daotong/lobster-memory-sync
# Start the services
docker-compose up -d
# Tail the logs
docker-compose logs -f
# Stop the services
docker-compose down
```
### Access URLs
- Frontend: http://localhost:8086
- Backend API: http://localhost:8087/api/
- PostgreSQL: localhost:5432
## API Documentation
### Scan Files
```
GET /api/scan/?lobster_id=daotong
```
**Example response:**
```json
{
  "files": [
    {
      "name": "MEMORY.md",
      "path": "MEMORY.md",
      "type": "file",
      "size": 1234,
      "last_modified": "2026-04-05T12:00:00Z"
    }
  ]
}
```
### Check Sync Status
```
GET /api/status/?lobster_id=daotong&file_path=MEMORY.md
```
**Example response:**
```json
{
  "synced": false,
  "has_difference": true,
  "difference": "+ added line\n- removed line"
}
```
### Get File Diff
```
GET /api/diff/?lobster_id=daotong&file_path=MEMORY.md
```
### Sync to Database
```
POST /api/sync/db/
Content-Type: application/json
{
  "lobster_id": "daotong",
  "file_path": "MEMORY.md"
}
```
### Sync to Local
```
POST /api/sync/local/
Content-Type: application/json
{
  "lobster_id": "daotong",
  "file_path": "MEMORY.md"
}
```
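For orientation, the per-file statuses surfaced by these endpoints can be derived from two scan results as follows (a client-side sketch; `classify` is a hypothetical helper that mirrors the status values in the API responses, not part of the API itself):

```python
def classify(local_files, db_files):
    """Map file paths to a sync status by comparing hashes from both sides."""
    local = {f['file_path']: f['hash'] for f in local_files}
    db = {f['file_path']: f['hash'] for f in db_files}
    statuses = {}
    for path in local.keys() | db.keys():
        if path in local and path in db:
            statuses[path] = 'consistent' if local[path] == db[path] else 'conflict'
        elif path in local:
            statuses[path] = 'local_only'
        else:
            statuses[path] = 'db_only'
    return statuses

# Example: one matching file, one local-only file, one database-only file.
statuses = classify(
    [{'file_path': 'MEMORY.md', 'hash': 'aaa'}, {'file_path': 'notes.md', 'hash': 'bbb'}],
    [{'file_path': 'MEMORY.md', 'hash': 'aaa'}, {'file_path': 'old.md', 'hash': 'ccc'}],
)
```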
## Development Guide
### Backend Development
```bash
# Enter the backend container
docker exec -it lobster-backend bash
# Create and apply migrations
python manage.py makemigrations memory_app
python manage.py migrate
# Create a superuser
python manage.py createsuperuser
# Run the development server
python manage.py runserver 0.0.0.0:8087
```
### Frontend Development
```bash
# Local development (without Docker)
cd frontend
npm install
npm start
# Build for production
npm run build
```
## 🚀 Deployment Guide
### System Requirements
- **OS**: Linux / macOS / Windows (WSL2)
- **Docker**: 20.10 or later
- **Docker Compose**: 2.0 or later
- **Memory**: at least 2 GB RAM
- **Disk**: at least 5 GB free
- **Ports**: 8086 (frontend), 8087 (backend), 5432 (database)
### Environment Setup
#### 1. Install Docker
**Ubuntu / Debian:**
```bash
# Update the package index
sudo apt-get update
# Install prerequisites
sudo apt-get install -y ca-certificates curl gnupg lsb-release
# Add Docker's official GPG key
sudo mkdir -p /etc/apt/keyrings
curl -fsSL https://download.docker.com/linux/ubuntu/gpg | sudo gpg --dearmor -o /etc/apt/keyrings/docker.gpg
# Add the Docker repository
echo "deb [arch=$(dpkg --print-architecture) signed-by=/etc/apt/keyrings/docker.gpg] https://download.docker.com/linux/ubuntu $(lsb_release -cs) stable" | sudo tee /etc/apt/sources.list.d/docker.list > /dev/null
# Install Docker
sudo apt-get update
sudo apt-get install -y docker-ce docker-ce-cli containerd.io docker-compose-plugin
# Verify the installation
docker --version
docker compose version
```
**CentOS / RHEL:**
```bash
# Install prerequisites
sudo yum install -y yum-utils device-mapper-persistent-data lvm2
# Add the Docker repository
sudo yum-config-manager --add-repo https://download.docker.com/linux/centos/docker-ce.repo
# Install Docker
sudo yum install -y docker-ce docker-ce-cli containerd.io docker-compose-plugin
# Start Docker
sudo systemctl start docker
sudo systemctl enable docker
```
**macOS:**
```bash
# Install via Homebrew
brew install --cask docker
# Launch Docker Desktop
open -a Docker
```
#### 2. Configure the Docker Group (Optional)
```bash
# Add the current user to the docker group
sudo usermod -aG docker $USER
# Log in again, or run
newgrp docker
# Verify
docker ps
```
### Installation
#### 1. Clone the Project
```bash
# Clone the repository
git clone https://xjp.datalibstar.com/daotong/lobster-memory-sync.git
cd lobster-memory-sync
```
#### 2. Configure Environment Variables
Create a `.env` file (optional, to override the defaults):
```bash
# Database configuration
DB_NAME=lobster_memory
DB_USER=postgres
DB_PASSWORD=your_secure_password
# Lobster memory directory path
LOBSTER_MEMORY_BASE=/path/to/lobster/memory
# Frontend configuration
REACT_APP_API_URL=http://localhost:8087/api
# Port configuration
FRONTEND_PORT=8086
BACKEND_PORT=8087
POSTGRES_PORT=5432
```
#### 3. Adjust docker-compose.yml
Edit the following settings for your environment:
```yaml
services:
  backend:
    volumes:
      # Mount the lobster memory directory (read-only)
      - /home/node/.openclaw/workspace/daotong:/app/memory_files:ro
```
**Notes:**
- Replace `/home/node/.openclaw/workspace/daotong` with the actual lobster memory directory path
- Mount with `:ro` (read-only) for safety
#### 4. Build and Start the Services
```bash
# Build the images
docker-compose build
# Start all services in the background
docker-compose up -d
# Check service status
docker-compose ps
# Tail the logs
docker-compose logs -f
```
#### 5. Initialize the Database
```bash
# Wait for the database to come up
sleep 10
# Run database migrations
docker-compose exec backend python manage.py migrate
# Create a superuser (optional)
docker-compose exec backend python manage.py createsuperuser
```
### Verifying the Deployment
#### 1. Check Service Status
```bash
# Show all container statuses
docker-compose ps
# Expected output:
# NAME                 STATUS
# lobster-postgres     Up
# lobster-backend      Up
# lobster-frontend     Up
```
#### 2. Test the Backend API
```bash
# API health check
curl http://localhost:8087/api/
# Test the file scan
curl "http://localhost:8087/api/scan/?lobster_id=daotong"
```
#### 3. Open the Frontend
In a browser, visit:
- http://localhost:8086
**Expected result:**
- The file tree is displayed
- Clicking a file shows its diff
- Sync operations can be performed
### Production Configuration
#### 1. Use an Environment File
Create `.env.production`:
```bash
# Production configuration
DB_NAME=lobster_memory_prod
DB_USER=postgres
DB_PASSWORD=<strong password>
DB_HOST=postgres
# Lobster memory directory
LOBSTER_MEMORY_BASE=/var/lib/lobster/memory
# Frontend API URL
REACT_APP_API_URL=https://api.yourdomain.com/api
# Port configuration
FRONTEND_PORT=8086
BACKEND_PORT=8087
POSTGRES_PORT=5432
```
#### 2. Configure an Nginx Reverse Proxy
Create `nginx.conf`:
```nginx
upstream backend {
    server localhost:8087;
}
upstream frontend {
    server localhost:8086;
}
server {
    listen 80;
    server_name yourdomain.com;
    # Redirect to HTTPS
    return 301 https://$server_name$request_uri;
}
server {
    listen 443 ssl http2;
    server_name yourdomain.com;
    # SSL certificate configuration
    ssl_certificate /etc/letsencrypt/live/yourdomain.com/fullchain.pem;
    ssl_certificate_key /etc/letsencrypt/live/yourdomain.com/privkey.pem;
    ssl_protocols TLSv1.2 TLSv1.3;
    ssl_ciphers HIGH:!aNULL:!MD5;
    # Frontend static assets
    location / {
        proxy_pass http://frontend;
        proxy_set_header Host $host;
        proxy_set_header X-Real-IP $remote_addr;
    }
    # Backend API
    location /api/ {
        proxy_pass http://backend/api/;
        proxy_set_header Host $host;
        proxy_set_header X-Real-IP $remote_addr;
        proxy_set_header X-Forwarded-For $proxy_add_x_forwarded_for;
        proxy_set_header X-Forwarded-Proto $scheme;
    }
}
```
#### 3. Enable HTTPS
Use Let's Encrypt for a free SSL certificate:
```bash
# Install certbot
sudo apt-get install certbot python3-certbot-nginx
# Obtain a certificate
sudo certbot --nginx -d yourdomain.com
# Test automatic renewal
sudo certbot renew --dry-run
```
#### 4. Configure Database Backups
Create a backup script, `backup.sh`:
```bash
#!/bin/bash
BACKUP_DIR="/var/backups/lobster-memory"
DATE=$(date +%Y%m%d_%H%M%S)
BACKUP_FILE="$BACKUP_DIR/backup_$DATE.sql"
# Create the backup directory
mkdir -p "$BACKUP_DIR"
# Run the backup
docker-compose exec -T postgres pg_dump -U postgres lobster_memory > "$BACKUP_FILE"
# Compress the backup
gzip "$BACKUP_FILE"
# Delete backups older than 7 days
find "$BACKUP_DIR" -name "backup_*.sql.gz" -mtime +7 -delete
echo "Backup completed: ${BACKUP_FILE}.gz"
```
Add a cron job:
```bash
# Edit the crontab
crontab -e
# Run the backup every day at 02:00
0 2 * * * /path/to/backup.sh
```
### Updating a Deployment
#### 1. Pull the Latest Code
```bash
git pull origin master
```
#### 2. Rebuild the Images
```bash
docker-compose build
```
#### 3. Restart the Services
```bash
docker-compose up -d
```
#### 4. Run Database Migrations (If Any)
```bash
docker-compose exec backend python manage.py migrate
```
### Monitoring and Maintenance
#### Viewing Service Logs
```bash
# All services
docker-compose logs -f
# A specific service
docker-compose logs -f backend
docker-compose logs -f frontend
docker-compose logs -f postgres
# The last 100 lines
docker-compose logs --tail=100 backend
```
#### Checking Resource Usage
```bash
# Container resource usage
docker stats
# Disk usage
docker system df
# Clean up unused resources
docker system prune -a
```
#### Database Maintenance
```bash
# Open a psql shell in the database container
docker-compose exec postgres psql -U postgres -d lobster_memory
# Back up the database
docker-compose exec postgres pg_dump -U postgres lobster_memory > backup.sql
# Restore the database
docker-compose exec -T postgres psql -U postgres lobster_memory < backup.sql
```
### Troubleshooting
#### Problem 1: A Container Fails to Start
```bash
# Check the container logs
docker-compose logs backend
# Check for port conflicts
sudo netstat -tulpn | grep -E '8086|8087|5432'
# Rebuild the images from scratch
docker-compose build --no-cache
docker-compose up -d
```
#### Problem 2: Database Connection Fails
```bash
# Check the database container status
docker-compose ps postgres
# Check the database logs
docker-compose logs postgres
# Test the database connection
docker-compose exec postgres psql -U postgres -d lobster_memory -c "SELECT version();"
```
#### Problem 3: The Frontend Cannot Reach the Backend API
```bash
# Check the backend service
curl http://localhost:8087/api/
# Check the frontend logs
docker-compose logs frontend
# Verify the environment variable
docker-compose exec frontend env | grep REACT_APP_API_URL
```
#### Problem 4: File Scanning Fails
```bash
# Check the memory directory mount
docker-compose exec backend ls -la /app/memory_files
# Check directory permissions
ls -ld /home/node/.openclaw/workspace/daotong
# Remount by recreating the containers
docker-compose down
docker-compose up -d
```
### Uninstalling
```bash
# Stop and remove the containers
docker-compose down
# Also remove the data volumes
docker-compose down -v
# Remove the images
docker rmi lobster-memory-sync-backend lobster-memory-sync-frontend
# Remove the project directory
cd ..
rm -rf lobster-memory-sync
```
### FAQ
**Q: How do I change the default ports?**
A: Edit the port mappings in `docker-compose.yml`, for example:
```yaml
frontend:
  ports:
    - "9086:80"  # change 8086 to 9086
```
**Q: How do I use an external database?**
A: Edit the `backend` service in `docker-compose.yml`: remove the `postgres` service and set the `DB_HOST` environment variable.
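For example (a hypothetical override; the host name and credentials are placeholders, and the variable names assume the backend reads the `DB_*` settings shown earlier):

```yaml
services:
  backend:
    environment:
      - DB_HOST=db.internal.example.com   # external PostgreSQL host (placeholder)
      - DB_PORT=5432
      - DB_NAME=lobster_memory
      - DB_USER=postgres
      - DB_PASSWORD=your_secure_password
# the bundled postgres service and its volume are removed in this setup
```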
**Q: How do I expand storage?**
A: Adjust the `postgres_data` volume in `docker-compose.yml`, or use an external storage volume.
**Q: How do I deploy multiple instances?**
A: Use Docker Swarm or Kubernetes for a clustered deployment, with a load balancer distributing requests.
## Development Log
- **2026-04-05**: Project bootstrapped
  - Backend core complete (Django + DRF + PostgreSQL)
  - Frontend core complete (React + Ant Design)
  - Deployment config complete (Docker Compose)
  - Pushed to the Git repository: https://xjp.datalibstar.com/daotong/lobster-memory-sync.git
## 📝 License
MIT
## 🤝 Contributing
Issues and pull requests are welcome!

backend/Dockerfile (new file, 17 lines)

@@ -0,0 +1,17 @@
FROM python:3.11-slim
WORKDIR /app
# Install dependencies
COPY requirements.txt .
RUN pip install --no-cache-dir -r requirements.txt
# Copy the source
COPY . .
# Collect static files
RUN python manage.py collectstatic --noinput
EXPOSE 8087
CMD ["python", "manage.py", "runserver", "0.0.0.0:8087"]

backend/manage.py (new file, 23 lines)

@@ -0,0 +1,23 @@
#!/usr/bin/env python
"""Django's command-line utility for administrative tasks."""
import os
import sys


def main():
    """Run administrative tasks."""
    os.environ.setdefault('DJANGO_SETTINGS_MODULE', 'memory_sync.settings')
    try:
        from django.core.management import execute_from_command_line
    except ImportError as exc:
        raise ImportError(
            "Couldn't import Django. Are you sure it's installed and "
            "available on your PYTHONPATH environment variable? Did you "
            "forget to activate a virtual environment?"
        ) from exc
    execute_from_command_line(sys.argv)


if __name__ == '__main__':
    main()


@@ -0,0 +1,125 @@
from django.db import models
import hashlib


class LobsterMemory(models.Model):
    """Lobster memory file model."""
    STATUS_CHOICES = [
        ('consistent', 'Consistent'),
        ('local_newer', 'Local is newer'),
        ('db_newer', 'Database is newer'),
        ('conflict', 'Conflict'),
    ]
    lobster_id = models.CharField(max_length=50, help_text='Lobster ID')
    file_path = models.CharField(max_length=500, help_text='File path relative to the memory root')
    content = models.TextField(help_text='File content')
    hash = models.CharField(max_length=64, help_text='SHA-256 hash')
    status = models.CharField(
        max_length=20,
        choices=STATUS_CHOICES,
        default='consistent',
        help_text='Sync status'
    )
    version = models.IntegerField(default=1, help_text='Version number')
    size = models.IntegerField(default=0, help_text='File size in bytes')
    created_at = models.DateTimeField(auto_now_add=True, help_text='Created at')
    updated_at = models.DateTimeField(auto_now=True, help_text='Updated at')

    class Meta:
        db_table = 'lobster_memory'
        unique_together = ('lobster_id', 'file_path', 'version')
        ordering = ['-updated_at']
        indexes = [
            models.Index(fields=['lobster_id', 'file_path']),
            models.Index(fields=['status']),
            models.Index(fields=['updated_at']),
        ]

    def __str__(self):
        return f"{self.lobster_id}/{self.file_path} (v{self.version})"

    def compute_hash(self, content):
        """Compute the SHA-256 hash of the content."""
        return hashlib.sha256(content.encode('utf-8')).hexdigest()

    def save(self, *args, **kwargs):
        """Recompute the hash and size automatically on save."""
        if self.content:
            self.hash = self.compute_hash(self.content)
            self.size = len(self.content.encode('utf-8'))
        super().save(*args, **kwargs)


class SyncHistory(models.Model):
    """Record of a sync operation."""
    ACTION_CHOICES = [
        ('sync_to_db', 'Sync to database'),
        ('sync_to_local', 'Sync to local'),
        ('auto_sync', 'Auto sync'),
        ('manual_merge', 'Manual merge'),
    ]
    STATUS_CHOICES = [
        ('success', 'Success'),
        ('failed', 'Failed'),
        ('partial', 'Partially succeeded'),
    ]
    lobster_id = models.CharField(max_length=50, help_text='Lobster ID')
    file_path = models.CharField(max_length=500, help_text='File path relative to the memory root')
    action = models.CharField(
        max_length=20,
        choices=ACTION_CHOICES,
        help_text='Action type'
    )
    status = models.CharField(
        max_length=20,
        choices=STATUS_CHOICES,
        help_text='Action status'
    )
    old_version = models.IntegerField(null=True, blank=True, help_text='Version before the action')
    new_version = models.IntegerField(null=True, blank=True, help_text='Version after the action')
    old_hash = models.CharField(max_length=64, null=True, blank=True, help_text='Hash before the action')
    new_hash = models.CharField(max_length=64, null=True, blank=True, help_text='Hash after the action')
    file_size = models.IntegerField(default=0, help_text='File size in bytes')
    operator = models.CharField(max_length=50, default='system', help_text='Operator')
    error_message = models.TextField(null=True, blank=True, help_text='Error message')
    execution_time = models.FloatField(default=0, help_text='Execution time in seconds')
    created_at = models.DateTimeField(auto_now_add=True, help_text='Timestamp of the action')

    class Meta:
        db_table = 'sync_history'
        ordering = ['-created_at']
        indexes = [
            models.Index(fields=['lobster_id', 'file_path']),
            models.Index(fields=['action']),
            models.Index(fields=['status']),
            models.Index(fields=['created_at']),
        ]

    def __str__(self):
        return f"{self.action} - {self.lobster_id}/{self.file_path} ({self.status})"


@@ -0,0 +1,64 @@
from rest_framework import serializers
from .models import LobsterMemory, SyncHistory


class LobsterMemorySerializer(serializers.ModelSerializer):
    """Serializer for lobster memory files."""

    class Meta:
        model = LobsterMemory
        fields = [
            'id',
            'lobster_id',
            'file_path',
            'content',
            'hash',
            'status',
            'version',
            'size',
            'created_at',
            'updated_at',
        ]
        read_only_fields = ['id', 'created_at', 'updated_at']


class SyncHistorySerializer(serializers.ModelSerializer):
    """Serializer for sync history records."""
    action_display = serializers.CharField(source='get_action_display', read_only=True)
    status_display = serializers.CharField(source='get_status_display', read_only=True)

    class Meta:
        model = SyncHistory
        fields = [
            'id',
            'lobster_id',
            'file_path',
            'action',
            'action_display',
            'status',
            'status_display',
            'old_version',
            'new_version',
            'old_hash',
            'new_hash',
            'file_size',
            'operator',
            'error_message',
            'execution_time',
            'created_at',
        ]
        read_only_fields = ['id', 'created_at']


class FileDiffSerializer(serializers.Serializer):
    """Serializer for file diffs."""
    file_path = serializers.CharField()
    lobster_id = serializers.CharField()
    local_content = serializers.CharField(required=False)
    db_content = serializers.CharField(required=False)
    local_hash = serializers.CharField(required=False)
    db_hash = serializers.CharField(required=False)
    status = serializers.CharField()
    message = serializers.CharField(required=False)


@@ -0,0 +1,496 @@
import os
import hashlib
import fnmatch
import time
from pathlib import Path
from typing import List, Dict, Tuple, Iterator
from django.conf import settings
from django.utils import timezone
class IgnorePattern:
    """Pattern matcher for .lobsterignore."""

    def __init__(self, base_dir: Path):
        self.base_dir = base_dir
        self.patterns = []
        self.load_patterns()

    def load_patterns(self):
        """Load the .lobsterignore file."""
        ignore_file = self.base_dir / '.lobsterignore'
        if ignore_file.exists():
            with open(ignore_file, 'r', encoding='utf-8') as f:
                for line in f:
                    line = line.strip()
                    # Skip blank lines and comments
                    if line and not line.startswith('#'):
                        self.patterns.append(line)
        # Add the default ignore rules
        default_patterns = [
            '.DS_Store', '.git', '.gitignore', '__pycache__',
            'node_modules', '*.pyc', '*.pyo', '*.log',
            '*.tmp', '*.temp', '*.bak', '.vscode', '.idea'
        ]
        for pattern in default_patterns:
            if pattern not in self.patterns:
                self.patterns.append(pattern)

    def is_ignored(self, file_path: Path) -> bool:
        """
        Return whether a file is ignored.

        Args:
            file_path: absolute path of the file

        Returns:
            True if the file matches an ignore pattern
        """
        relative_path = file_path.relative_to(self.base_dir)
        for pattern in self.patterns:
            # Match the file name
            if fnmatch.fnmatch(file_path.name, pattern):
                return True
            # Match the relative path
            if fnmatch.fnmatch(str(relative_path), pattern):
                return True
            # Match directory patterns (trailing slash)
            if pattern.endswith('/') and fnmatch.fnmatch(str(relative_path.parent), pattern.rstrip('/')):
                return True
            # Match '*/'-prefixed patterns against each path component
            if pattern.startswith('*/'):
                for part in str(relative_path).split(os.sep):
                    if fnmatch.fnmatch(part, pattern[2:]):
                        return True
        return False
class FileScanner:
    """File scanner (supports .lobsterignore and chunked reads)."""

    def __init__(self):
        self.base_dir = Path(settings.LOBSTER_MEMORY_BASE)
        self.supported_extensions = settings.SUPPORTED_EXTENSIONS
        self.ignore = IgnorePattern(self.base_dir)
        self.chunk_size = 8192  # read in 8 KB chunks

    def scan_directory(self, lobster_id: str = None) -> List[Dict]:
        """
        Scan the directory and return information on every file.

        Args:
            lobster_id: lobster ID (optional)

        Returns:
            List of file-info dicts
        """
        if not self.base_dir.exists():
            return []
        files = []
        for file_path in self.base_dir.rglob('*'):
            if not file_path.is_file():
                continue
            # Check the file extension
            if file_path.suffix not in self.supported_extensions:
                continue
            # Check whether .lobsterignore excludes it
            if self.ignore.is_ignored(file_path):
                continue
            try:
                relative_path = file_path.relative_to(self.base_dir)
                # Hash via streaming (avoids loading large files into memory)
                file_hash = self.compute_hash_stream(file_path)
                files.append({
                    'file_path': str(relative_path),
                    'full_path': str(file_path),
                    'hash': file_hash,
                    'size': file_path.stat().st_size,
                    'lobster_id': lobster_id or 'unknown',
                })
            except Exception as e:
                print(f"Error reading {file_path}: {e}")
        return files

    def get_file_content(self, file_path: str, chunked: bool = False) -> Tuple[str, str]:
        """
        Return a file's content and hash.

        Args:
            file_path: relative path
            chunked: whether to read in chunks

        Returns:
            (content, hash)
        """
        full_path = self.base_dir / file_path
        if not full_path.exists():
            raise FileNotFoundError(f"File not found: {file_path}")
        # Use chunked reads for large files (>50 MB)
        file_size = full_path.stat().st_size
        if chunked and file_size > 50 * 1024 * 1024:
            content = self.read_file_chunked(full_path)
        else:
            content = full_path.read_text(encoding='utf-8', errors='ignore')
        file_hash = self.compute_hash(content)
        return content, file_hash

    def read_file_chunked(self, file_path: Path) -> str:
        """
        Read a file in chunks.

        Args:
            file_path: file path

        Returns:
            File content
        """
        content_parts = []
        with open(file_path, 'r', encoding='utf-8', errors='ignore') as f:
            while True:
                chunk = f.read(self.chunk_size)
                if not chunk:
                    break
                content_parts.append(chunk)
        return ''.join(content_parts)

    def read_file_stream(self, file_path: str) -> Iterator[str]:
        """
        Stream a file (for large-file transfers).

        Args:
            file_path: relative path

        Yields:
            File chunks
        """
        full_path = self.base_dir / file_path
        if not full_path.exists():
            raise FileNotFoundError(f"File not found: {file_path}")
        with open(full_path, 'r', encoding='utf-8', errors='ignore') as f:
            while True:
                chunk = f.read(self.chunk_size)
                if not chunk:
                    break
                yield chunk

    def write_file(self, file_path: str, content: str):
        """
        Write a file.

        Args:
            file_path: relative path
            content: file content
        """
        full_path = self.base_dir / file_path
        # Make sure the directory exists
        full_path.parent.mkdir(parents=True, exist_ok=True)
        # Write the file
        full_path.write_text(content, encoding='utf-8')

    def compute_hash(self, content: str) -> str:
        """
        Compute a SHA-256 hash.

        Args:
            content: file content

        Returns:
            Hash value
        """
        return hashlib.sha256(content.encode('utf-8')).hexdigest()

    def compute_hash_stream(self, file_path: Path) -> str:
        """
        Hash a file as a stream (avoids loading large files into memory).

        Args:
            file_path: file path

        Returns:
            Hash value
        """
        hash_obj = hashlib.sha256()
        with open(file_path, 'rb') as f:
            while True:
                chunk = f.read(self.chunk_size)
                if not chunk:
                    break
                hash_obj.update(chunk)
        return hash_obj.hexdigest()

    def get_file_tree(self, lobster_id: str = None) -> Dict:
        """
        Build the file-tree structure.

        Args:
            lobster_id: lobster ID

        Returns:
            File-tree dict
        """
        files = self.scan_directory(lobster_id)
        tree = {}
        for file_info in files:
            parts = Path(file_info['file_path']).parts
            current = tree
            for part in parts[:-1]:
                if part not in current:
                    current[part] = {}
                current = current[part]
            filename = parts[-1]
            current[filename] = file_info
        return tree
class DiffChecker:
"""差异检查器(支持大文件优化)"""
def __init__(self):
self.scanner = FileScanner()
def check_sync_status(self, local_files: List[Dict], db_files: List[Dict]) -> Dict:
"""
检查同步状态
Args:
local_files: 本地文件列表
db_files: 数据库文件列表
Returns:
同步状态字典
"""
local_map = {f['file_path']: f for f in local_files}
db_map = {f['file_path']: f for f in db_files}
results = {
'consistent': [],
'local_newer': [],
'db_newer': [],
'conflict': [],
'local_only': [],
'db_only': [],
}
all_paths = set(local_map.keys()) | set(db_map.keys())
for path in all_paths:
local = local_map.get(path)
db = db_map.get(path)
if local and db:
# 两边都存在
if local['hash'] == db['hash']:
results['consistent'].append({
'file_path': path,
'status': 'consistent'
})
else:
# 比较更新时间
local_time = db.get('updated_at') if db else None
if local_time:
# 数据库有更新时间,比较
if local['hash'] != db['hash']:
results['conflict'].append({
'file_path': path,
'status': 'conflict',
'local_hash': local['hash'],
'db_hash': db['hash']
})
else:
# 无法判断,标记为冲突
results['conflict'].append({
'file_path': path,
'status': 'conflict',
'local_hash': local['hash'],
'db_hash': db['hash']
})
elif local and not db:
# 只有本地
results['local_only'].append({
'file_path': path,
'status': 'local_only'
})
elif not local and db:
# 只有数据库
results['db_only'].append({
'file_path': path,
'status': 'db_only'
})
return results
    def get_file_diff(self, local_content: str, db_content: str, max_lines: int = 1000) -> Dict:
        """
        Get the diff between the local and database versions of a file.

        Args:
            local_content: local file content
            db_content: database file content
            max_lines: maximum lines to return per side; larger files are
                reduced to their head and tail to keep diff payloads small

        Returns:
            Diff information dict
        """
        local_lines = local_content.split('\n')
        db_lines = db_content.split('\n')
        # Record truncation before slicing, so the flag reflects the
        # original sizes rather than the already-capped lists.
        is_truncated = len(local_lines) > max_lines or len(db_lines) > max_lines
        if len(local_lines) > max_lines:
            omitted = len(local_lines) - max_lines
            local_lines = (local_lines[:max_lines // 2]
                           + ['... (中间省略 {} 行) ...'.format(omitted)]
                           + local_lines[-(max_lines // 2):])
        if len(db_lines) > max_lines:
            omitted = len(db_lines) - max_lines
            db_lines = (db_lines[:max_lines // 2]
                       + ['... (中间省略 {} 行) ...'.format(omitted)]
                       + db_lines[-(max_lines // 2):])
        return {
            'local_lines': local_lines,
            'db_lines': db_lines,
            'has_diff': local_content != db_content,
            'is_truncated': is_truncated,
        }
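The head/tail windowing can be sketched standalone (pure Python; the `truncate_lines` helper and the `max_lines` value of 6 are illustrative, not part of the project):

```python
def truncate_lines(lines, max_lines=1000):
    """Keep the head and tail of an oversized line list, noting the omission."""
    if len(lines) <= max_lines:
        return lines, False
    omitted = len(lines) - max_lines
    half = max_lines // 2
    # Head + omission marker + tail
    return lines[:half] + ['... ({} lines omitted) ...'.format(omitted)] + lines[-half:], True

lines = ['line {}'.format(i) for i in range(10)]
capped, truncated = truncate_lines(lines, max_lines=6)
print(capped, truncated)
```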
class AuditLogger:
    """Audit logger for sync operations."""

    def __init__(self):
        # Import lazily to avoid a circular import with models.py
        from .models import SyncHistory
        self.model = SyncHistory
def log_sync_action(
self,
lobster_id: str,
file_path: str,
action: str,
old_version: int = None,
new_version: int = None,
old_hash: str = None,
new_hash: str = None,
file_size: int = 0,
operator: str = 'system',
status: str = 'success',
error_message: str = None,
execution_time: float = 0
):
"""
记录同步操作
Args:
lobster_id: 龙虾ID
file_path: 文件路径
action: 操作类型
old_version: 操作前版本
new_version: 操作后版本
old_hash: 操作前哈希
new_hash: 操作后哈希
file_size: 文件大小
operator: 操作者
status: 操作状态
error_message: 错误信息
execution_time: 执行时间
"""
self.model.objects.create(
lobster_id=lobster_id,
file_path=file_path,
action=action,
old_version=old_version,
new_version=new_version,
old_hash=old_hash,
new_hash=new_hash,
file_size=file_size,
operator=operator,
status=status,
error_message=error_message,
execution_time=execution_time,
created_at=timezone.now()
)
def get_history(
self,
lobster_id: str = None,
file_path: str = None,
action: str = None,
limit: int = 100
) -> List[Dict]:
"""
获取操作历史
Args:
lobster_id: 龙虾ID可选
file_path: 文件路径(可选)
action: 操作类型(可选)
limit: 返回数量限制
Returns:
操作历史列表
"""
queryset = self.model.objects.all()
if lobster_id:
queryset = queryset.filter(lobster_id=lobster_id)
if file_path:
queryset = queryset.filter(file_path=file_path)
if action:
queryset = queryset.filter(action=action)
records = queryset.order_by('-created_at')[:limit]
return [
{
'id': r.id,
'lobster_id': r.lobster_id,
'file_path': r.file_path,
'action': r.action,
'status': r.status,
'old_version': r.old_version,
'new_version': r.new_version,
'old_hash': r.old_hash,
'new_hash': r.new_hash,
'file_size': r.file_size,
'operator': r.operator,
'error_message': r.error_message,
'execution_time': r.execution_time,
'created_at': r.created_at.isoformat(),
}
for r in records
]
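Patch 1's streaming-hash idea — computing a digest without loading the whole file into memory — can be sketched as follows (SHA-256 is an assumption; this hunk does not show which digest `FileScanner` actually uses):

```python
import hashlib
import io

def stream_hash(fileobj, chunk_size=8192):
    """Digest of a binary stream, read in fixed-size chunks to bound memory."""
    digest = hashlib.sha256()
    while True:
        chunk = fileobj.read(chunk_size)
        if not chunk:
            break
        digest.update(chunk)
    return digest.hexdigest()

# ~100 KB hashed 8 KB at a time; matches hashing all the bytes in one shot
data = b'x' * 100_000
print(stream_hash(io.BytesIO(data)) == hashlib.sha256(data).hexdigest())  # prints: True
```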

@@ -0,0 +1,31 @@
from django.urls import path
from . import views
urlpatterns = [
    # Scanning
    path('scan/', views.scan_files, name='scan_files'),
    path('tree/', views.get_file_tree, name='get_file_tree'),
    # Sync status
    path('status/', views.check_sync_status, name='check_sync_status'),
    # Diff
    path('diff/', views.get_file_diff, name='get_file_diff'),
    # Sync operations
    path('sync/db/', views.sync_to_db, name='sync_to_db'),
    path('sync/local/', views.sync_to_local, name='sync_to_local'),
    # Version history
    path('versions/', views.get_versions, name='get_versions'),
    # Audit history
    path('history/', views.get_history, name='get_history'),
    # Statistics
    path('stats/', views.get_stats, name='get_stats'),
    # .lobsterignore management
    path('ignore/patterns/', views.get_ignore_patterns, name='get_ignore_patterns'),
    path('ignore/reload/', views.reload_ignore_patterns, name='reload_ignore_patterns'),
]

backend/memory_app/views.py
@@ -0,0 +1,448 @@
from rest_framework.decorators import api_view
from rest_framework.response import Response
from rest_framework import status
from .models import LobsterMemory
from .serializers import LobsterMemorySerializer, FileDiffSerializer
from .services import FileScanner, DiffChecker, AuditLogger
import json
import time
@api_view(['GET'])
def scan_files(request):
"""
扫描本地文件
"""
lobster_id = request.query_params.get('lobster_id', 'daotong')
scanner = FileScanner()
files = scanner.scan_directory(lobster_id)
return Response({
'success': True,
'data': files,
'total': len(files)
})
@api_view(['GET'])
def get_file_tree(request):
"""
获取文件树
"""
lobster_id = request.query_params.get('lobster_id', 'daotong')
scanner = FileScanner()
tree = scanner.get_file_tree(lobster_id)
return Response({
'success': True,
'data': tree
})
@api_view(['GET'])
def check_sync_status(request):
"""
检查同步状态
"""
lobster_id = request.query_params.get('lobster_id', 'daotong')
    # Local files
    scanner = FileScanner()
    local_files = scanner.scan_directory(lobster_id)
    # Database files
    db_files = list(LobsterMemory.objects.filter(
        lobster_id=lobster_id
    ).values('file_path', 'hash', 'version', 'updated_at'))
    # Compute sync status
    checker = DiffChecker()
    sync_status = checker.check_sync_status(local_files, db_files)
return Response({
'success': True,
'data': sync_status
})
@api_view(['GET'])
def get_file_diff(request):
"""
获取文件差异(支持大文件优化)
"""
file_path = request.query_params.get('file_path')
lobster_id = request.query_params.get('lobster_id', 'daotong')
chunked = request.query_params.get('chunked', 'false').lower() == 'true'
if not file_path:
return Response({
'success': False,
'error': 'file_path is required'
}, status=status.HTTP_400_BAD_REQUEST)
    scanner = FileScanner()
    # Local content (supports chunked reads for large files)
    try:
        local_content, local_hash = scanner.get_file_content(file_path, chunked=chunked)
    except FileNotFoundError:
        local_content = None
        local_hash = None
    # Database content
try:
db_record = LobsterMemory.objects.filter(
lobster_id=lobster_id,
file_path=file_path
).order_by('-version').first()
if db_record:
db_content = db_record.content
db_hash = db_record.hash
else:
db_content = None
db_hash = None
except Exception as e:
return Response({
'success': False,
'error': str(e)
}, status=status.HTTP_500_INTERNAL_SERVER_ERROR)
    # Compute the diff (with a cap for large files)
checker = DiffChecker()
if local_content and db_content:
diff = checker.get_file_diff(local_content, db_content)
else:
diff = {
'local_lines': local_content.split('\n') if local_content else [],
'db_lines': db_content.split('\n') if db_content else [],
'has_diff': local_content != db_content,
'is_truncated': False
}
    # Classify the sync status (same labels as check_sync_status)
    if local_hash == db_hash:
        sync_status = 'consistent'
    elif local_hash and not db_hash:
        # The file exists locally but not in the database
        sync_status = 'local_only'
    elif not local_hash and db_hash:
        # The file exists in the database but not locally
        sync_status = 'db_only'
    else:
        sync_status = 'conflict'
return Response({
'success': True,
'data': {
'file_path': file_path,
'lobster_id': lobster_id,
'local_content': local_content,
'db_content': db_content,
'local_hash': local_hash,
'db_hash': db_hash,
'status': sync_status,
'diff': diff
}
})
@api_view(['POST'])
def sync_to_db(request):
"""
同步到数据库(带操作日志)
"""
lobster_id = request.data.get('lobster_id', 'daotong')
file_path = request.data.get('file_path')
operator = request.data.get('operator', 'system')
if not file_path:
return Response({
'success': False,
'error': 'file_path is required'
}, status=status.HTTP_400_BAD_REQUEST)
scanner = FileScanner()
audit_logger = AuditLogger()
start_time = time.time()
try:
        # Read the local file
        content, file_hash = scanner.get_file_content(file_path)
        # Look up the existing record
existing = LobsterMemory.objects.filter(
lobster_id=lobster_id,
file_path=file_path
).order_by('-version').first()
        old_version = existing.version if existing else None
        old_hash = existing.hash if existing else None
        new_version = existing.version + 1 if existing else 1
        # Create the new version record
        record = LobsterMemory.objects.create(
            lobster_id=lobster_id,
            file_path=file_path,
            content=content,
            hash=file_hash,
            status='consistent',
            version=new_version,
        )
execution_time = time.time() - start_time
        # Write the audit log
audit_logger.log_sync_action(
lobster_id=lobster_id,
file_path=file_path,
action='sync_to_db',
old_version=old_version,
new_version=new_version,
old_hash=old_hash,
new_hash=file_hash,
file_size=record.size,
operator=operator,
status='success',
execution_time=execution_time
)
return Response({
'success': True,
'message': '已同步到数据库',
'data': LobsterMemorySerializer(record).data
})
except Exception as e:
execution_time = time.time() - start_time
        # Log the failure
audit_logger.log_sync_action(
lobster_id=lobster_id,
file_path=file_path,
action='sync_to_db',
operator=operator,
status='failed',
error_message=str(e),
execution_time=execution_time
)
return Response({
'success': False,
'error': str(e)
}, status=status.HTTP_500_INTERNAL_SERVER_ERROR)
@api_view(['POST'])
def sync_to_local(request):
"""
同步到本地(带操作日志)
"""
lobster_id = request.data.get('lobster_id', 'daotong')
file_path = request.data.get('file_path')
operator = request.data.get('operator', 'system')
if not file_path:
return Response({
'success': False,
'error': 'file_path is required'
}, status=status.HTTP_400_BAD_REQUEST)
scanner = FileScanner()
audit_logger = AuditLogger()
start_time = time.time()
try:
        # Fetch the latest version from the database
db_record = LobsterMemory.objects.filter(
lobster_id=lobster_id,
file_path=file_path
).order_by('-version').first()
if not db_record:
return Response({
'success': False,
'error': 'File not found in database'
}, status=status.HTTP_404_NOT_FOUND)
        # Current local hash, if the file exists (recorded as old_hash)
        try:
            local_content, local_hash = scanner.get_file_content(file_path)
        except FileNotFoundError:
            local_hash = None
        # Overwrite the local file with the database content
scanner.write_file(file_path, db_record.content)
execution_time = time.time() - start_time
        # Write the audit log
audit_logger.log_sync_action(
lobster_id=lobster_id,
file_path=file_path,
action='sync_to_local',
old_version=None,
new_version=db_record.version,
old_hash=local_hash,
new_hash=db_record.hash,
file_size=db_record.size,
operator=operator,
status='success',
execution_time=execution_time
)
return Response({
'success': True,
'message': '已同步到本地',
'data': LobsterMemorySerializer(db_record).data
})
except Exception as e:
execution_time = time.time() - start_time
        # Log the failure
audit_logger.log_sync_action(
lobster_id=lobster_id,
file_path=file_path,
action='sync_to_local',
operator=operator,
status='failed',
error_message=str(e),
execution_time=execution_time
)
return Response({
'success': False,
'error': str(e)
}, status=status.HTTP_500_INTERNAL_SERVER_ERROR)
@api_view(['GET'])
def get_versions(request):
"""
获取文件的所有版本
"""
file_path = request.query_params.get('file_path')
lobster_id = request.query_params.get('lobster_id', 'daotong')
if not file_path:
return Response({
'success': False,
'error': 'file_path is required'
}, status=status.HTTP_400_BAD_REQUEST)
versions = LobsterMemory.objects.filter(
lobster_id=lobster_id,
file_path=file_path
).order_by('-version')
return Response({
'success': True,
'data': LobsterMemorySerializer(versions, many=True).data
})
@api_view(['GET'])
def get_stats(request):
"""
获取统计信息
"""
lobster_id = request.query_params.get('lobster_id', 'daotong')
total_files = LobsterMemory.objects.filter(lobster_id=lobster_id).count()
status_counts = {}
for status_choice, _ in LobsterMemory.STATUS_CHOICES:
count = LobsterMemory.objects.filter(
lobster_id=lobster_id,
status=status_choice
).count()
status_counts[status_choice] = count
    # Total stored size
from django.db.models import Sum
total_size = LobsterMemory.objects.filter(
lobster_id=lobster_id
).aggregate(total=Sum('size'))['total'] or 0
return Response({
'success': True,
'data': {
'total_files': total_files,
'status_counts': status_counts,
'total_size': total_size,
'total_size_mb': round(total_size / 1024 / 1024, 2)
}
})
@api_view(['GET'])
def get_history(request):
"""
获取操作历史
"""
lobster_id = request.query_params.get('lobster_id', 'daotong')
file_path = request.query_params.get('file_path')
action = request.query_params.get('action')
limit = int(request.query_params.get('limit', 100))
audit_logger = AuditLogger()
history = audit_logger.get_history(
lobster_id=lobster_id,
file_path=file_path,
action=action,
limit=limit
)
return Response({
'success': True,
'data': history,
'total': len(history)
})
@api_view(['GET'])
def get_ignore_patterns(request):
"""
获取 .lobsterignore 模式列表
"""
lobster_id = request.query_params.get('lobster_id', 'daotong')
scanner = FileScanner()
patterns = scanner.ignore.patterns
return Response({
'success': True,
'data': {
'patterns': patterns,
'total': len(patterns)
}
})
@api_view(['POST'])
def reload_ignore_patterns(request):
"""
重新加载 .lobsterignore 模式
"""
lobster_id = request.data.get('lobster_id', 'daotong')
scanner = FileScanner()
    # Reload ignore rules from .lobsterignore
scanner.ignore.load_patterns()
return Response({
'success': True,
'message': '已重新加载忽略规则',
'data': {
'patterns': scanner.ignore.patterns,
'total': len(scanner.ignore.patterns)
}
})
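The matching side of the `.lobsterignore` mechanism is not shown in this hunk; a minimal sketch of gitignore-style matching with `fnmatch` (the `is_ignored` helper is hypothetical, and the real `scanner.ignore` implementation may differ):

```python
import fnmatch

def is_ignored(path, patterns):
    """Return True if path matches any gitignore-style pattern."""
    for pattern in patterns:
        if pattern.endswith('/'):
            # Directory pattern: ignore anything under it, at any depth
            if path.startswith(pattern) or ('/' + pattern) in ('/' + path):
                return True
        elif fnmatch.fnmatch(path, pattern) or fnmatch.fnmatch(path.split('/')[-1], pattern):
            # Match against the full path and against the basename
            return True
    return False

patterns = ['*.log', 'node_modules/', '.env']
print(is_ignored('logs/app.log', patterns))  # prints: True
print(is_ignored('notes.md', patterns))      # prints: False
```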

@@ -0,0 +1,101 @@
"""
Django settings for memory_sync project.
"""
from pathlib import Path
import os
BASE_DIR = Path(__file__).resolve().parent.parent
SECRET_KEY = 'django-insecure-dev-key-change-in-production'
DEBUG = True
ALLOWED_HOSTS = ['*']
INSTALLED_APPS = [
'django.contrib.admin',
'django.contrib.auth',
'django.contrib.contenttypes',
'django.contrib.sessions',
'django.contrib.messages',
'django.contrib.staticfiles',
'rest_framework',
'corsheaders',
'memory_app',
]
MIDDLEWARE = [
'corsheaders.middleware.CorsMiddleware',
'django.middleware.security.SecurityMiddleware',
'django.contrib.sessions.middleware.SessionMiddleware',
'django.middleware.common.CommonMiddleware',
'django.middleware.csrf.CsrfViewMiddleware',
'django.contrib.auth.middleware.AuthenticationMiddleware',
'django.contrib.messages.middleware.MessageMiddleware',
'django.middleware.clickjacking.XFrameOptionsMiddleware',
]
ROOT_URLCONF = 'memory_sync.urls'
TEMPLATES = [
{
'BACKEND': 'django.template.backends.django.DjangoTemplates',
'DIRS': [],
'APP_DIRS': True,
'OPTIONS': {
'context_processors': [
'django.template.context_processors.debug',
'django.template.context_processors.request',
'django.contrib.auth.context_processors.auth',
'django.contrib.messages.context_processors.messages',
],
},
},
]
WSGI_APPLICATION = 'memory_sync.wsgi.application'
DATABASES = {
'default': {
'ENGINE': 'django.db.backends.postgresql',
'NAME': os.getenv('DB_NAME', 'lobster_memory'),
'USER': os.getenv('DB_USER', 'postgres'),
'PASSWORD': os.getenv('DB_PASSWORD', 'postgres'),
'HOST': os.getenv('DB_HOST', 'localhost'),
'PORT': os.getenv('DB_PORT', '5432'),
}
}
AUTH_PASSWORD_VALIDATORS = []
LANGUAGE_CODE = 'zh-hans'
TIME_ZONE = 'Asia/Shanghai'
USE_I18N = True
USE_TZ = True
STATIC_URL = 'static/'
STATIC_ROOT = BASE_DIR / 'staticfiles'
DEFAULT_AUTO_FIELD = 'django.db.models.BigAutoField'
# REST Framework
REST_FRAMEWORK = {
'DEFAULT_PERMISSION_CLASSES': [
'rest_framework.permissions.AllowAny',
],
'DEFAULT_PAGINATION_CLASS': 'rest_framework.pagination.PageNumberPagination',
'PAGE_SIZE': 100,
}
# CORS
CORS_ALLOW_ALL_ORIGINS = True
# Lobster memory base directory
LOBSTER_MEMORY_BASE = os.getenv('LOBSTER_MEMORY_BASE', '/home/node/.openclaw/workspace/daotong')
# Supported file extensions
SUPPORTED_EXTENSIONS = ['.md', '.txt', '.json', '.py', '.js', '.yaml', '.yml']

@@ -0,0 +1,7 @@
from django.contrib import admin
from django.urls import path, include
urlpatterns = [
path('admin/', admin.site.urls),
path('api/', include('memory_app.urls')),
]

@@ -0,0 +1,11 @@
"""
WSGI config for memory_sync project.
"""
import os
from django.core.wsgi import get_wsgi_application
os.environ.setdefault('DJANGO_SETTINGS_MODULE', 'memory_sync.settings')
application = get_wsgi_application()

backend/requirements.txt
@@ -0,0 +1,5 @@
Django>=4.2.0,<5.0.0
djangorestframework>=3.14.0
django-cors-headers>=4.0.0
psycopg2-binary>=2.9.0
python-dotenv>=1.0.0

docker-compose.yml
@@ -0,0 +1,65 @@
version: '3.8'
services:
  # PostgreSQL database
postgres:
image: postgres:15-alpine
container_name: lobster-postgres
environment:
POSTGRES_DB: lobster_memory
POSTGRES_USER: postgres
POSTGRES_PASSWORD: postgres
volumes:
- postgres_data:/var/lib/postgresql/data
ports:
- "5432:5432"
healthcheck:
test: ["CMD-SHELL", "pg_isready -U postgres"]
interval: 10s
timeout: 5s
retries: 5
  # Django backend
backend:
build:
context: ./backend
dockerfile: Dockerfile
container_name: lobster-backend
environment:
DB_HOST: postgres
DB_NAME: lobster_memory
DB_USER: postgres
DB_PASSWORD: postgres
DB_PORT: 5432
LOBSTER_MEMORY_BASE: /app/memory_files
volumes:
      # Mount the lobster memory directory (read-only)
      - /home/node/.openclaw/workspace/daotong:/app/memory_files:ro
      # Code hot-reload (development only)
- ./backend:/app
ports:
- "8087:8087"
depends_on:
postgres:
condition: service_healthy
command: >
sh -c "
python manage.py migrate &&
python manage.py runserver 0.0.0.0:8087
"
  # React frontend
frontend:
build:
context: ./frontend
dockerfile: Dockerfile
container_name: lobster-frontend
ports:
- "8086:80"
environment:
- REACT_APP_API_URL=http://localhost:8087/api
depends_on:
- backend
volumes:
postgres_data:

frontend/Dockerfile
@@ -0,0 +1,29 @@
# React frontend Dockerfile
FROM node:18-alpine as builder
WORKDIR /app
# Copy package manifests
COPY package.json package-lock.json* ./
# Install dependencies
RUN npm ci
# Copy source
COPY . .
# Build the production bundle
RUN npm run build
# Production image
FROM nginx:alpine
# Copy build artifacts
COPY --from=builder /app/build /usr/share/nginx/html
# Copy nginx config
COPY nginx.conf /etc/nginx/conf.d/default.conf
EXPOSE 80
CMD ["nginx", "-g", "daemon off;"]

frontend/package.json
@@ -0,0 +1,37 @@
{
"name": "lobster-memory-sync-frontend",
"version": "1.0.0",
"private": true,
"dependencies": {
"react": "^18.2.0",
"react-dom": "^18.2.0",
"react-scripts": "5.0.1",
"antd": "^5.0.0",
"react-diff-viewer-continued": "^3.2.6",
"axios": "^1.0.0"
},
"scripts": {
"start": "react-scripts start",
"build": "react-scripts build",
"test": "react-scripts test",
"eject": "react-scripts eject"
},
"eslintConfig": {
"extends": [
"react-app"
]
},
"browserslist": {
"production": [
">0.2%",
"not dead",
"not op_mini all"
],
"development": [
"last 1 chrome version",
"last 1 firefox version",
"last 1 safari version"
]
},
"proxy": "http://localhost:8087"
}

@@ -0,0 +1,17 @@
<!DOCTYPE html>
<html lang="zh-CN">
<head>
<meta charset="utf-8" />
<meta name="viewport" content="width=device-width, initial-scale=1" />
<meta name="theme-color" content="#000000" />
<meta
name="description"
content="龙虾记忆同步系统 - 管理和同步龙虾的记忆文件"
/>
<title>🦐 龙虾记忆同步系统</title>
</head>
<body>
<noscript>You need to enable JavaScript to run this app.</noscript>
<div id="root"></div>
</body>
</html>

frontend/src/App.css
@@ -0,0 +1,28 @@
.App {
min-height: 100vh;
background: #f0f2f5;
}
.App-header {
background: linear-gradient(135deg, #667eea 0%, #764ba2 100%);
color: white;
padding: 40px 20px;
text-align: center;
}
.App-header h1 {
margin: 0;
font-size: 32px;
}
.subtitle {
margin: 10px 0 0;
opacity: 0.9;
font-size: 16px;
}
.App-main {
max-width: 1400px;
margin: 0 auto;
padding: 20px;
}

frontend/src/App.js
@@ -0,0 +1,23 @@
import React from 'react';
import { ConfigProvider } from 'antd';
import zhCN from 'antd/locale/zh_CN';
import FileTree from './components/FileTree';
import './App.css';
function App() {
return (
<ConfigProvider locale={zhCN}>
<div className="App">
<header className="App-header">
<h1>🦐 龙虾记忆同步系统</h1>
<p className="subtitle">管理和同步龙虾的记忆文件</p>
</header>
<main className="App-main">
<FileTree />
</main>
</div>
</ConfigProvider>
);
}
export default App;

frontend/src/api/index.js
@@ -0,0 +1,33 @@
import axios from 'axios';
const API_BASE_URL = process.env.REACT_APP_API_URL || 'http://localhost:8087/api';
const api = axios.create({
baseURL: API_BASE_URL,
timeout: 30000,
headers: {
'Content-Type': 'application/json',
},
});
// Request interceptor
api.interceptors.request.use(
(config) => {
return config;
},
(error) => {
return Promise.reject(error);
}
);
// Response interceptor: unwraps the axios body, so callers receive the API payload directly
api.interceptors.response.use(
(response) => {
return response.data;
},
(error) => {
return Promise.reject(error);
}
);
export default api;

@@ -0,0 +1,152 @@
import React, { useState, useEffect } from 'react';
import { Spin, Alert, Tabs } from 'antd';
import ReactDiffViewer from 'react-diff-viewer-continued';
import api from '../api';
export default function FileDiff({ filePath, lobsterId }) {
const [loading, setLoading] = useState(false);
const [diffData, setDiffData] = useState(null);
const [error, setError] = useState(null);
const loadDiff = async () => {
setLoading(true);
setError(null);
try {
const response = await api.get('/diff/', {
params: { file_path: filePath, lobster_id: lobsterId }
});
if (response.success) {
setDiffData(response.data);
} else {
setError(response.error || '加载失败');
}
} catch (err) {
setError(err.message || '网络错误');
} finally {
setLoading(false);
}
};
useEffect(() => {
if (filePath) {
loadDiff();
}
}, [filePath]);
if (loading) {
return <Spin tip="加载中..." />;
}
if (error) {
return <Alert message={error} type="error" />;
}
if (!diffData) {
return <Alert message="请选择文件" type="info" />;
}
const { local_content, db_content, status, diff } = diffData;
  // Neither side has the file
if (!local_content && !db_content) {
return <Alert message="文件不存在" type="warning" />;
}
if (!local_content) {
return (
<Alert
message="文件仅存在于数据库"
description="点击「同步到本地」将文件恢复到本地"
type="info"
showIcon
/>
);
}
if (!db_content) {
return (
<Alert
message="文件仅存在于本地"
description="点击「同步到数据库」将文件备份到数据库"
type="warning"
showIcon
/>
);
}
const STATUS_MESSAGES = {
consistent: '文件内容一致',
local_newer: '本地文件有更新',
db_newer: '数据库版本更新',
conflict: '文件内容冲突',
};
return (
<div>
<Alert
message={STATUS_MESSAGES[status] || '未知状态'}
type={status === 'consistent' ? 'success' : 'warning'}
style={{ marginBottom: 16 }}
showIcon
/>
<Tabs
defaultActiveKey="diff"
items={[
{
key: 'diff',
label: '差异对比',
children: (
<div style={{ overflowX: 'auto' }}>
<ReactDiffViewer
oldValue={db_content || ''}
newValue={local_content || ''}
splitView={true}
useDarkTheme={false}
leftTitle="数据库版本"
rightTitle="本地版本"
/>
</div>
),
},
{
key: 'local',
label: '本地内容',
children: (
<pre style={{
padding: '16px',
background: '#f5f5f5',
borderRadius: '4px',
maxHeight: '500px',
overflow: 'auto',
whiteSpace: 'pre-wrap',
wordBreak: 'break-word'
}}>
{local_content}
</pre>
),
},
{
key: 'db',
label: '数据库内容',
children: (
<pre style={{
padding: '16px',
background: '#f5f5f5',
borderRadius: '4px',
maxHeight: '500px',
overflow: 'auto',
whiteSpace: 'pre-wrap',
wordBreak: 'break-word'
}}>
{db_content}
</pre>
),
},
]}
/>
</div>
);
}

@@ -0,0 +1,273 @@
import React, { useState, useEffect } from 'react';
import { Tree, Button, message, Spin, Alert, Card, Row, Col, Tag } from 'antd';
import {
ReloadOutlined,
SyncOutlined,
FileOutlined,
FolderOutlined,
CheckCircleOutlined,
ExclamationCircleOutlined,
} from '@ant-design/icons';
import api from '../api';
import FileDiff from './FileDiff';
const STATUS_COLORS = {
consistent: 'success',
local_newer: 'warning',
db_newer: 'info',
conflict: 'error',
local_only: 'warning',
db_only: 'info',
};
const STATUS_LABELS = {
consistent: '一致',
local_newer: '本地更新',
db_newer: '数据库更新',
conflict: '冲突',
local_only: '仅本地',
db_only: '仅数据库',
};
export default function FileTree() {
const [loading, setLoading] = useState(false);
const [syncStatus, setSyncStatus] = useState(null);
const [selectedFile, setSelectedFile] = useState(null);
const [stats, setStats] = useState(null);
const lobsterId = 'daotong';
  // Load sync status
  const loadSyncStatus = async () => {
    setLoading(true);
    try {
      const response = await api.get('/status/', { params: { lobster_id: lobsterId } });
      // The response interceptor already unwraps the axios body,
      // so the API payload is at response.data (not response.data.data)
      setSyncStatus(response.data);
      // Load statistics
      const statsResponse = await api.get('/stats/', { params: { lobster_id: lobsterId } });
      setStats(statsResponse.data);
} catch (error) {
message.error('加载失败: ' + error.message);
} finally {
setLoading(false);
}
};
  // Load the file tree
  const loadFileTree = async () => {
    setLoading(true);
    try {
      const response = await api.get('/tree/', { params: { lobster_id: lobsterId } });
      return response.data; // interceptor already unwrapped the axios body
} catch (error) {
message.error('加载失败: ' + error.message);
return null;
} finally {
setLoading(false);
}
};
  // Convert to Ant Design Tree data format
const convertToTreeData = (tree, parentPath = '') => {
const data = [];
for (const [name, children] of Object.entries(tree)) {
const currentPath = parentPath ? `${parentPath}/${name}` : name;
if (children && typeof children === 'object' && !children.file_path) {
        // This is a directory
data.push({
title: name,
key: currentPath,
icon: <FolderOutlined />,
children: convertToTreeData(children, currentPath),
});
} else if (children && children.file_path) {
        // This is a file
const fileStatus = getFileStatus(children.file_path);
data.push({
title: (
<span>
<FileOutlined /> {name}
{fileStatus && (
<Tag color={STATUS_COLORS[fileStatus]} style={{ marginLeft: 8 }}>
{STATUS_LABELS[fileStatus]}
</Tag>
)}
</span>
),
key: children.file_path,
icon: <FileOutlined />,
isLeaf: true,
children: null,
});
}
}
return data;
};
  // Look up a file's sync status
const getFileStatus = (filePath) => {
if (!syncStatus) return null;
for (const status in syncStatus) {
const file = syncStatus[status].find(f => f.file_path === filePath);
if (file) return status;
}
return null;
};
  // Handle file selection
const handleSelect = (keys, info) => {
if (info.node.isLeaf && keys.length > 0) {
const filePath = keys[0];
setSelectedFile(filePath);
}
};
  // Sync to the database
const syncToDb = async (filePath) => {
try {
await api.post('/sync/db/', {
lobster_id: lobsterId,
file_path: filePath,
});
message.success('已同步到数据库');
loadSyncStatus();
} catch (error) {
message.error('同步失败: ' + error.message);
}
};
  // Sync to local disk
const syncToLocal = async (filePath) => {
try {
await api.post('/sync/local/', {
lobster_id: lobsterId,
file_path: filePath,
});
message.success('已同步到本地');
loadSyncStatus();
} catch (error) {
message.error('同步失败: ' + error.message);
}
};
useEffect(() => {
loadSyncStatus();
}, []);
const [treeData, setTreeData] = useState([]);
useEffect(() => {
loadFileTree().then(data => {
if (data) {
setTreeData(convertToTreeData(data));
}
});
}, []);
return (
<div style={{ padding: '20px' }}>
<Row gutter={[16, 16]}>
<Col span={24}>
<Card
title="记忆同步"
extra={
<Button
type="primary"
icon={<ReloadOutlined />}
onClick={loadSyncStatus}
loading={loading}
>
刷新状态
</Button>
}
>
{stats && (
<Row gutter={16}>
<Col span={6}>
<Statistic title="总文件数" value={stats.total_files} />
</Col>
<Col span={6}>
<Statistic title="总大小" value={stats.total_size_mb} suffix="MB" />
</Col>
{stats.status_counts.conflict > 0 && (
<Col span={12}>
<Alert
message={`${stats.status_counts.conflict} 个文件冲突`}
type="error"
showIcon
/>
</Col>
)}
</Row>
)}
</Card>
</Col>
<Col span={10}>
<Card title="文件树">
<Spin spinning={loading}>
{treeData.length > 0 ? (
<Tree
showLine
treeData={treeData}
onSelect={handleSelect}
/>
) : (
<Alert message="暂无文件" type="info" />
)}
</Spin>
</Card>
</Col>
<Col span={14}>
<Card
title={selectedFile ? `文件对比: ${selectedFile}` : '文件对比'}
extra={
selectedFile && (
<Button.Group>
<Button
icon={<SyncOutlined />}
onClick={() => syncToLocal(selectedFile)}
>
同步到本地
</Button>
<Button
type="primary"
icon={<SyncOutlined />}
onClick={() => syncToDb(selectedFile)}
>
同步到数据库
</Button>
</Button.Group>
)
}
>
{selectedFile ? (
<FileDiff filePath={selectedFile} lobsterId={lobsterId} />
) : (
<Alert message="请选择文件查看差异" type="info" />
)}
</Card>
</Col>
</Row>
</div>
);
}
function Statistic({ title, value, suffix }) {
return (
<div>
<div style={{ fontSize: '14px', color: '#666' }}>{title}</div>
<div style={{ fontSize: '24px', fontWeight: 'bold', color: '#1890ff' }}>
{value}
{suffix && <span style={{ fontSize: '14px', marginLeft: '4px' }}>{suffix}</span>}
</div>
</div>
);
}

frontend/src/index.css
@@ -0,0 +1,13 @@
body {
margin: 0;
font-family: -apple-system, BlinkMacSystemFont, 'Segoe UI', 'Roboto', 'Oxygen',
'Ubuntu', 'Cantarell', 'Fira Sans', 'Droid Sans', 'Helvetica Neue',
sans-serif;
-webkit-font-smoothing: antialiased;
-moz-osx-font-smoothing: grayscale;
}
code {
font-family: source-code-pro, Menlo, Monaco, Consolas, 'Courier New',
monospace;
}

frontend/src/index.js
@@ -0,0 +1,11 @@
import React from 'react';
import ReactDOM from 'react-dom/client';
import './index.css';
import App from './App';
const root = ReactDOM.createRoot(document.getElementById('root'));
root.render(
<React.StrictMode>
<App />
</React.StrictMode>
);