# 🦌 DeerFlow - v2

> Originated from open source, giving back to open source.

A LangGraph-based AI agent backend with sandbox execution capabilities.
## Quick Start

### Option 1: Docker (Recommended)

The fastest way to get started with a consistent environment:

1. Configure the application:

   ```bash
   cp config.example.yaml config.yaml
   # Edit config.yaml and set your API keys
   ```

2. Initialize and start:

   ```bash
   make docker-start  # Start all services
   ```

3. Access: http://localhost:2026

See CONTRIBUTING.md for a detailed Docker development guide.
### Option 2: Local Development

If you prefer running services locally:

1. Check prerequisites:

   ```bash
   make check  # Verifies Node.js 22+, pnpm, uv, nginx
   ```

2. Configure and install:

   ```bash
   cp config.example.yaml config.yaml
   make install
   ```

3. (Optional) Pre-pull the sandbox image:

   ```bash
   # Recommended if using a Docker/container-based sandbox
   make setup-sandbox
   ```

4. Start services:

   ```bash
   make dev
   ```

5. Access: http://localhost:2026

See CONTRIBUTING.md for a detailed local development guide.
## Sandbox Configuration

DeerFlow supports multiple sandbox execution modes. Configure your preferred mode in config.yaml.

**Local execution** (runs sandbox code directly on the host machine):

```yaml
sandbox:
  use: src.sandbox.local:LocalSandboxProvider  # Local execution
```

**Docker execution** (runs sandbox code in isolated Docker containers):

```yaml
sandbox:
  use: src.community.aio_sandbox:AioSandboxProvider  # Docker-based sandbox
```

**Kubernetes execution** (runs sandbox code in Kubernetes pods):

First set up the Kubernetes sandbox as described in Kubernetes Sandbox Setup:

```bash
./docker/k8s/setup.sh
```

Then point config.yaml at the Kubernetes service URL:

```yaml
sandbox:
  use: src.community.k8s_sandbox:AioSandboxProvider  # Kubernetes-based sandbox
  base_url: http://deer-flow-sandbox.deer-flow.svc.cluster.local:8080  # Kubernetes service URL
```
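The `use:` values above follow a `module.path:ClassName` convention, which suggests sandbox providers are pluggable by dotted import path. As a minimal sketch of what a custom provider might look like (the `SandboxProvider` base class and its `run` method here are assumptions for illustration, not DeerFlow's actual interface):

```python
from abc import ABC, abstractmethod
import subprocess


class SandboxProvider(ABC):
    """Hypothetical provider interface; the real base class lives under src/sandbox."""

    @abstractmethod
    def run(self, command: list[str]) -> str:
        """Execute a command in the sandbox and return its stdout."""


class HostShellSandboxProvider(SandboxProvider):
    """Toy provider that runs commands directly on the host, like local mode."""

    def run(self, command: list[str]) -> str:
        # check=True raises if the sandboxed command exits non-zero
        result = subprocess.run(command, capture_output=True, text=True, check=True)
        return result.stdout


if __name__ == "__main__":
    provider = HostShellSandboxProvider()
    print(provider.run(["echo", "hello from the sandbox"]))
```

If the convention holds, such a class would then be referenced in config.yaml as something like `my_package.providers:HostShellSandboxProvider` (a hypothetical path).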
## Features
- 🤖 LangGraph-based Agents - Multi-agent orchestration with sophisticated workflows
- 🧠 Persistent Memory - LLM-powered context retention across conversations with automatic fact extraction
- 🔧 Model Context Protocol (MCP) - Extensible tool integration
- 🎯 Skills System - Reusable agent capabilities
- 🛡️ Sandbox Execution - Safe code execution environment
- 🌐 Unified API Gateway - Single entry point with nginx reverse proxy
- 🔄 Hot Reload - Fast development iteration
- 📊 Real-time Streaming - Server-Sent Events (SSE) support
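The real-time streaming feature uses Server-Sent Events: the server emits `data:` lines separated by blank lines. As a minimal client-side sketch (the single-JSON-line-per-event shape and the `delta` field are assumptions about DeerFlow's payloads, not guaranteed by the SSE format itself):

```python
import json


def parse_sse(stream: str) -> list[dict]:
    """Decode a raw SSE payload into a list of JSON `data:` events.

    Assumes each event carries exactly one JSON-encoded `data:` line,
    which is common for chat APIs but not required by the SSE spec.
    """
    events = []
    for block in stream.split("\n\n"):  # blank line terminates each event
        for line in block.splitlines():
            if line.startswith("data:"):
                events.append(json.loads(line[len("data:"):].strip()))
    return events


raw = 'data: {"delta": "Hello"}\n\ndata: {"delta": " world"}\n\n'
print(parse_sse(raw))  # → [{'delta': 'Hello'}, {'delta': ' world'}]
```

In practice a client would read these events incrementally from an HTTP response rather than from a complete string, but the per-event framing is the same.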
## Documentation
- Contributing Guide - Development environment setup and workflow
- Configuration Guide - Setup and configuration instructions
- Architecture Overview - Technical architecture details
- Backend Architecture - Backend architecture and API reference
## Contributing
We welcome contributions! Please see CONTRIBUTING.md for development setup, workflow, and guidelines.
## License
This project is open source and available under the MIT License.
## Acknowledgments
DeerFlow is built upon the incredible work of the open-source community. We are deeply grateful to all the projects and contributors whose efforts have made DeerFlow possible. Truly, we stand on the shoulders of giants.
We would like to extend our sincere appreciation to the following projects for their invaluable contributions:
- LangChain: Their exceptional framework powers our LLM interactions and chains, enabling seamless integration and functionality.
- LangGraph: Their innovative approach to multi-agent orchestration has been instrumental in enabling DeerFlow's sophisticated workflows.
These projects exemplify the transformative power of open-source collaboration, and we are proud to build upon their foundations.
### Key Contributors
A heartfelt thank you goes out to the core authors of DeerFlow, whose vision, passion, and dedication have brought this project to life.
Your unwavering commitment and expertise have been the driving force behind DeerFlow's success. We are honored to have you at the helm of this journey.