Mirror of https://gitee.com/wanwujie/deer-flow, synced 2026-04-11 09:44:44 +08:00
# CLAUDE.md
This file provides guidance to Claude Code (claude.ai/code) when working with code in this repository.
## Project Overview
DeerFlow is a LangGraph-based AI agent backend that provides a "super agent" with sandbox execution capabilities. The agent can execute code, browse the web, and manage files in isolated sandbox environments.
## Commands

```shell
# Install dependencies
make install

# Run development server (LangGraph Studio)
make dev

# Lint
make lint

# Format code
make format
```
## Architecture

### Configuration System

The app uses a YAML-based configuration system loaded from `config.yaml`. Configuration priority:

1. Explicit `config_path` argument
2. `DEER_FLOW_CONFIG_PATH` environment variable
3. `config.yaml` in the current directory
4. `config.yaml` in the parent directory

Config values starting with `$` are resolved as environment variables (e.g., `$OPENAI_API_KEY`).
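The `$`-prefix rule can be pictured as a small resolver. This is a minimal sketch of the behavior described above; the function name and error handling are assumptions, not the project's actual loader:

```python
import os

# Hypothetical sketch of the "$VAR" resolution rule described above;
# the real loader's name and unset-variable handling may differ.
def resolve_value(value: object) -> object:
    """Replace strings like "$OPENAI_API_KEY" with the env var's value."""
    if isinstance(value, str) and value.startswith("$"):
        return os.environ[value[1:]]  # assumed to raise KeyError if unset
    return value
```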
## Core Components
### Agent Graph (src/agents/)

- `lead_agent` is the main entry point, registered in `langgraph.json`
- Uses `ThreadState`, which extends `AgentState` with sandbox state
- The agent is created via `create_agent()` with a model, tools, middleware, and a system prompt
### Sandbox System (src/sandbox/)

- The abstract `Sandbox` base class defines the interface: `execute_command`, `read_file`, `write_file`, `list_dir`
- `SandboxProvider` manages the sandbox lifecycle: `acquire`, `get`, `release`
- `SandboxMiddleware` automatically acquires a sandbox on agent start and injects it into state
- `LocalSandboxProvider` is a singleton implementation for local execution
- Sandbox tools (`bash`, `ls`, `read_file`, `write_file`, `str_replace`) extract the sandbox from the tool runtime
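A minimal sketch of the abstract interface: only the class and method names come from this document; the argument and return types are assumptions:

```python
from abc import ABC, abstractmethod

class Sandbox(ABC):
    """Abstract sandbox interface; signatures here are assumptions."""

    @abstractmethod
    def execute_command(self, command: str) -> str: ...

    @abstractmethod
    def read_file(self, path: str) -> str: ...

    @abstractmethod
    def write_file(self, path: str, content: str) -> None: ...

    @abstractmethod
    def list_dir(self, path: str) -> list[str]: ...
```

A concrete provider (e.g., the local singleton or the Docker-backed one) would subclass this and implement all four methods; instantiating `Sandbox` directly raises `TypeError`.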
### Model Factory (src/models/)

- `create_chat_model()` instantiates an LLM from config using reflection
- Supports a `thinking_enabled` flag with per-model `when_thinking_enabled` overrides
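As an illustration, a model entry with thinking overrides might look like the fragment below; apart from `use`, `thinking_enabled`, and `when_thinking_enabled`, every key and value is an assumption:

```yaml
models:
  - name: default                # assumed key
    use: langchain_openai:ChatOpenAI   # assumed class path
    api_key: $OPENAI_API_KEY
    thinking_enabled: true
    when_thinking_enabled:       # per-model overrides applied when thinking is on
      temperature: 1             # assumed override key
```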
### Tool System (src/tools/)

- Tools are defined in config with a `use` path (e.g., `src.sandbox.tools:bash_tool`)
- `get_available_tools()` resolves tool paths via reflection
- Community tools live in `src/community/`: Jina AI (web fetch), Tavily (web search)
### Reflection System (src/reflection/)

- `resolve_variable()` imports a module and returns a variable (e.g., `module:variable`)
- `resolve_class()` imports a class and validates it against a base class
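A sketch of what `module:variable` resolution typically looks like; only the name `resolve_variable` and the path format come from the text, the body is an assumption:

```python
import importlib

def resolve_variable(path: str):
    """Resolve a "package.module:variable" path to the named attribute."""
    module_name, _, variable_name = path.partition(":")
    module = importlib.import_module(module_name)
    return getattr(module, variable_name)
```

For example, `resolve_variable("os.path:join")` returns the `os.path.join` function.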
## Config Schema

Models, tools, and sandbox providers are configured in `config.yaml`:

- `models[]`: LLM configurations with a `use` class path
- `tools[]`: tool configurations with a `use` variable path and a `group`
- `sandbox.use`: sandbox provider class path
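Putting the three sections together, a `config.yaml` might look like the sketch below; the top-level keys come from the schema above, while the concrete class paths and the group name are illustrative assumptions:

```yaml
models:
  - use: langchain_openai:ChatOpenAI     # class path (assumed value)
    api_key: $OPENAI_API_KEY

tools:
  - use: src.sandbox.tools:bash_tool     # variable path, from the example above
    group: sandbox                       # group name is an assumption

sandbox:
  use: src.sandbox:LocalSandboxProvider  # provider class path (assumed location)
```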
## Code Style

- Uses `ruff` for linting and formatting
- Line length: 240 characters
- Python 3.12+ with type hints
- Double quotes, space indentation