docs: make README easier to follow and update related docs (#884)

Zhiyunyao
2026-02-21 07:48:20 +08:00
committed by GitHub
parent 0d7c0826f0
commit 75226b2fe6
8 changed files with 130 additions and 99 deletions

View File

@@ -25,10 +25,6 @@ Docker provides a consistent, isolated environment with all dependencies pre-con
# Set your API keys
export OPENAI_API_KEY="your-key-here"
# or edit config.yaml directly
# Optional: Enable MCP servers and skills
cp extensions_config.example.json extensions_config.json
# Edit extensions_config.json to enable desired MCP servers and skills
```
2. **Initialize Docker environment** (first time only):
@@ -58,17 +54,18 @@ Docker provides a consistent, isolated environment with all dependencies pre-con
#### Docker Commands
```bash
# View all logs
make docker-logs
# Restart services
make docker-restart
# Stop services
# Build the custom k3s image (with pre-cached sandbox image)
make docker-init
# Start all services in Docker (localhost:2026)
make docker-start
# Stop Docker development services
make docker-stop
# Get help
make docker-help
# View Docker development logs
make docker-logs
# View Docker frontend logs
make docker-logs-frontend
# View Docker gateway logs
make docker-logs-gateway
```
#### Docker Architecture

View File

@@ -23,7 +23,6 @@ config:
	@test -f config.yaml || cp config.example.yaml config.yaml
	@test -f .env || cp .env.example .env
	@test -f frontend/.env || cp frontend/.env.example frontend/.env
	@test -f extensions_config.json || cp extensions_config.example.json extensions_config.json
# Check required tools
check:

View File

@@ -18,7 +18,7 @@ Learn more and see **real demos** on our official website.
## Table of Contents
- [Quick Start](#quick-start)
- [Sandbox Configuration](#sandbox-configuration)
- [Sandbox Mode](#sandbox-mode)
- [From Deep Research to Super Agent Harness](#from-deep-research-to-super-agent-harness)
- [Core Features](#core-features)
- [Skills & Tools](#skills--tools)
@@ -37,51 +37,64 @@ Learn more and see **real demos** on our official website.
### Configuration
1. Clone the git repo of DeerFlow:
1. **Clone the DeerFlow repository**
```bash
git clone https://github.com/bytedance/deer-flow.git && cd deer-flow
git clone https://github.com/bytedance/deer-flow.git
cd deer-flow
```
2. Create local config files by copying the example files:
2. **Generate local configuration files**
From the project root directory (`deer-flow/`), run:
```bash
make config
```
3. Update the configs:
This command creates local configuration files based on the provided example templates.
- **Required**
- `config.yaml`: configure your preferred models.
- `.env`: configure your API keys.
- **Optional**
- `frontend/.env`: configure backend API URLs.
- `extensions_config.json`: configure desired MCP servers and skills.
3. **Configure your preferred model(s)**
#### Sandbox Configuration
Edit `config.yaml` and define at least one model:
DeerFlow supports multiple sandbox execution modes. Configure your preferred mode in `config.yaml`:
```yaml
models:
  - name: gpt-4                        # Internal identifier
    display_name: GPT-4                # Human-readable name
    use: langchain_openai:ChatOpenAI   # LangChain class path
    model: gpt-4                       # Model identifier for API
    api_key: $OPENAI_API_KEY           # API key (recommended: use env var)
    max_tokens: 4096                   # Maximum tokens per request
    temperature: 0.7                   # Sampling temperature
```
**Local Execution** (runs sandbox code directly on the host machine):
```yaml
sandbox:
  use: src.sandbox.local:LocalSandboxProvider # Local execution
```
4. **Set API keys for your configured model(s)**
**Docker Execution** (runs sandbox code in isolated Docker containers):
```yaml
sandbox:
  use: src.community.aio_sandbox:AioSandboxProvider # Docker-based sandbox
```
Choose one of the following methods:
**Docker Execution with Kubernetes** (runs sandbox code in Kubernetes pods via provisioner service):
- Option A: Edit the `.env` file in the project root (Recommended)
This mode runs each sandbox in an isolated Kubernetes Pod on your **host machine's cluster**. It requires Docker Desktop's built-in Kubernetes, OrbStack, or a similar local Kubernetes setup.
```yaml
sandbox:
  use: src.community.aio_sandbox:AioSandboxProvider
  provisioner_url: http://provisioner:8002
```
```bash
TAVILY_API_KEY=your-tavily-api-key
OPENAI_API_KEY=your-openai-api-key
# Add other provider keys as needed
```
See [Provisioner Setup Guide](docker/provisioner/README.md) for detailed configuration, prerequisites, and troubleshooting.
- Option B: Export environment variables in your shell
```bash
export OPENAI_API_KEY=your-openai-api-key
```
- Option C: Edit `config.yaml` directly (Not recommended for production)
```yaml
models:
  - name: gpt-4
    api_key: your-actual-api-key-here # Replace placeholder
```
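The `models:` entries above use a `module:Class` convention in the `use:` field and allow `$VAR` references for API keys. As a rough illustration of how such an entry could be interpreted, here is a minimal Python sketch (a hypothetical helper for illustration, not DeerFlow's actual loader):

```python
import os

def resolve_model_entry(entry: dict) -> dict:
    """Sketch of interpreting one `models:` entry (hypothetical; the real
    loader may differ). Splits the `use` path into module and class name
    and expands a `$VAR` api_key reference from the environment."""
    module_name, _, class_name = entry["use"].partition(":")
    api_key = entry.get("api_key", "")
    if api_key.startswith("$"):  # env-var reference, e.g. $OPENAI_API_KEY
        api_key = os.environ.get(api_key[1:], "")
    return {"module": module_name, "class": class_name, "api_key": api_key}

os.environ["OPENAI_API_KEY"] = "sk-demo"  # stand-in value for this demo
resolved = resolve_model_entry({
    "name": "gpt-4",
    "use": "langchain_openai:ChatOpenAI",
    "api_key": "$OPENAI_API_KEY",
})
print(resolved)  # → {'module': 'langchain_openai', 'class': 'ChatOpenAI', 'api_key': 'sk-demo'}
```

This is also why Options A and B (environment variables) work without putting secrets in `config.yaml` itself.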
### Running the Application
@@ -121,6 +134,21 @@ If you prefer running services locally:
4. **Access**: http://localhost:2026
### Advanced
#### Sandbox Mode
DeerFlow supports multiple sandbox execution modes:
- **Local Execution** (runs sandbox code directly on the host machine)
- **Docker Execution** (runs sandbox code in isolated Docker containers)
- **Docker Execution with Kubernetes** (runs sandbox code in Kubernetes pods via provisioner service)
See the [Sandbox Configuration Guide](backend/docs/CONFIGURATION.md#sandbox) to configure your preferred mode.
#### MCP Server
DeerFlow supports configurable MCP servers and skills to extend its capabilities.
See the [MCP Server Guide](backend/docs/MCP_SERVER.md) for detailed instructions.
## From Deep Research to Super Agent Harness
DeerFlow started as a Deep Research framework — and the community ran with it. Since launch, developers have pushed it far beyond research: building data pipelines, generating slide decks, spinning up dashboards, automating content workflows. Things we never anticipated.

View File

@@ -38,7 +38,6 @@ Thank you for your interest in contributing to DeerFlow! This document provides
```bash
# From project root
cp config.example.yaml config.yaml
cp extensions_config.example.json extensions_config.json
# Install backend dependencies
cd backend

View File

@@ -143,7 +143,6 @@ cd deer-flow
# Copy configuration files
cp config.example.yaml config.yaml
cp extensions_config.example.json extensions_config.json
# Install backend dependencies
cd backend

View File

@@ -2,35 +2,6 @@
This guide explains how to configure DeerFlow for your environment.
## Quick Start
1. **Copy the example configuration** (from project root):
```bash
# From project root directory (deer-flow/)
cp config.example.yaml config.yaml
```
2. **Set your API keys**:
Option A: Use environment variables (recommended):
```bash
export OPENAI_API_KEY="your-api-key-here"
export ANTHROPIC_API_KEY="your-api-key-here"
# Add other keys as needed
```
Option B: Edit `config.yaml` directly (not recommended for production):
```yaml
models:
  - name: gpt-4
    api_key: your-actual-api-key-here # Replace placeholder
```
3. **Start the application**:
```bash
make dev
```
## Configuration Sections
### Models
@@ -103,6 +74,32 @@ tools:
### Sandbox
DeerFlow supports multiple sandbox execution modes. Configure your preferred mode in `config.yaml`:
**Local Execution** (runs sandbox code directly on the host machine):
```yaml
sandbox:
  use: src.sandbox.local:LocalSandboxProvider # Local execution
```
**Docker Execution** (runs sandbox code in isolated Docker containers):
```yaml
sandbox:
  use: src.community.aio_sandbox:AioSandboxProvider # Docker-based sandbox
```
**Docker Execution with Kubernetes** (runs sandbox code in Kubernetes pods via provisioner service):
This mode runs each sandbox in an isolated Kubernetes Pod on your **host machine's cluster**. It requires Docker Desktop's built-in Kubernetes, OrbStack, or a similar local Kubernetes setup.
```yaml
sandbox:
  use: src.community.aio_sandbox:AioSandboxProvider
  provisioner_url: http://provisioner:8002
```
See [Provisioner Setup Guide](docker/provisioner/README.md) for detailed configuration, prerequisites, and troubleshooting.
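The `use:` values above are `module:Class` paths. A minimal sketch of how such a path could be turned into a class (hypothetical; DeerFlow's actual resolution logic may differ) looks like:

```python
import importlib

def load_provider(use_path: str):
    """Resolve a 'package.module:ClassName' path into a class object."""
    module_name, _, class_name = use_path.partition(":")
    module = importlib.import_module(module_name)
    return getattr(module, class_name)

# Demo with a standard-library path so the sketch runs anywhere:
cls = load_provider("collections:OrderedDict")
print(cls.__name__)  # → OrderedDict
```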

View File

@@ -0,0 +1,34 @@
# MCP (Model Context Protocol) Configuration
DeerFlow supports configurable MCP servers and skills to extend its capabilities; they are loaded from a dedicated `extensions_config.json` file in the project root directory.
## Setup
1. Copy `extensions_config.example.json` to `extensions_config.json` in the project root directory.
```bash
# Copy example configuration
cp extensions_config.example.json extensions_config.json
```
2. Enable the desired MCP servers or skills by setting `"enabled": true`.
3. Configure each server's command, arguments, and environment variables as needed.
4. Restart the application to load and register MCP tools.
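To make step 2 concrete, here is a small Python sketch that reads such a file and keeps only the enabled entries. The field names below (`mcpServers`, `enabled`, `command`, `args`) are illustrative assumptions; the authoritative schema is whatever `extensions_config.example.json` defines:

```python
import json

# Hypothetical structure for illustration only; see
# extensions_config.example.json in the repo for the real schema.
raw = """
{
  "mcpServers": {
    "filesystem": {"enabled": true, "command": "npx",
                   "args": ["-y", "@modelcontextprotocol/server-filesystem", "/tmp"]},
    "github": {"enabled": false, "command": "npx",
               "args": ["-y", "@modelcontextprotocol/server-github"]}
  }
}
"""

def enabled_servers(config: dict) -> dict:
    """Return only the MCP servers whose "enabled" flag is true."""
    return {name: spec
            for name, spec in config.get("mcpServers", {}).items()
            if spec.get("enabled")}

active = enabled_servers(json.loads(raw))
print(sorted(active))  # → ['filesystem']
```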
## How It Works
MCP servers expose tools that are automatically discovered and integrated into DeerFlow's agent system at runtime. Once enabled, these tools become available to agents without additional code changes.
## Example Capabilities
MCP servers can provide access to:
- **File systems**
- **Databases** (e.g., PostgreSQL)
- **External APIs** (e.g., GitHub, Brave Search)
- **Browser automation** (e.g., Puppeteer)
- **Custom MCP server implementations**
## Learn More
For detailed documentation about the Model Context Protocol, visit:
https://modelcontextprotocol.io

View File

@@ -267,28 +267,6 @@ summarization:
# The prompt should guide the model to extract important context
summary_prompt: null
# ============================================================================
# MCP (Model Context Protocol) Configuration
# ============================================================================
# Configure MCP servers to provide additional tools and capabilities
# MCP configuration is loaded from a separate `mcp_config.json` file
#
# Setup:
# 1. Copy `mcp_config.example.json` to `mcp_config.json` in the project root
# 2. Enable desired MCP servers by setting `enabled: true`
# 3. Configure server commands, arguments, and environment variables
# 4. Restart the application to load MCP tools
#
# MCP servers provide tools that are automatically discovered and integrated
# with DeerFlow's agent system. Examples include:
# - File system access
# - Database connections (PostgreSQL, etc.)
# - External APIs (GitHub, Brave Search, etc.)
# - Browser automation (Puppeteer)
# - Custom MCP server implementations
#
# For more information, see: https://modelcontextprotocol.io
# ============================================================================
# Memory Configuration
# ============================================================================