mirror of
https://gitee.com/wanwujie/deer-flow
synced 2026-04-03 06:12:14 +08:00
docs: add comprehensive debugging guide and improve troubleshooting documentation (#688)
This commit addresses issue #682 by providing clear documentation on how to view complete model output and debug DeerFlow workflows.

Changes:
- Add new DEBUGGING.md guide with detailed instructions for:
  - Viewing complete model output
  - Enabling debug logging
  - Configuring LangChain verbose logging
  - Setting up LangSmith tracing
  - Docker Compose debugging tips
  - Common troubleshooting scenarios
- Update .env.example with:
  - Clearer comments for the DEBUG setting
  - Documentation for the LANGCHAIN_VERBOSE and LANGCHAIN_DEBUG options
  - Improved LangSmith configuration guidance
- Enhance docs/FAQ.md with:
  - How to view complete model output
  - How to enable debug logging
  - How to troubleshoot common issues
  - Links to the new debugging guide

These documentation improvements make it easier for users to:
- Debug workflow issues
- View LLM prompts and responses
- Troubleshoot deployment problems
- Monitor performance with LangSmith

Fixes #682
12
.env.example
@@ -1,4 +1,6 @@
# Application Settings
# Set to True to enable debug-level logging (shows detailed LLM prompts and responses)
# Recommended for development and troubleshooting
DEBUG=True
APP_ENV=development
@@ -82,12 +84,20 @@ VOLCENGINE_TTS_ACCESS_TOKEN=xxx
# VOLCENGINE_TTS_CLUSTER=volcano_tts # Optional, default is volcano_tts
# VOLCENGINE_TTS_VOICE_TYPE=BV700_V2_streaming # Optional, default is BV700_V2_streaming

# Optional, for langsmith tracing and monitoring
# Highly recommended for production debugging and performance monitoring
# Get your API key from https://smith.langchain.com/
# LANGSMITH_TRACING=true
# LANGSMITH_ENDPOINT="https://api.smith.langchain.com"
# LANGSMITH_API_KEY="xxx"
# LANGSMITH_PROJECT="xxx"

# Optional, LangChain verbose logging
# Enable these to see detailed LLM interactions in console/logs
# Useful for debugging but can be very verbose
# LANGCHAIN_VERBOSE=true
# LANGCHAIN_DEBUG=true

# [!NOTE]
# For model settings and other configurations, please refer to `docs/configuration_guide.md`
317
docs/DEBUGGING.md
Normal file
@@ -0,0 +1,317 @@
# Debugging Guide

This guide helps you debug DeerFlow workflows, view model outputs, and troubleshoot common issues.

## Table of Contents

- [Viewing Model Output](#viewing-model-output)
- [Debug Logging Configuration](#debug-logging-configuration)
- [LangChain Verbose Logging](#langchain-verbose-logging)
- [LangSmith Tracing](#langsmith-tracing)
- [Docker Compose Debugging](#docker-compose-debugging)
- [Common Issues](#common-issues)
- [Performance Debugging](#performance-debugging)
- [Additional Resources](#additional-resources)
- [Getting Help](#getting-help)
## Viewing Model Output

When you need to see the complete model output, including tool calls and internal reasoning, you have several options:

### 1. Enable Debug Logging

Set `DEBUG=True` in your `.env` file or configuration:

```bash
DEBUG=True
```

This enables debug-level logging throughout the application, showing detailed information about:

- System prompts sent to LLMs
- Model responses
- Tool calls and results
- Workflow state transitions
### 2. Enable LangChain Verbose Logging

Add these environment variables to your `.env` file for detailed LangChain output:

```bash
# Enable verbose logging for LangChain
LANGCHAIN_VERBOSE=true
LANGCHAIN_DEBUG=true
```

This will show:

- Chain execution steps
- LLM input/output for each call
- Tool invocations
- Intermediate results
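Switches like these are generally read as truthy strings from the environment. A small stdlib sketch of how such flags are commonly parsed (the `env_flag` helper is illustrative, not part of LangChain):

```python
import os


def env_flag(name: str, default: bool = False) -> bool:
    """Interpret an environment variable as a boolean switch."""
    value = os.getenv(name)
    if value is None:
        return default
    return value.strip().lower() in ("1", "true", "yes", "on")


os.environ["LANGCHAIN_VERBOSE"] = "true"
os.environ.pop("LANGCHAIN_DEBUG", None)  # ensure it is unset for the demo

print(env_flag("LANGCHAIN_VERBOSE"))  # True
print(env_flag("LANGCHAIN_DEBUG"))    # False (unset, default applies)
```

This is why values like `True`, `true`, and `1` all work in `.env` files, while an empty value usually counts as off.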
### 3. Enable LangSmith Tracing (Recommended for Production)

For advanced debugging and visualization, configure LangSmith integration:

```bash
LANGSMITH_TRACING=true
LANGSMITH_ENDPOINT="https://api.smith.langchain.com"
LANGSMITH_API_KEY="your-api-key"
LANGSMITH_PROJECT="your-project-name"
```

LangSmith provides:

- Visual trace of workflow execution
- Performance metrics
- Token usage statistics
- Error tracking
- Comparison between runs

To get started with LangSmith:

1. Sign up at [LangSmith](https://smith.langchain.com/)
2. Create a project
3. Copy your API key
4. Add the configuration to your `.env` file

## Debug Logging Configuration

### Log Levels

DeerFlow uses Python's standard logging levels:

- **DEBUG**: Detailed diagnostic information
- **INFO**: General informational messages
- **WARNING**: Warning messages
- **ERROR**: Error messages
- **CRITICAL**: Critical errors
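The levels form an ordered threshold: a logger set to a given level suppresses everything below it. A quick stdlib illustration:

```python
import logging

# Numeric values order the levels; a logger emits records at or above its own level.
assert logging.DEBUG < logging.INFO < logging.WARNING < logging.ERROR < logging.CRITICAL

logger = logging.getLogger("example")
logger.setLevel(logging.INFO)

print(logger.isEnabledFor(logging.DEBUG))  # False: DEBUG is below the INFO threshold
print(logger.isEnabledFor(logging.ERROR))  # True
```

This is why setting `DEBUG=True` (and thus a DEBUG-level root logger) is what surfaces the detailed prompt/response records.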
### Viewing Logs

**Development mode (console):**

```bash
uv run main.py
```

Logs will be printed to the console.

**Docker Compose:**

```bash
# View logs from all services
docker compose logs -f

# View logs from the backend only
docker compose logs -f backend

# View logs with timestamps
docker compose logs -f --timestamps
```
## LangChain Verbose Logging

### What It Shows

When `LANGCHAIN_VERBOSE=true` is enabled, you'll see output like:

```
> Entering new AgentExecutor chain...
Thought: I need to search for information about quantum computing
Action: web_search
Action Input: "quantum computing basics 2024"

Observation: [Search results...]

Thought: I now have enough information to answer
Final Answer: ...
```

### Configuration Options

```bash
# Basic verbose mode
LANGCHAIN_VERBOSE=true

# Full debug mode with internal details
LANGCHAIN_DEBUG=true

# Both (recommended for debugging)
LANGCHAIN_VERBOSE=true
LANGCHAIN_DEBUG=true
```
## LangSmith Tracing

### Setup

1. **Create a LangSmith account**: Visit [smith.langchain.com](https://smith.langchain.com)

2. **Get your API key**: Navigate to Settings → API Keys

3. **Configure environment variables**:

   ```bash
   LANGSMITH_TRACING=true
   LANGSMITH_ENDPOINT="https://api.smith.langchain.com"
   LANGSMITH_API_KEY="lsv2_pt_..."
   LANGSMITH_PROJECT="deerflow-debug"
   ```

4. **Restart your application**
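Before restarting, it can save a round trip to sanity-check that the required variables are actually set. A small stdlib sketch (the helper name is illustrative; the variable names follow the setup above):

```python
import os

REQUIRED_VARS = ("LANGSMITH_TRACING", "LANGSMITH_API_KEY", "LANGSMITH_PROJECT")


def missing_langsmith_vars(env=os.environ):
    """Return the LangSmith settings that are absent or empty."""
    return [name for name in REQUIRED_VARS if not env.get(name)]


# A partially configured environment, for demonstration:
print(missing_langsmith_vars({"LANGSMITH_TRACING": "true"}))
# ['LANGSMITH_API_KEY', 'LANGSMITH_PROJECT']
```

An empty result means the tracing configuration is at least present; a wrong API key will still only surface as missing traces at runtime.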
### Features

- **Visual traces**: See the entire workflow execution as a graph
- **Performance metrics**: Identify slow operations
- **Token tracking**: Monitor LLM token usage
- **Error analysis**: Quickly identify failures
- **Comparison**: Compare different runs side-by-side

### Viewing Traces

1. Run your workflow as normal
2. Visit [smith.langchain.com](https://smith.langchain.com)
3. Select your project
4. View traces in the "Traces" tab
## Docker Compose Debugging

### Update docker-compose.yml

Add debug environment variables to your `docker-compose.yml`:

```yaml
services:
  backend:
    build:
      context: .
      dockerfile: Dockerfile
    environment:
      # Debug settings
      - DEBUG=True
      - LANGCHAIN_VERBOSE=true
      - LANGCHAIN_DEBUG=true

      # LangSmith (optional)
      - LANGSMITH_TRACING=true
      - LANGSMITH_ENDPOINT=https://api.smith.langchain.com
      - LANGSMITH_API_KEY=${LANGSMITH_API_KEY}
      - LANGSMITH_PROJECT=${LANGSMITH_PROJECT}
```
### View Detailed Logs

```bash
# Start with verbose output
docker compose up

# Or start detached and follow the logs
docker compose up -d
docker compose logs -f backend
```

### Common Docker Commands

```bash
# View the last 100 lines of logs
docker compose logs --tail=100 backend

# View logs with timestamps
docker compose logs -f --timestamps

# Check container status
docker compose ps

# Restart services
docker compose restart backend
```
## Common Issues

### Issue: "Log information doesn't show complete content"

**Solution**: Enable debug logging as described above:

```bash
DEBUG=True
LANGCHAIN_VERBOSE=true
LANGCHAIN_DEBUG=true
```

### Issue: "Can't see system prompts"

**Solution**: Debug logging will show system prompts. Look for log entries like:

```
[INFO] System Prompt:
You are DeerFlow, a friendly AI assistant...
```

### Issue: "Want to see token usage"

**Solution**: Enable LangSmith tracing or check model responses in verbose mode:

```bash
LANGCHAIN_VERBOSE=true
```

### Issue: "Need to debug specific nodes"

**Solution**: Add custom logging in specific nodes. For example, in `src/graph/nodes.py`:

```python
import logging

logger = logging.getLogger(__name__)

def my_node(state, config):
    logger.debug(f"Node input: {state}")
    result = ...  # your node logic goes here
    logger.debug(f"Node output: {result}")
    return result
```
### Issue: "Logs are too verbose"

**Solution**: Adjust the log level for specific modules:

```python
# In your code
logging.getLogger('langchain').setLevel(logging.WARNING)
logging.getLogger('openai').setLevel(logging.WARNING)
```
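The same per-module tuning can also be declared in one place with `logging.config.dictConfig`, which is easier to keep consistent than scattered `setLevel` calls. A sketch (the module names shown are the same ones quieted above):

```python
import logging.config

# One declarative config: a DEBUG root, with chatty libraries dialed down.
logging.config.dictConfig({
    "version": 1,
    "disable_existing_loggers": False,
    "root": {"level": "DEBUG"},
    "loggers": {
        "langchain": {"level": "WARNING"},
        "openai": {"level": "WARNING"},
    },
})

print(logging.getLogger("langchain").getEffectiveLevel() == logging.WARNING)  # True
```

Child loggers such as `langchain.chains` inherit the `langchain` level automatically, so one entry covers the whole package.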
## Performance Debugging

### Measure Execution Time

Enable LangSmith or add timing logs:

```python
import logging
import time

logger = logging.getLogger(__name__)

start = time.time()
result = some_function()  # the operation you want to measure
logger.info(f"Execution time: {time.time() - start:.2f}s")
```
### Monitor Token Usage

With LangSmith enabled, token usage is automatically tracked. Alternatively, check model responses:

```bash
LANGCHAIN_VERBOSE=true
```

Look for output like:

```
Tokens Used: 150
Prompt Tokens: 100
Completion Tokens: 50
```
## Additional Resources

- [LangSmith Documentation](https://docs.smith.langchain.com/)
- [LangGraph Debugging](https://langchain-ai.github.io/langgraph/how-tos/debugging/)
- [Configuration Guide](./configuration_guide.md)
- [API Documentation](./API.md)

## Getting Help

If you're still experiencing issues:

1. Check existing [GitHub Issues](https://github.com/bytedance/deer-flow/issues)
2. Enable debug logging and LangSmith tracing
3. Collect relevant log output
4. Create a new issue with:
   - Description of the problem
   - Steps to reproduce
   - Log output
   - Configuration (without sensitive data)
81
docs/FAQ.md
@@ -3,8 +3,10 @@

## Table of Contents

- [Where's the name DeerFlow come from?](#wheres-the-name-deerflow-come-from)
- [Which models does DeerFlow support?](#which-models-does-deerflow-support)
- [How do I view complete model output?](#how-do-i-view-complete-model-output)
- [How do I enable debug logging?](#how-do-i-enable-debug-logging)
- [How do I troubleshoot issues?](#how-do-i-troubleshoot-issues)

## Where's the name DeerFlow come from?
@@ -13,3 +15,80 @@ DeerFlow is short for **D**eep **E**xploration and **E**fficient **R**esearch **

## Which models does DeerFlow support?

Please refer to the [Configuration Guide](configuration_guide.md) for more details.

## How do I view complete model output?

If you want to see the complete model output, including system prompts, tool calls, and LLM responses:

1. **Enable debug logging** by setting `DEBUG=True` in your `.env` file

2. **Enable LangChain verbose logging** by adding these to your `.env`:

   ```bash
   LANGCHAIN_VERBOSE=true
   LANGCHAIN_DEBUG=true
   ```

3. **Use LangSmith tracing** for visual debugging (recommended for production):

   ```bash
   LANGSMITH_TRACING=true
   LANGSMITH_API_KEY="your-api-key"
   LANGSMITH_PROJECT="your-project-name"
   ```

For detailed instructions, see the [Debugging Guide](DEBUGGING.md).
## How do I enable debug logging?

To enable debug logging:

1. Open your `.env` file
2. Set `DEBUG=True`
3. Restart your application

For Docker Compose:

```bash
docker compose restart
```

For development:

```bash
uv run main.py
```

You'll now see detailed logs including:

- System prompts sent to LLMs
- Model responses
- Tool execution details
- Workflow state transitions

See the [Debugging Guide](DEBUGGING.md) for more options.
## How do I troubleshoot issues?

When encountering issues:

1. **Check the logs**: Enable debug logging as described above
2. **Review configuration**: Ensure your `.env` and `conf.yaml` are correct
3. **Check existing issues**: Search [GitHub Issues](https://github.com/bytedance/deer-flow/issues) for similar problems
4. **Enable verbose logging**: Use `LANGCHAIN_VERBOSE=true` for detailed output
5. **Use LangSmith**: For visual debugging, enable LangSmith tracing

For Docker-specific issues:

```bash
# View logs
docker compose logs -f

# Check container status
docker compose ps

# Restart services
docker compose restart
```

For more detailed troubleshooting steps, see the [Debugging Guide](DEBUGGING.md).