mirror of https://gitee.com/wanwujie/deer-flow, synced 2026-04-03 06:12:14 +08:00

docs: add document of configurations and FAQs

docs/FAQ.md (new file, 15 lines)
# FAQ

## Table of Contents

- [Where does the name DeerFlow come from?](#where-does-the-name-deerflow-come-from)
- [Which models does DeerFlow support?](#which-models-does-deerflow-support)

## Where does the name DeerFlow come from?

DeerFlow is short for **D**eep **E**xploration and **E**fficient **R**esearch **Flow**. It is named after the deer, a symbol of gentleness and elegance. We hope DeerFlow brings a gentle and elegant deep research experience to you.

## Which models does DeerFlow support?

Please refer to the [Configuration Guide](configuration_guide.md) for more details.
docs/configuration_guide.md (new file, 87 lines)
# Configuration Guide

## Which models does DeerFlow support?

DeerFlow currently supports only non-reasoning models. Reasoning models such as OpenAI's o1/o3 or DeepSeek's R1 are not supported yet, but support for them is planned.
### Supported Models

`doubao-1.5-pro-32k-250115`, `gpt-4o`, `qwen-max-latest`, `gemini-2.0-flash`, `deepseek-v3`, and, in principle, any other non-reasoning chat model that implements the OpenAI API specification.
### How to switch models?

You can switch the model in use by modifying the `conf.yaml` file in the root directory of the project, using configuration in the [litellm format](https://docs.litellm.ai/docs/providers/openai_compatible).
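Once `conf.yaml` has been parsed into a dictionary (e.g. with PyYAML), a quick sanity check of the `BASIC_MODEL` section can catch missing fields early. The following is an illustrative sketch, not DeerFlow's actual loader; the field names follow the litellm-format examples in this guide:

```python
# Hypothetical sketch: validating a parsed BASIC_MODEL config section.
# The validation rules are illustrative, not DeerFlow's actual behavior.
REQUIRED_FIELDS = ("model", "api_key")

def validate_basic_model(conf: dict) -> list:
    """Return a list of problems found in a parsed conf.yaml dict."""
    problems = []
    basic = conf.get("BASIC_MODEL")
    if basic is None:
        return ["missing BASIC_MODEL section"]
    for field in REQUIRED_FIELDS:
        if not basic.get(field):
            problems.append("BASIC_MODEL.%s is missing or empty" % field)
    api_base = basic.get("api_base", "")
    if api_base and not api_base.startswith(("http://", "https://")):
        problems.append("BASIC_MODEL.api_base should be an http(s) URL")
    return problems
```

For example, a config missing `api_key` would be reported before any request is attempted.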
---
### How to use OpenAI-Compatible models?

DeerFlow supports integration with OpenAI-Compatible models, i.e. models that implement the OpenAI API specification. This includes various open-source and commercial models that provide API endpoints compatible with the OpenAI format. You can refer to [litellm OpenAI-Compatible](https://docs.litellm.ai/docs/providers/openai_compatible) for detailed documentation.

The following is a configuration example of `conf.yaml` for using OpenAI-Compatible models:
```yaml
# Keep only ONE BASIC_MODEL section in your conf.yaml;
# the blocks below are mutually exclusive examples.

# An example of Doubao models served by VolcEngine
BASIC_MODEL:
  api_base: "https://ark.cn-beijing.volces.com/api/v3"
  model: "doubao/doubao-1.5-pro-32k-250115"
  api_key: YOUR_API_KEY

# An example of Aliyun models
BASIC_MODEL:
  api_base: "https://dashscope.aliyuncs.com/compatible-mode/v1"
  model: "openai/qwen-max-latest"
  api_key: YOUR_API_KEY

# An example of DeepSeek official models
BASIC_MODEL:
  api_base: "https://api.deepseek.com"
  model: "openai/deepseek-chat"
  api_key: YOUR_API_KEY

# An example of Google Gemini models using the OpenAI-Compatible interface
BASIC_MODEL:
  api_base: "https://generativelanguage.googleapis.com/v1beta/openai/"
  model: "gemini-2.0-flash"
  api_key: YOUR_API_KEY
```
### How to use Ollama models?

DeerFlow supports the integration of Ollama models. You can refer to [litellm Ollama](https://docs.litellm.ai/docs/providers/ollama). <br>
The following is a configuration example of `conf.yaml` for using Ollama models:
```yaml
BASIC_MODEL:
  model: "ollama/ollama-model-name"
  api_base: "http://localhost:11434" # Local service address of Ollama, which can be started/viewed via `ollama serve`
```
### How to use OpenRouter models?

DeerFlow supports the integration of OpenRouter models. You can refer to [litellm OpenRouter](https://docs.litellm.ai/docs/providers/openrouter). To use OpenRouter models, you need to:

1. Obtain the `OPENROUTER_API_KEY` from [OpenRouter](https://openrouter.ai/) and set it in the environment variables.
2. Add the `openrouter/` prefix before the model name.
3. Configure the correct OpenRouter base URL.

The following is a configuration example for using OpenRouter models:
1. Configure `OPENROUTER_API_KEY` in the environment variables (such as in the `.env` file):

   ```ini
   OPENROUTER_API_KEY=""
   ```

2. Set the model name in `conf.yaml`:

   ```yaml
   BASIC_MODEL:
     model: "openrouter/google/palm-2-chat-bison"
   ```
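The steps above can be sketched as follows; the `openrouter_model` helper is a hypothetical illustration, not part of DeerFlow:

```python
import os

# Illustrative sketch of the OpenRouter setup steps above.
# `openrouter_model` is a hypothetical helper, not DeerFlow's API.
def openrouter_model(name):
    """Add the 'openrouter/' prefix litellm expects, if not already present."""
    return name if name.startswith("openrouter/") else "openrouter/" + name

# Step 1: the key is read from the environment (e.g. loaded from .env).
api_key = os.environ.get("OPENROUTER_API_KEY", "")

# Step 2: prefix the provider/model id.
model = openrouter_model("google/palm-2-chat-bison")
```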
Note: The available models and their exact names may change over time. Please verify the currently available models and their correct identifiers in [OpenRouter's official documentation](https://openrouter.ai/docs).
### How to use Azure models?

DeerFlow supports the integration of Azure models. You can refer to [litellm Azure](https://docs.litellm.ai/docs/providers/azure). Configuration example of `conf.yaml`:
```yaml
BASIC_MODEL:
  model: "azure/gpt-4o-2024-08-06"
  api_base: $AZURE_API_BASE
  api_version: $AZURE_API_VERSION
  api_key: $AZURE_API_KEY
```
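The `$AZURE_*` values above are placeholders for environment variables. Assuming the config loader expands them the way a shell would (an assumption, not confirmed DeerFlow behavior), the expansion can be sketched with Python's standard library:

```python
import os

# Hypothetical sketch: expanding $VAR-style placeholders in config values.
# Whether DeerFlow's loader works exactly this way is an assumption.
os.environ["AZURE_API_BASE"] = "https://example.openai.azure.com"  # set for demonstration
expanded = os.path.expandvars("$AZURE_API_BASE")
```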