doc: provide a workable guideline update for ollama user (#323)

Ryan Guo
2025-06-17 08:47:54 +08:00
committed by GitHub
parent 8823ffdb6a
commit e03b12b97f


@@ -61,12 +61,13 @@ BASIC_MODEL:
 ### How to use Ollama models?
 DeerFlow supports the integration of Ollama models. You can refer to [litellm Ollama](https://docs.litellm.ai/docs/providers/ollama). <br>
-The following is a configuration example of `conf.yaml` for using Ollama models:
+The following is a configuration example of `conf.yaml` for using Ollama models (you may need to run `ollama serve` first):
 ```yaml
 BASIC_MODEL:
-  model: "ollama/ollama-model-name"
-  base_url: "http://localhost:11434" # Local service address of Ollama, which can be started/viewed via ollama serve
+  model: "model-name" # Model name that supports the completions API (important), e.g. qwen3:8b, mistral-small3.1:24b, qwen2.5:3b
+  base_url: "http://localhost:11434/v1" # Local service address of Ollama, which can be started/viewed via ollama serve
+  api_key: "whatever" # Mandatory; a fake api_key -- any random string you like :-)
 ```
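The updated config points DeerFlow at Ollama's OpenAI-compatible endpoint, which is why the `base_url` gains a `/v1` suffix and a placeholder `api_key` is required even though Ollama ignores it. The following is a minimal stdlib sketch of how a client would assemble a chat-completions request from those values; the function name `build_chat_request` is illustrative, and no request is actually sent, so it works without a running server.

```python
# Sketch: build (but do not send) an OpenAI-compatible chat request
# against a local Ollama server, using the conf.yaml values above.
import json
from urllib.request import Request

BASE_URL = "http://localhost:11434/v1"  # note the /v1 suffix for the OpenAI-compatible API
API_KEY = "whatever"                    # Ollama ignores the key, but OpenAI-style clients require one


def build_chat_request(model: str, prompt: str) -> Request:
    """Assemble a POST request for the /chat/completions endpoint."""
    payload = {
        "model": model,  # e.g. one of the models listed in the config comment
        "messages": [{"role": "user", "content": prompt}],
    }
    return Request(
        f"{BASE_URL}/chat/completions",
        data=json.dumps(payload).encode("utf-8"),
        headers={
            "Content-Type": "application/json",
            "Authorization": f"Bearer {API_KEY}",  # fake key satisfies the auth header check
        },
        method="POST",
    )


req = build_chat_request("qwen2.5:3b", "Hello")
print(req.full_url)  # http://localhost:11434/v1/chat/completions
```

With `ollama serve` running and the model pulled, passing `req` to `urllib.request.urlopen` would return a standard chat-completions JSON response.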
 ### How to use OpenRouter models?