mirror of
https://gitee.com/wanwujie/deer-flow
synced 2026-04-02 22:02:13 +08:00
doc: provide a workable guideline update for ollama user (#323)
@@ -61,12 +61,13 @@ BASIC_MODEL:
 
 ### How to use Ollama models?
 
 DeerFlow supports the integration of Ollama models. You can refer to [litellm Ollama](https://docs.litellm.ai/docs/providers/ollama). <br>
 
-The following is a configuration example of `conf.yaml` for using Ollama models:
+The following is a configuration example of `conf.yaml` for using Ollama models (you might need to run `ollama serve` first):
 
 ```yaml
 BASIC_MODEL:
-  model: "ollama/ollama-model-name"
-  base_url: "http://localhost:11434" # Local service address of Ollama, which can be started/viewed via ollama serve
+  model: "model-name" # A model name that supports the completions API (important), e.g. qwen3:8b, mistral-small3.1:24b, qwen2.5:3b
+  base_url: "http://localhost:11434/v1" # Local service address of Ollama, which can be started/viewed via `ollama serve`
+  api_key: "whatever" # Mandatory; any random string you like works as a fake api_key :-)
 ```
 
 ### How to use OpenRouter models?
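Why the `/v1` suffix on `base_url` matters: OpenAI-compatible clients append the endpoint path to whatever base URL you configure, and Ollama serves its OpenAI-compatible routes under `/v1`. A minimal sketch of that URL-joining behavior (the helper below is hypothetical, for illustration only; it is not DeerFlow or openai-client code):

```python
# Sketch: how an OpenAI-compatible client derives the request URL from base_url.
# chat_completions_url is a hypothetical helper, not a real library function.
def chat_completions_url(base_url: str) -> str:
    """Join base_url with the chat-completions endpoint path."""
    return base_url.rstrip("/") + "/chat/completions"

# With the "/v1" suffix, requests reach Ollama's OpenAI-compatible route:
print(chat_completions_url("http://localhost:11434/v1"))
# http://localhost:11434/v1/chat/completions

# Without it, the path lacks the "/v1" prefix Ollama expects:
print(chat_completions_url("http://localhost:11434"))
# http://localhost:11434/chat/completions
```

This is why the commit changes `base_url` from `http://localhost:11434` to `http://localhost:11434/v1`: the bare port alone would produce a request path that Ollama's OpenAI-compatible API does not serve.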