diff --git a/docs/configuration_guide.md b/docs/configuration_guide.md
index 1ffb74c..ee9f73d 100644
--- a/docs/configuration_guide.md
+++ b/docs/configuration_guide.md
@@ -61,12 +61,13 @@ BASIC_MODEL:
 ### How to use Ollama models?
 DeerFlow supports the integration of Ollama models. You can refer to [litellm Ollama](https://docs.litellm.ai/docs/providers/ollama).
 
-The following is a configuration example of `conf.yaml` for using Ollama models:
+The following is a configuration example of `conf.yaml` for using Ollama models (you may need to run `ollama serve` first):
 ```yaml
 BASIC_MODEL:
-  model: "ollama/ollama-model-name"
-  base_url: "http://localhost:11434" # Local service address of Ollama, which can be started/viewed via ollama serve
+  model: "model-name" # Name of a model that supports the completions API (important), e.g. qwen3:8b, mistral-small3.1:24b, qwen2.5:3b
+  base_url: "http://localhost:11434/v1" # Local Ollama service address; the service can be started/inspected with `ollama serve`
+  api_key: "whatever" # Mandatory placeholder; any non-empty string works, since Ollama does not validate it
 ```
 
 
 ### How to use OpenRouter models?
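Before applying this patch's `conf.yaml` example, the OpenAI-compatible endpoint that the new `base_url` points at can be smoke-tested directly. The sketch below is an assumption-laden illustration, not part of the patch: it presumes `ollama serve` is already running on the default port and that a model has been pulled; `qwen3:8b` is simply one of the example model names listed in the diff.

```shell
# Smoke-test Ollama's OpenAI-compatible chat completions endpoint.
# Assumes `ollama serve` is running locally and the model has been pulled
# (qwen3:8b is an example from the patch; substitute your own model name).
# The Bearer token is the same throwaway value used for api_key in conf.yaml.
curl -s http://localhost:11434/v1/chat/completions \
  -H "Content-Type: application/json" \
  -H "Authorization: Bearer whatever" \
  -d '{"model": "qwen3:8b", "messages": [{"role": "user", "content": "Say hello."}]}'
```

A successful response is a JSON object with a `choices` array; a connection error here usually means `ollama serve` is not running, which is exactly the failure the patch's parenthetical note warns about.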