fix: support local models by making thought field optional in Plan model (#601)

- Make the `thought` field optional in the `Plan` model to fix Pydantic validation errors with local models
- Add Ollama configuration example to conf.yaml.example
- Update documentation to include local model support
- Improve planner prompt with better JSON format requirements

Fixes local model integration issues where models like `qwen3:14b` would fail
validation because the `thought` field was missing from their JSON output.
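The change described above can be sketched as follows. This is a minimal illustration of an optional Pydantic field, not DeerFlow's actual `Plan` model; the `title` and `steps` fields are assumptions for the sake of a runnable example:

```python
from typing import Optional
from pydantic import BaseModel


class Plan(BaseModel):
    # Optional with a None default: local models that omit `thought`
    # in their JSON output no longer trigger a validation error.
    thought: Optional[str] = None
    # Hypothetical fields, for illustration only.
    title: str
    steps: list[str] = []


# JSON from a local model with no `thought` key now validates cleanly:
plan = Plan.model_validate({"title": "Research plan", "steps": ["step 1"]})
```

With a required `thought: str`, the same input would raise a `ValidationError`; making the field `Optional[str] = None` accepts both outputs with and without it.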

Co-authored-by: Willem Jiang <willem.jiang@gmail.com>
Author: jimmyuconn1982
Date: 2025-09-27 17:48:39 -07:00
Committed by: GitHub
Parent: 5f4eb38fdb
Commit: 24f6905c18
5 changed files with 43 additions and 2 deletions


@@ -17,6 +17,14 @@ In DeerFlow, we currently only support non-reasoning models. This means models like
`doubao-1.5-pro-32k-250115`, `gpt-4o`, `qwen-max-latest`, `qwen3-235b-a22b`, `qwen3-coder`, `gemini-2.0-flash`, `deepseek-v3`, and theoretically any other non-reasoning chat models that implement the OpenAI API specification.
### Local Model Support

DeerFlow supports local models through OpenAI-compatible APIs:

- **Ollama**: `http://localhost:11434/v1` (tested and supported for local development)

See the `conf.yaml.example` file for detailed configuration examples.
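A rough sketch of what an Ollama entry might look like; the key names here are assumptions based on typical OpenAI-compatible model configs, so treat `conf.yaml.example` in the repository as the authoritative reference:

```yaml
# Hypothetical example — actual keys may differ; see conf.yaml.example.
BASIC_MODEL:
  base_url: "http://localhost:11434/v1"  # Ollama's OpenAI-compatible endpoint
  model: "qwen3:14b"                     # any model you have pulled into Ollama
  api_key: "ollama"                      # placeholder; Ollama does not check the key
```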
> [!NOTE]
> The Deep Research process requires the model to have a **longer context window**, which is not supported by all models.
> A work-around is to set the `Max steps of a research plan` to `2` in the settings dialog located on the top right corner of the web page,