Add support for self-signed certs from model providers (#276)

* Add support for self-signed certs from model providers

* cleanup

---------

Co-authored-by: tonydoesathing <tmastromarino@cpacket.com>
Co-authored-by: Willem Jiang <willem.jiang@gmail.com>
Tony M
2025-06-25 19:17:26 -07:00
committed by GitHub
parent 9c2d4724e3
commit b7373fbe70
3 changed files with 31 additions and 1 deletion


@@ -3,13 +3,15 @@
# configurations to match your specific settings and requirements.
# - Replace `api_key` with your own credentials.
# - Replace `base_url` and `model` name if you want to use a custom model.
# - Set `verify_ssl` to `false` if your LLM server uses self-signed certificates.
# - A restart is required every time you change the `config.yaml` file.
BASIC_MODEL:
base_url: https://ark.cn-beijing.volces.com/api/v3
model: "doubao-1-5-pro-32k-250115"
api_key: xxxx
# verify_ssl: false # Uncomment this line to disable SSL certificate verification for self-signed certificates
# Reasoning model is optional.
# Uncomment the following settings if you want to use reasoning model
# for planning.


@@ -58,6 +58,21 @@ BASIC_MODEL:
api_key: YOUR_API_KEY
```
### How to use models with self-signed SSL certificates?
If your LLM server uses self-signed SSL certificates, you can disable SSL certificate verification by adding the `verify_ssl: false` parameter to your model configuration:
```yaml
BASIC_MODEL:
base_url: "https://your-llm-server.com/api/v1"
model: "your-model-name"
api_key: YOUR_API_KEY
verify_ssl: false # Disable SSL certificate verification for self-signed certificates
```
> [!WARNING]
> Disabling SSL certificate verification reduces security and should only be used in development environments or when you trust the LLM server. In production environments, it's recommended to use properly signed SSL certificates.
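For context on what `verify_ssl: false` trades away, here is a minimal standard-library sketch (independent of DeerFlow's own code, which passes `verify=False` to its HTTP client) of the difference between a verifying and a non-verifying SSL context:

```python
import ssl


def make_ssl_context(verify_ssl: bool = True) -> ssl.SSLContext:
    """Build a client SSL context; skip certificate checks when verify_ssl is False."""
    ctx = ssl.create_default_context()
    if not verify_ssl:
        # Equivalent of verify_ssl: false — no hostname check, no chain validation.
        # check_hostname must be disabled before verify_mode can be set to CERT_NONE.
        ctx.check_hostname = False
        ctx.verify_mode = ssl.CERT_NONE
    return ctx
```

With verification off, the TLS connection is still encrypted, but the client no longer confirms it is talking to the server named in the URL, which is why this belongs in development environments only.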
### How to use Ollama models?
DeerFlow supports integration with Ollama models; see [litellm Ollama](https://docs.litellm.ai/docs/providers/ollama).


@@ -4,6 +4,8 @@
from pathlib import Path
from typing import Any, Dict
import os
import ssl
import httpx
from langchain_openai import ChatOpenAI
from langchain_deepseek import ChatDeepSeek
@@ -71,6 +73,16 @@ def _create_llm_use_conf(
if llm_type == "reasoning":
merged_conf["api_base"] = merged_conf.pop("base_url", None)
# Handle SSL verification settings
verify_ssl = merged_conf.pop("verify_ssl", True)
# Create custom HTTP client if SSL verification is disabled
if not verify_ssl:
http_client = httpx.Client(verify=False)
http_async_client = httpx.AsyncClient(verify=False)
merged_conf["http_client"] = http_client
merged_conf["http_async_client"] = http_async_client
return (
ChatOpenAI(**merged_conf)
if llm_type != "reasoning"
@@ -78,6 +90,7 @@ def _create_llm_use_conf(
)
def get_llm_by_type(
llm_type: LLMType,
) -> ChatOpenAI:
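The pop-with-default pattern in the diff above keeps `verify_ssl` out of the keyword arguments forwarded to the model client constructor, which would otherwise reject the unknown key. A minimal sketch of that behavior (a hypothetical helper name, with the HTTP-client construction left out):

```python
def split_ssl_setting(conf: dict) -> tuple[dict, bool]:
    """Remove verify_ssl from the merged config (defaulting to True) so the
    remaining keys can be passed straight to the client constructor."""
    conf = dict(conf)  # copy so the caller's config is not mutated
    verify_ssl = conf.pop("verify_ssl", True)
    return conf, verify_ssl
```

When `verify_ssl` is `False`, the real code then builds `httpx.Client(verify=False)` and `httpx.AsyncClient(verify=False)` and injects them via `http_client` / `http_async_client`, as shown in the diff.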