mirror of
https://gitee.com/wanwujie/deer-flow
synced 2026-04-20 12:54:45 +08:00
refactor: extract shared skill installer and upload manager to harness (#1202)
* refactor: extract shared skill installer and upload manager to harness

  Move duplicated business logic from Gateway routers and Client into shared harness modules, eliminating code duplication.

  New shared modules:
  - deerflow.skills.installer: 6 functions (zip security, extraction, install)
  - deerflow.uploads.manager: 7 functions (normalize, deduplicate, validate, list, delete, get_uploads_dir, ensure_uploads_dir)

  Key improvements:
  - SkillAlreadyExistsError replaces stringly-typed 409 status routing
  - normalize_filename rejects backslash-containing filenames
  - Read paths (list/delete) no longer mkdir via get_uploads_dir
  - Write paths use ensure_uploads_dir for explicit directory creation
  - list_files_in_dir does stat inside scandir context (no re-stat)
  - install_skill_from_archive uses single is_file() check (one syscall)
  - Fix agent config key not reset on update_mcp_config/update_skill

  Tests: 42 new (22 installer + 20 upload manager) + client hardening

* refactor: centralize upload URL construction and clean up installer

  - Extract upload_virtual_path(), upload_artifact_url(), enrich_file_listing() into shared manager.py, eliminating 6 duplicated URL constructions across Gateway router and Client
  - Derive all upload URLs from VIRTUAL_PATH_PREFIX constant instead of hardcoded "mnt/user-data/uploads" strings
  - Eliminate TOCTOU pre-checks and double file read in installer — single ZipFile() open with exception handling replaces is_file() + is_zipfile() + ZipFile() sequence
  - Add missing re-exports: ensure_uploads_dir in uploads/__init__.py, SkillAlreadyExistsError in skills/__init__.py
  - Remove redundant .lower() on already-lowercase CONVERTIBLE_EXTENSIONS
  - Hoist sandbox_uploads_dir(thread_id) before loop in uploads router

* fix: add input validation for thread_id and filename length

  - Reject thread_id containing unsafe filesystem characters (only allow alphanumeric, hyphens, underscores, dots) — prevents 500 on inputs like <script> or shell metacharacters
  - Reject filenames longer than 255 bytes (OS limit) in normalize_filename
  - Gateway upload router maps ValueError to 400 for invalid thread_id

* fix: address PR review — symlink safety, input validation coverage, error ordering

  - list_files_in_dir: use follow_symlinks=False to prevent symlink metadata leakage; check is_dir() instead of exists() for non-directory paths
  - install_skill_from_archive: restore is_file() pre-check before extension validation so error messages match the documented exception contract
  - validate_thread_id: move from ensure_uploads_dir to get_uploads_dir so all entry points (upload/list/delete) are protected
  - delete_uploaded_file: catch ValueError from thread_id validation (was 500)
  - requires_llm marker: also skip when OPENAI_API_KEY is unset
  - e2e fixture: update TitleMiddleware exclusion comment (kept filtering — middleware triggers extra LLM calls that add non-determinism to tests)

* chore: revert uv.lock to main — no dependency changes in this PR

* fix: use monkeypatch for global config in e2e fixture to prevent test pollution

  The e2e_env fixture was calling set_title_config() and set_summarization_config() directly, which mutated global singletons without automatic cleanup. When pytest ran test_client_e2e.py before test_title_middleware_core_logic.py, the leaked enabled=False caused 5 title tests to fail in CI. Switched to monkeypatch.setattr on the module-level private variables so pytest restores the originals after each test.

* fix: address code review — URL encoding, API consistency, test isolation

  - upload_artifact_url: percent-encode filename to handle spaces/#/?
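The validation rules described above (a character whitelist for thread_id, backslash rejection, and a 255-byte filename cap) can be sketched roughly as follows. This is an illustrative reconstruction, not the actual `deerflow.uploads.manager` code — the real module distinguishes a dedicated `PathTraversalError` for traversal attempts, while this sketch collapses everything to `ValueError`:

```python
import re

# Assumed whitelist: alphanumerics, hyphens, underscores, dots (per the commit message).
_THREAD_ID_RE = re.compile(r"^[A-Za-z0-9._-]+$")


def validate_thread_id(thread_id: str) -> str:
    """Reject thread ids containing unsafe filesystem characters."""
    if not thread_id or not _THREAD_ID_RE.fullmatch(thread_id):
        raise ValueError(f"invalid thread_id: {thread_id!r}")
    return thread_id


def normalize_filename(filename: str) -> str:
    """Reject path separators, reserved names, and over-long names."""
    if "\\" in filename or "/" in filename:
        raise ValueError(f"path separators not allowed: {filename!r}")
    name = filename.strip()
    if not name or name in {".", ".."}:
        raise ValueError(f"invalid filename: {filename!r}")
    # 255 bytes (not characters): the common per-name limit on Linux filesystems.
    if len(name.encode("utf-8")) > 255:
        raise ValueError("filename exceeds 255-byte OS limit")
    return name
```

Rejecting rather than silently rewriting bad input is what lets the Gateway map `ValueError` to a 400 instead of surfacing a 500.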
  - deduplicate_filename: mutate seen set in place (caller no longer needs manual .add() — less error-prone API)
  - list_files_in_dir: document that size is int, enrich stringifies
  - e2e fixture: monkeypatch _app_config instead of set_app_config() to prevent global singleton pollution (same pattern as title/summarization fix)
  - _make_e2e_config: read LLM connection details from env vars so external contributors can override defaults
  - Update tests to match new deduplicate_filename contract

* docs: rewrite RFC in English and add alternatives/breaking changes sections

* fix: address code review feedback on PR #1202

  - Rename deduplicate_filename to claim_unique_filename to make the in-place set mutation explicit in the function name
  - Replace PermissionError with PathTraversalError(ValueError) for path traversal detection — malformed input is 400, not 403

* fix: set _app_config_is_custom in e2e test fixture to prevent config.yaml lookup in CI

---------

Co-authored-by: greatmengqi <chenmengqi.0376@bytedance.com>
Co-authored-by: Willem Jiang <willem.jiang@gmail.com>
Co-authored-by: DanielWalnut <45447813+hetaoBackend@users.noreply.github.com>
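The claim_unique_filename rename is about making the side effect visible at the call site: the function both picks a collision-free name and records it in the caller's `seen` set, so the caller no longer pairs every call with a manual `.add()`. A minimal sketch of that contract — the actual suffixing scheme in `deerflow.uploads.manager` may differ:

```python
def claim_unique_filename(filename: str, seen: set) -> str:
    """Return a collision-free name and record it in `seen` (mutated in place)."""
    if filename not in seen:
        seen.add(filename)
        return filename
    stem, dot, ext = filename.rpartition(".")
    if not dot:  # no extension, e.g. "notes"
        stem, ext = filename, ""
    counter = 1
    while True:
        # Assumed suffix style: "name (1).ext" — purely illustrative.
        candidate = f"{stem} ({counter}).{ext}" if dot else f"{stem} ({counter})"
        if candidate not in seen:
            seen.add(candidate)
            return candidate
        counter += 1
```

Because the function claims the name it returns, two calls with the same input can never hand back the same result, which is the property the updated tests exercise.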
This commit is contained in:
@@ -9,7 +9,7 @@ from pathlib import Path
from unittest.mock import MagicMock, patch

import pytest
from langchain_core.messages import AIMessage, HumanMessage, ToolMessage # noqa: F401
from langchain_core.messages import AIMessage, HumanMessage, SystemMessage, ToolMessage # noqa: F401

from app.gateway.routers.mcp import McpConfigResponse
from app.gateway.routers.memory import MemoryConfigResponse, MemoryStatusResponse
@@ -17,6 +17,8 @@ from app.gateway.routers.models import ModelResponse, ModelsListResponse
from app.gateway.routers.skills import SkillInstallResponse, SkillResponse, SkillsListResponse
from app.gateway.routers.uploads import UploadResponse
from deerflow.client import DeerFlowClient
from deerflow.config.paths import Paths
from deerflow.uploads.manager import PathTraversalError

# ---------------------------------------------------------------------------
# Fixtures
@@ -609,10 +611,7 @@ class TestSkillsManagement:
skills_root = tmp_path / "skills"
(skills_root / "custom").mkdir(parents=True)

with (
patch("deerflow.skills.loader.get_skills_root_path", return_value=skills_root),
patch("deerflow.skills.validation._validate_skill_frontmatter", return_value=(True, "OK", "my-skill")),
):
with patch("deerflow.skills.installer.get_skills_root_path", return_value=skills_root):
result = client.install_skill(archive_path)

assert result["success"] is True
@@ -700,7 +699,7 @@ class TestUploads:
uploads_dir = tmp_path / "uploads"
uploads_dir.mkdir()

with patch.object(DeerFlowClient, "_get_uploads_dir", return_value=uploads_dir):
with patch("deerflow.client.get_uploads_dir", return_value=uploads_dir), patch("deerflow.client.ensure_uploads_dir", return_value=uploads_dir):
result = client.upload_files("thread-1", [src_file])

assert result["success"] is True
@@ -756,7 +755,7 @@ class TestUploads:
return client.upload_files("thread-async", [first, second])

with (
patch.object(DeerFlowClient, "_get_uploads_dir", return_value=uploads_dir),
patch("deerflow.client.get_uploads_dir", return_value=uploads_dir), patch("deerflow.client.ensure_uploads_dir", return_value=uploads_dir),
patch("deerflow.utils.file_conversion.CONVERTIBLE_EXTENSIONS", {".pdf"}),
patch("deerflow.utils.file_conversion.convert_file_to_markdown", side_effect=fake_convert),
patch("concurrent.futures.ThreadPoolExecutor", FakeExecutor),
@@ -777,7 +776,7 @@ class TestUploads:
(uploads_dir / "a.txt").write_text("a")
(uploads_dir / "b.txt").write_text("bb")

with patch.object(DeerFlowClient, "_get_uploads_dir", return_value=uploads_dir):
with patch("deerflow.client.get_uploads_dir", return_value=uploads_dir), patch("deerflow.client.ensure_uploads_dir", return_value=uploads_dir):
result = client.list_uploads("thread-1")

assert result["count"] == 2
@@ -793,7 +792,7 @@ class TestUploads:
uploads_dir = Path(tmp)
(uploads_dir / "delete-me.txt").write_text("gone")

with patch.object(DeerFlowClient, "_get_uploads_dir", return_value=uploads_dir):
with patch("deerflow.client.get_uploads_dir", return_value=uploads_dir), patch("deerflow.client.ensure_uploads_dir", return_value=uploads_dir):
result = client.delete_upload("thread-1", "delete-me.txt")

assert result["success"] is True
@@ -802,15 +801,15 @@ class TestUploads:

def test_delete_upload_not_found(self, client):
with tempfile.TemporaryDirectory() as tmp:
with patch.object(DeerFlowClient, "_get_uploads_dir", return_value=Path(tmp)):
with patch("deerflow.client.get_uploads_dir", return_value=Path(tmp)):
with pytest.raises(FileNotFoundError):
client.delete_upload("thread-1", "nope.txt")

def test_delete_upload_path_traversal(self, client):
with tempfile.TemporaryDirectory() as tmp:
uploads_dir = Path(tmp)
with patch.object(DeerFlowClient, "_get_uploads_dir", return_value=uploads_dir):
with pytest.raises(PermissionError):
with patch("deerflow.client.get_uploads_dir", return_value=uploads_dir), patch("deerflow.client.ensure_uploads_dir", return_value=uploads_dir):
with pytest.raises(PathTraversalError):
client.delete_upload("thread-1", "../../etc/passwd")

@@ -822,15 +821,12 @@ class TestUploads:
class TestArtifacts:
def test_get_artifact(self, client):
with tempfile.TemporaryDirectory() as tmp:
user_data_dir = Path(tmp) / "user-data"
outputs = user_data_dir / "outputs"
paths = Paths(base_dir=tmp)
outputs = paths.sandbox_outputs_dir("t1")
outputs.mkdir(parents=True)
(outputs / "result.txt").write_text("artifact content")

mock_paths = MagicMock()
mock_paths.sandbox_user_data_dir.return_value = user_data_dir

with patch("deerflow.client.get_paths", return_value=mock_paths):
with patch("deerflow.client.get_paths", return_value=paths):
content, mime = client.get_artifact("t1", "mnt/user-data/outputs/result.txt")

assert content == b"artifact content"
@@ -838,13 +834,10 @@ class TestArtifacts:

def test_get_artifact_not_found(self, client):
with tempfile.TemporaryDirectory() as tmp:
user_data_dir = Path(tmp) / "user-data"
user_data_dir.mkdir()
paths = Paths(base_dir=tmp)
paths.sandbox_user_data_dir("t1").mkdir(parents=True)

mock_paths = MagicMock()
mock_paths.sandbox_user_data_dir.return_value = user_data_dir

with patch("deerflow.client.get_paths", return_value=mock_paths):
with patch("deerflow.client.get_paths", return_value=paths):
with pytest.raises(FileNotFoundError):
client.get_artifact("t1", "mnt/user-data/outputs/nope.txt")

@@ -854,14 +847,11 @@ class TestArtifacts:

def test_get_artifact_path_traversal(self, client):
with tempfile.TemporaryDirectory() as tmp:
user_data_dir = Path(tmp) / "user-data"
user_data_dir.mkdir()
paths = Paths(base_dir=tmp)
paths.sandbox_user_data_dir("t1").mkdir(parents=True)

mock_paths = MagicMock()
mock_paths.sandbox_user_data_dir.return_value = user_data_dir

with patch("deerflow.client.get_paths", return_value=mock_paths):
with pytest.raises(PermissionError):
with patch("deerflow.client.get_paths", return_value=paths):
with pytest.raises(PathTraversalError):
client.get_artifact("t1", "mnt/user-data/../../../etc/passwd")

@@ -1013,7 +1003,7 @@ class TestScenarioFileLifecycle:
(tmp_path / "report.txt").write_text("quarterly report data")
(tmp_path / "data.csv").write_text("a,b,c\n1,2,3")

with patch.object(DeerFlowClient, "_get_uploads_dir", return_value=uploads_dir):
with patch("deerflow.client.get_uploads_dir", return_value=uploads_dir), patch("deerflow.client.ensure_uploads_dir", return_value=uploads_dir):
# Step 1: Upload
result = client.upload_files(
"t-lifecycle",
@@ -1046,15 +1036,16 @@ class TestScenarioFileLifecycle:
tmp_path = Path(tmp)
uploads_dir = tmp_path / "uploads"
uploads_dir.mkdir()
user_data_dir = tmp_path / "user-data"
outputs_dir = user_data_dir / "outputs"

paths = Paths(base_dir=tmp_path)
outputs_dir = paths.sandbox_outputs_dir("t-artifact")
outputs_dir.mkdir(parents=True)

# Upload phase
src_file = tmp_path / "input.txt"
src_file.write_text("raw data to process")

with patch.object(DeerFlowClient, "_get_uploads_dir", return_value=uploads_dir):
with patch("deerflow.client.get_uploads_dir", return_value=uploads_dir), patch("deerflow.client.ensure_uploads_dir", return_value=uploads_dir):
uploaded = client.upload_files("t-artifact", [src_file])
assert len(uploaded["files"]) == 1

@@ -1062,10 +1053,7 @@ class TestScenarioFileLifecycle:
(outputs_dir / "analysis.json").write_text('{"result": "processed"}')

# Retrieve artifact
mock_paths = MagicMock()
mock_paths.sandbox_user_data_dir.return_value = user_data_dir

with patch("deerflow.client.get_paths", return_value=mock_paths):
with patch("deerflow.client.get_paths", return_value=paths):
content, mime = client.get_artifact("t-artifact", "mnt/user-data/outputs/analysis.json")

assert json.loads(content) == {"result": "processed"}
@@ -1286,7 +1274,7 @@ class TestScenarioThreadIsolation:
def get_dir(thread_id):
return uploads_a if thread_id == "thread-a" else uploads_b

with patch.object(DeerFlowClient, "_get_uploads_dir", side_effect=get_dir):
with patch("deerflow.client.get_uploads_dir", side_effect=get_dir), patch("deerflow.client.ensure_uploads_dir", side_effect=get_dir):
client.upload_files("thread-a", [src_file])

files_a = client.list_uploads("thread-a")
@@ -1298,18 +1286,13 @@ class TestScenarioThreadIsolation:
def test_artifacts_isolated_per_thread(self, client):
"""Artifacts in thread-A are not accessible from thread-B."""
with tempfile.TemporaryDirectory() as tmp:
tmp_path = Path(tmp)
paths = Paths(base_dir=tmp)
outputs_a = paths.sandbox_outputs_dir("thread-a")
outputs_a.mkdir(parents=True)
paths.sandbox_user_data_dir("thread-b").mkdir(parents=True)
(outputs_a / "result.txt").write_text("thread-a artifact")

data_a = tmp_path / "thread-a"
data_b = tmp_path / "thread-b"
(data_a / "outputs").mkdir(parents=True)
(data_b / "outputs").mkdir(parents=True)
(data_a / "outputs" / "result.txt").write_text("thread-a artifact")

mock_paths = MagicMock()
mock_paths.sandbox_user_data_dir.side_effect = lambda tid: data_a if tid == "thread-a" else data_b

with patch("deerflow.client.get_paths", return_value=mock_paths):
with patch("deerflow.client.get_paths", return_value=paths):
content, _ = client.get_artifact("thread-a", "mnt/user-data/outputs/result.txt")
assert content == b"thread-a artifact"

@@ -1377,10 +1360,7 @@ class TestScenarioSkillInstallAndUse:
(skills_root / "custom").mkdir(parents=True)

# Step 1: Install
with (
patch("deerflow.skills.loader.get_skills_root_path", return_value=skills_root),
patch("deerflow.skills.validation._validate_skill_frontmatter", return_value=(True, "OK", "my-analyzer")),
):
with patch("deerflow.skills.installer.get_skills_root_path", return_value=skills_root):
result = client.install_skill(archive)
assert result["success"] is True
assert (skills_root / "custom" / "my-analyzer" / "SKILL.md").exists()
@@ -1512,7 +1492,7 @@ class TestScenarioEdgeCases:
pdf_file.write_bytes(b"%PDF-1.4 fake content")

with (
patch.object(DeerFlowClient, "_get_uploads_dir", return_value=uploads_dir),
patch("deerflow.client.get_uploads_dir", return_value=uploads_dir), patch("deerflow.client.ensure_uploads_dir", return_value=uploads_dir),
patch("deerflow.utils.file_conversion.CONVERTIBLE_EXTENSIONS", {".pdf"}),
patch("deerflow.utils.file_conversion.convert_file_to_markdown", side_effect=Exception("conversion failed")),
):
@@ -1614,9 +1594,7 @@ class TestGatewayConformance:
with zipfile.ZipFile(archive, "w") as zf:
zf.write(skill_dir / "SKILL.md", "my-skill/SKILL.md")

custom_dir = tmp_path / "custom"
custom_dir.mkdir()
with patch("deerflow.skills.loader.get_skills_root_path", return_value=tmp_path):
with patch("deerflow.skills.installer.get_skills_root_path", return_value=tmp_path):
result = client.install_skill(archive)

parsed = SkillInstallResponse(**result)
@@ -1680,7 +1658,7 @@ class TestGatewayConformance:
src_file = tmp_path / "hello.txt"
src_file.write_text("hello")

with patch.object(DeerFlowClient, "_get_uploads_dir", return_value=uploads_dir):
with patch("deerflow.client.get_uploads_dir", return_value=uploads_dir), patch("deerflow.client.ensure_uploads_dir", return_value=uploads_dir):
result = client.upload_files("t-conform", [src_file])

parsed = UploadResponse(**result)
@@ -1739,3 +1717,694 @@ class TestGatewayConformance:
|
||||
parsed = MemoryStatusResponse(**result)
|
||||
assert parsed.config.enabled is True
|
||||
assert parsed.data.version == "1.0"
|
||||
|
||||
|
||||
|
||||
# ===========================================================================
|
||||
# Hardening — install_skill security gates
|
||||
# ===========================================================================
|
||||
|
||||
|
||||
class TestInstallSkillSecurity:
|
||||
"""Every security gate in install_skill() must have a red-line test."""
|
||||
|
||||
def test_zip_bomb_rejected(self, client):
|
||||
"""Archives whose extracted size exceeds the limit are rejected."""
|
||||
with tempfile.TemporaryDirectory() as tmp:
|
||||
archive = Path(tmp) / "bomb.skill"
|
||||
# Create a small archive that claims huge uncompressed size.
|
||||
# Write 200 bytes but the safe_extract checks cumulative file_size.
|
||||
data = b"\x00" * 200
|
||||
with zipfile.ZipFile(archive, "w", compression=zipfile.ZIP_DEFLATED) as zf:
|
||||
zf.writestr("big.bin", data)
|
||||
|
||||
skills_root = Path(tmp) / "skills"
|
||||
(skills_root / "custom").mkdir(parents=True)
|
||||
|
||||
# Patch max_total_size to a small value to trigger the bomb check.
|
||||
from deerflow.skills import installer as _installer
|
||||
orig = _installer.safe_extract_skill_archive
|
||||
|
||||
def patched_extract(zf, dest, max_total_size=100):
|
||||
return orig(zf, dest, max_total_size=100)
|
||||
|
||||
with (
|
||||
patch("deerflow.skills.installer.get_skills_root_path", return_value=skills_root),
|
||||
patch("deerflow.skills.installer.safe_extract_skill_archive", side_effect=patched_extract),
|
||||
):
|
||||
with pytest.raises(ValueError, match="too large"):
|
||||
client.install_skill(archive)
|
||||
|
||||
def test_absolute_path_in_archive_rejected(self, client):
|
||||
"""ZIP entries with absolute paths are rejected."""
|
||||
with tempfile.TemporaryDirectory() as tmp:
|
||||
archive = Path(tmp) / "abs.skill"
|
||||
with zipfile.ZipFile(archive, "w") as zf:
|
||||
zf.writestr("/etc/passwd", "root:x:0:0")
|
||||
|
||||
skills_root = Path(tmp) / "skills"
|
||||
(skills_root / "custom").mkdir(parents=True)
|
||||
|
||||
with patch("deerflow.skills.installer.get_skills_root_path", return_value=skills_root):
|
||||
with pytest.raises(ValueError, match="unsafe"):
|
||||
client.install_skill(archive)
|
||||
|
||||
def test_dotdot_path_in_archive_rejected(self, client):
|
||||
"""ZIP entries with '..' path components are rejected."""
|
||||
with tempfile.TemporaryDirectory() as tmp:
|
||||
archive = Path(tmp) / "traversal.skill"
|
||||
with zipfile.ZipFile(archive, "w") as zf:
|
||||
zf.writestr("skill/../../../etc/shadow", "bad")
|
||||
|
||||
skills_root = Path(tmp) / "skills"
|
||||
(skills_root / "custom").mkdir(parents=True)
|
||||
|
||||
with patch("deerflow.skills.installer.get_skills_root_path", return_value=skills_root):
|
||||
with pytest.raises(ValueError, match="unsafe"):
|
||||
client.install_skill(archive)
|
||||
|
||||
def test_symlinks_skipped_during_extraction(self, client):
|
||||
"""Symlink entries in the archive are skipped (never written to disk)."""
|
||||
import stat as stat_mod
|
||||
|
||||
with tempfile.TemporaryDirectory() as tmp:
|
||||
tmp_path = Path(tmp)
|
||||
|
||||
archive = tmp_path / "sym-skill.skill"
|
||||
with zipfile.ZipFile(archive, "w") as zf:
|
||||
zf.writestr("sym-skill/SKILL.md", "---\nname: sym-skill\ndescription: test\n---\nBody")
|
||||
# Inject a symlink entry via ZipInfo with Unix symlink mode.
|
||||
link_info = zipfile.ZipInfo("sym-skill/sneaky_link")
|
||||
link_info.external_attr = (stat_mod.S_IFLNK | 0o777) << 16
|
||||
zf.writestr(link_info, "/etc/passwd")
|
||||
|
||||
skills_root = tmp_path / "skills"
|
||||
(skills_root / "custom").mkdir(parents=True)
|
||||
|
||||
with patch("deerflow.skills.installer.get_skills_root_path", return_value=skills_root):
|
||||
result = client.install_skill(archive)
|
||||
|
||||
assert result["success"] is True
|
||||
installed = skills_root / "custom" / "sym-skill"
|
||||
assert (installed / "SKILL.md").exists()
|
||||
assert not (installed / "sneaky_link").exists()
|
||||
|
||||
def test_invalid_skill_name_rejected(self, client):
|
||||
"""Skill names containing special characters are rejected."""
|
||||
with tempfile.TemporaryDirectory() as tmp:
|
||||
tmp_path = Path(tmp)
|
||||
|
||||
skill_dir = tmp_path / "bad-name"
|
||||
skill_dir.mkdir()
|
||||
(skill_dir / "SKILL.md").write_text("---\nname: ../evil\ndescription: test\n---\n")
|
||||
|
||||
archive = tmp_path / "bad.skill"
|
||||
with zipfile.ZipFile(archive, "w") as zf:
|
||||
zf.write(skill_dir / "SKILL.md", "bad-name/SKILL.md")
|
||||
|
||||
skills_root = tmp_path / "skills"
|
||||
(skills_root / "custom").mkdir(parents=True)
|
||||
|
||||
with (
|
||||
patch("deerflow.skills.installer.get_skills_root_path", return_value=skills_root),
|
||||
patch("deerflow.skills.installer._validate_skill_frontmatter", return_value=(True, "OK", "../evil")),
|
||||
):
|
||||
with pytest.raises(ValueError, match="Invalid skill name"):
|
||||
client.install_skill(archive)
|
||||
|
||||
def test_existing_skill_rejected(self, client):
|
||||
"""Installing a skill that already exists is rejected."""
|
||||
with tempfile.TemporaryDirectory() as tmp:
|
||||
tmp_path = Path(tmp)
|
||||
|
||||
skill_dir = tmp_path / "dupe-skill"
|
||||
skill_dir.mkdir()
|
||||
(skill_dir / "SKILL.md").write_text("---\nname: dupe-skill\ndescription: test\n---\n")
|
||||
|
||||
archive = tmp_path / "dupe-skill.skill"
|
||||
with zipfile.ZipFile(archive, "w") as zf:
|
||||
zf.write(skill_dir / "SKILL.md", "dupe-skill/SKILL.md")
|
||||
|
||||
skills_root = tmp_path / "skills"
|
||||
(skills_root / "custom" / "dupe-skill").mkdir(parents=True)
|
||||
|
||||
with (
|
||||
patch("deerflow.skills.installer.get_skills_root_path", return_value=skills_root),
|
||||
patch("deerflow.skills.installer._validate_skill_frontmatter", return_value=(True, "OK", "dupe-skill")),
|
||||
):
|
||||
with pytest.raises(ValueError, match="already exists"):
|
||||
client.install_skill(archive)
|
||||
|
||||
def test_empty_archive_rejected(self, client):
|
||||
"""An archive with no entries is rejected."""
|
||||
with tempfile.TemporaryDirectory() as tmp:
|
||||
archive = Path(tmp) / "empty.skill"
|
||||
with zipfile.ZipFile(archive, "w"):
|
||||
pass # empty archive
|
||||
|
||||
skills_root = Path(tmp) / "skills"
|
||||
(skills_root / "custom").mkdir(parents=True)
|
||||
|
||||
with patch("deerflow.skills.installer.get_skills_root_path", return_value=skills_root):
|
||||
with pytest.raises(ValueError, match="empty"):
|
||||
client.install_skill(archive)
|
||||
|
||||
def test_invalid_frontmatter_rejected(self, client):
|
||||
"""Archive with invalid SKILL.md frontmatter is rejected."""
|
||||
with tempfile.TemporaryDirectory() as tmp:
|
||||
tmp_path = Path(tmp)
|
||||
skill_dir = tmp_path / "bad-meta"
|
||||
skill_dir.mkdir()
|
||||
(skill_dir / "SKILL.md").write_text("no frontmatter at all")
|
||||
|
||||
archive = tmp_path / "bad-meta.skill"
|
||||
with zipfile.ZipFile(archive, "w") as zf:
|
||||
zf.write(skill_dir / "SKILL.md", "bad-meta/SKILL.md")
|
||||
|
||||
skills_root = tmp_path / "skills"
|
||||
(skills_root / "custom").mkdir(parents=True)
|
||||
|
||||
with (
|
||||
patch("deerflow.skills.installer.get_skills_root_path", return_value=skills_root),
|
||||
patch("deerflow.skills.installer._validate_skill_frontmatter", return_value=(False, "Missing name field", "")),
|
||||
):
|
||||
with pytest.raises(ValueError, match="Invalid skill"):
|
||||
client.install_skill(archive)
|
||||
|
||||
def test_not_a_zip_rejected(self, client):
|
||||
"""A .skill file that is not a valid ZIP is rejected."""
|
||||
with tempfile.TemporaryDirectory() as tmp:
|
||||
archive = Path(tmp) / "fake.skill"
|
||||
archive.write_text("this is not a zip file")
|
||||
|
||||
with pytest.raises(ValueError, match="not a valid ZIP"):
|
||||
client.install_skill(archive)
|
||||
|
||||
def test_directory_path_rejected(self, client):
|
||||
"""Passing a directory instead of a file is rejected."""
|
||||
with tempfile.TemporaryDirectory() as tmp:
|
||||
with pytest.raises(ValueError, match="not a file"):
|
||||
client.install_skill(tmp)
|
||||
|
||||
|
||||
# ===========================================================================
|
||||
# Hardening — _atomic_write_json error paths
|
||||
# ===========================================================================
|
||||
|
||||
|
||||
class TestAtomicWriteJson:
|
||||
def test_temp_file_cleaned_on_serialization_failure(self):
|
||||
"""If json.dump raises, the temp file is removed."""
|
||||
with tempfile.TemporaryDirectory() as tmp:
|
||||
target = Path(tmp) / "config.json"
|
||||
|
||||
# An object that cannot be serialized to JSON.
|
||||
bad_data = {"key": object()}
|
||||
|
||||
with pytest.raises(TypeError):
|
||||
DeerFlowClient._atomic_write_json(target, bad_data)
|
||||
|
||||
# Target should not have been created.
|
||||
assert not target.exists()
|
||||
# No stray .tmp files should remain.
|
||||
tmp_files = list(Path(tmp).glob("*.tmp"))
|
||||
assert tmp_files == []
|
||||
|
||||
def test_happy_path_writes_atomically(self):
|
||||
"""Normal write produces correct JSON and no temp files."""
|
||||
with tempfile.TemporaryDirectory() as tmp:
|
||||
target = Path(tmp) / "out.json"
|
||||
data = {"key": "value", "nested": [1, 2, 3]}
|
||||
|
||||
DeerFlowClient._atomic_write_json(target, data)
|
||||
|
||||
assert target.exists()
|
||||
with open(target) as f:
|
||||
loaded = json.load(f)
|
||||
assert loaded == data
|
||||
# No temp files left behind.
|
||||
assert list(Path(tmp).glob("*.tmp")) == []
|
||||
|
||||
def test_original_preserved_on_failure(self):
|
||||
"""If write fails, the original file is not corrupted."""
|
||||
with tempfile.TemporaryDirectory() as tmp:
|
||||
target = Path(tmp) / "config.json"
|
||||
target.write_text('{"original": true}')
|
||||
|
||||
bad_data = {"key": object()}
|
||||
with pytest.raises(TypeError):
|
||||
DeerFlowClient._atomic_write_json(target, bad_data)
|
||||
|
||||
# Original content must survive.
|
||||
with open(target) as f:
|
||||
assert json.load(f) == {"original": True}
|
||||
|
||||
|
||||
# ===========================================================================
|
||||
# Hardening — config update error paths
|
||||
# ===========================================================================
|
||||
|
||||
|
||||
class TestConfigUpdateErrors:
|
||||
def test_update_mcp_config_no_config_file(self, client):
|
||||
"""FileNotFoundError when extensions_config.json cannot be located."""
|
||||
with patch("deerflow.client.ExtensionsConfig.resolve_config_path", return_value=None):
|
||||
with pytest.raises(FileNotFoundError, match="Cannot locate"):
|
||||
client.update_mcp_config({"server": {}})
|
||||
|
||||
def test_update_skill_no_config_file(self, client):
|
||||
"""FileNotFoundError when extensions_config.json cannot be located."""
|
||||
skill = MagicMock()
|
||||
skill.name = "some-skill"
|
||||
|
||||
with (
|
||||
patch("deerflow.skills.loader.load_skills", return_value=[skill]),
|
||||
patch("deerflow.client.ExtensionsConfig.resolve_config_path", return_value=None),
|
||||
):
|
||||
with pytest.raises(FileNotFoundError, match="Cannot locate"):
|
||||
client.update_skill("some-skill", enabled=False)
|
||||
|
||||
def test_update_skill_disappears_after_write(self, client):
|
||||
"""RuntimeError when skill vanishes between write and re-read."""
|
||||
skill = MagicMock()
|
||||
skill.name = "ghost-skill"
|
||||
|
||||
ext_config = MagicMock()
|
||||
ext_config.mcp_servers = {}
|
||||
ext_config.skills = {}
|
||||
|
||||
with tempfile.TemporaryDirectory() as tmp:
|
||||
config_file = Path(tmp) / "extensions_config.json"
|
||||
config_file.write_text("{}")
|
||||
|
||||
with (
|
||||
patch("deerflow.skills.loader.load_skills", side_effect=[[skill], []]),
|
||||
patch("deerflow.client.ExtensionsConfig.resolve_config_path", return_value=config_file),
|
||||
patch("deerflow.client.get_extensions_config", return_value=ext_config),
|
||||
patch("deerflow.client.reload_extensions_config"),
|
||||
):
|
||||
with pytest.raises(RuntimeError, match="disappeared"):
|
||||
client.update_skill("ghost-skill", enabled=False)
|
||||
|
||||
|
||||
# ===========================================================================
|
||||
# Hardening — stream / chat edge cases
|
||||
# ===========================================================================
|
||||
|
||||
|
||||
class TestStreamHardening:
    def test_agent_exception_propagates(self, client):
        """Exceptions from agent.stream() propagate to caller."""
        agent = MagicMock()
        agent.stream.side_effect = RuntimeError("model quota exceeded")

        with (
            patch.object(client, "_ensure_agent"),
            patch.object(client, "_agent", agent),
        ):
            with pytest.raises(RuntimeError, match="model quota exceeded"):
                list(client.stream("hi", thread_id="t-err"))

    def test_messages_without_id(self, client):
        """Messages without id attribute are emitted without crashing."""
        ai = AIMessage(content="no id here")
        # Forcibly remove the id attribute to simulate edge case.
        object.__setattr__(ai, "id", None)
        chunks = [{"messages": [ai]}]
        agent = _make_agent_mock(chunks)

        with (
            patch.object(client, "_ensure_agent"),
            patch.object(client, "_agent", agent),
        ):
            events = list(client.stream("hi", thread_id="t-noid"))

        # Should produce events without error.
        assert events[-1].type == "end"
        ai_events = _ai_events(events)
        assert len(ai_events) == 1
        assert ai_events[0].data["content"] == "no id here"

    def test_tool_calls_only_no_text(self, client):
        """chat() returns empty string when agent only emits tool calls."""
        ai = AIMessage(
            content="",
            id="ai-1",
            tool_calls=[{"name": "bash", "args": {"cmd": "ls"}, "id": "tc-1"}],
        )
        tool = ToolMessage(content="output", id="tm-1", tool_call_id="tc-1", name="bash")
        chunks = [
            {"messages": [ai]},
            {"messages": [ai, tool]},
        ]
        agent = _make_agent_mock(chunks)

        with (
            patch.object(client, "_ensure_agent"),
            patch.object(client, "_agent", agent),
        ):
            result = client.chat("do it", thread_id="t-tc-only")

        assert result == ""

    def test_duplicate_messages_without_id_not_deduplicated(self, client):
        """Messages with id=None are NOT deduplicated (each is emitted)."""
        ai1 = AIMessage(content="first")
        ai2 = AIMessage(content="second")
        object.__setattr__(ai1, "id", None)
        object.__setattr__(ai2, "id", None)

        chunks = [
            {"messages": [ai1]},
            {"messages": [ai2]},
        ]
        agent = _make_agent_mock(chunks)

        with (
            patch.object(client, "_ensure_agent"),
            patch.object(client, "_agent", agent),
        ):
            events = list(client.stream("hi", thread_id="t-dup-noid"))

        ai_msgs = _ai_events(events)
        assert len(ai_msgs) == 2


# ===========================================================================
# Hardening — _serialize_message coverage
# ===========================================================================


class TestSerializeMessage:
    def test_system_message(self):
        msg = SystemMessage(content="You are a helpful assistant.", id="sys-1")
        result = DeerFlowClient._serialize_message(msg)
        assert result["type"] == "system"
        assert result["content"] == "You are a helpful assistant."
        assert result["id"] == "sys-1"

    def test_unknown_message_type(self):
        """Non-standard message types serialize as 'unknown'."""
        msg = MagicMock()
        msg.id = "unk-1"
        msg.content = "something"
        # Not an instance of AIMessage/ToolMessage/HumanMessage/SystemMessage
        type(msg).__name__ = "CustomMessage"
        result = DeerFlowClient._serialize_message(msg)
        assert result["type"] == "unknown"
        assert result["id"] == "unk-1"

    def test_ai_message_with_tool_calls(self):
        msg = AIMessage(
            content="",
            id="ai-tc",
            tool_calls=[{"name": "bash", "args": {"cmd": "ls"}, "id": "tc-1"}],
        )
        result = DeerFlowClient._serialize_message(msg)
        assert result["type"] == "ai"
        assert len(result["tool_calls"]) == 1
        assert result["tool_calls"][0]["name"] == "bash"

    def test_tool_message_non_string_content(self):
        msg = ToolMessage(content={"key": "value"}, id="tm-1", tool_call_id="tc-1", name="tool")
        result = DeerFlowClient._serialize_message(msg)
        assert result["type"] == "tool"
        assert isinstance(result["content"], str)


# ===========================================================================
# Hardening — upload / delete symlink attack
# ===========================================================================


class TestUploadDeleteSymlink:
    def test_delete_upload_symlink_outside_dir(self, client):
        """A symlink in uploads dir pointing outside is caught by path traversal check."""
        with tempfile.TemporaryDirectory() as tmp:
            uploads_dir = Path(tmp) / "uploads"
            uploads_dir.mkdir()

            # Create a target file outside uploads dir.
            outside = Path(tmp) / "secret.txt"
            outside.write_text("sensitive data")

            # Create a symlink inside uploads dir pointing to outside file.
            link = uploads_dir / "harmless.txt"
            link.symlink_to(outside)

            with (
                patch("deerflow.client.get_uploads_dir", return_value=uploads_dir),
                patch("deerflow.client.ensure_uploads_dir", return_value=uploads_dir),
            ):
                # The resolved path of the symlink escapes uploads_dir,
                # so path traversal check should catch it.
                with pytest.raises(PathTraversalError):
                    client.delete_upload("thread-1", "harmless.txt")

            # The outside file must NOT have been deleted.
            assert outside.exists()

    def test_upload_filename_with_spaces_and_unicode(self, client):
        """Files with spaces and unicode characters in names upload correctly."""
        with tempfile.TemporaryDirectory() as tmp:
            tmp_path = Path(tmp)
            uploads_dir = tmp_path / "uploads"
            uploads_dir.mkdir()

            weird_name = "report 2024 数据.txt"
            src_file = tmp_path / weird_name
            src_file.write_text("data")

            with (
                patch("deerflow.client.get_uploads_dir", return_value=uploads_dir),
                patch("deerflow.client.ensure_uploads_dir", return_value=uploads_dir),
            ):
                result = client.upload_files("thread-1", [src_file])

            assert result["success"] is True
            assert result["files"][0]["filename"] == weird_name
            assert (uploads_dir / weird_name).exists()


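# The escape detection exercised in TestUploadDeleteSymlink can be sketched at
# module level as follows. This is an illustrative helper, NOT the client's
# actual implementation (its name is hypothetical); it resolves the candidate
# path, following symlinks, and checks containment under the base directory.
# Path.is_relative_to requires Python 3.9+.
def _escapes_base_dir(base: Path, name: str) -> bool:
    """Return True if base/name resolves outside base (e.g. via a symlink)."""
    resolved = (base / name).resolve()
    return not resolved.is_relative_to(base.resolve())
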
# ===========================================================================
# Hardening — artifact edge cases
# ===========================================================================


class TestArtifactHardening:
    def test_artifact_directory_rejected(self, client):
        """get_artifact rejects paths that resolve to a directory."""
        with tempfile.TemporaryDirectory() as tmp:
            paths = Paths(base_dir=tmp)
            subdir = paths.sandbox_outputs_dir("t1") / "subdir"
            subdir.mkdir(parents=True)

            with patch("deerflow.client.get_paths", return_value=paths):
                with pytest.raises(ValueError, match="not a file"):
                    client.get_artifact("t1", "mnt/user-data/outputs/subdir")

    def test_artifact_leading_slash_stripped(self, client):
        """Paths with leading slash are handled correctly."""
        with tempfile.TemporaryDirectory() as tmp:
            paths = Paths(base_dir=tmp)
            outputs = paths.sandbox_outputs_dir("t1")
            outputs.mkdir(parents=True)
            (outputs / "file.txt").write_text("content")

            with patch("deerflow.client.get_paths", return_value=paths):
                content, _mime = client.get_artifact("t1", "/mnt/user-data/outputs/file.txt")

            assert content == b"content"


# ===========================================================================
# BUG DETECTION — tests that expose real bugs in client.py
# ===========================================================================


class TestUploadDuplicateFilenames:
    """Regression: upload_files must auto-rename duplicate basenames.

    Previously it silently overwrote the first file with the second,
    then reported both in the response while only one existed on disk.
    Now duplicates are renamed (data.txt → data_1.txt) and the response
    includes original_filename so the agent / caller can see what happened.
    """

    def test_duplicate_filenames_auto_renamed(self, client):
        """Two files with same basename → second gets _1 suffix."""
        with tempfile.TemporaryDirectory() as tmp:
            tmp_path = Path(tmp)
            uploads_dir = tmp_path / "uploads"
            uploads_dir.mkdir()

            dir_a = tmp_path / "a"
            dir_b = tmp_path / "b"
            dir_a.mkdir()
            dir_b.mkdir()
            (dir_a / "data.txt").write_text("version A")
            (dir_b / "data.txt").write_text("version B")

            with (
                patch("deerflow.client.get_uploads_dir", return_value=uploads_dir),
                patch("deerflow.client.ensure_uploads_dir", return_value=uploads_dir),
            ):
                result = client.upload_files("t-dup", [dir_a / "data.txt", dir_b / "data.txt"])

            assert result["success"] is True
            assert len(result["files"]) == 2

            # Both files exist on disk with distinct names.
            disk_files = sorted(p.name for p in uploads_dir.iterdir())
            assert disk_files == ["data.txt", "data_1.txt"]

            # First keeps original name, second is renamed.
            assert result["files"][0]["filename"] == "data.txt"
            assert "original_filename" not in result["files"][0]

            assert result["files"][1]["filename"] == "data_1.txt"
            assert result["files"][1]["original_filename"] == "data.txt"

            # Content preserved correctly.
            assert (uploads_dir / "data.txt").read_text() == "version A"
            assert (uploads_dir / "data_1.txt").read_text() == "version B"

    def test_triple_duplicate_increments_counter(self, client):
        """Three files with same basename → _1, _2 suffixes."""
        with tempfile.TemporaryDirectory() as tmp:
            tmp_path = Path(tmp)
            uploads_dir = tmp_path / "uploads"
            uploads_dir.mkdir()

            for name in ["x", "y", "z"]:
                d = tmp_path / name
                d.mkdir()
                (d / "report.csv").write_text(f"from {name}")

            with (
                patch("deerflow.client.get_uploads_dir", return_value=uploads_dir),
                patch("deerflow.client.ensure_uploads_dir", return_value=uploads_dir),
            ):
                result = client.upload_files(
                    "t-triple",
                    [tmp_path / "x" / "report.csv", tmp_path / "y" / "report.csv", tmp_path / "z" / "report.csv"],
                )

            filenames = [f["filename"] for f in result["files"]]
            assert filenames == ["report.csv", "report_1.csv", "report_2.csv"]
            assert len(list(uploads_dir.iterdir())) == 3

    def test_different_filenames_no_rename(self, client):
        """Non-duplicate filenames upload normally without rename."""
        with tempfile.TemporaryDirectory() as tmp:
            tmp_path = Path(tmp)
            uploads_dir = tmp_path / "uploads"
            uploads_dir.mkdir()

            (tmp_path / "a.txt").write_text("aaa")
            (tmp_path / "b.txt").write_text("bbb")

            with (
                patch("deerflow.client.get_uploads_dir", return_value=uploads_dir),
                patch("deerflow.client.ensure_uploads_dir", return_value=uploads_dir),
            ):
                result = client.upload_files("t-ok", [tmp_path / "a.txt", tmp_path / "b.txt"])

            assert result["success"] is True
            assert len(result["files"]) == 2
            assert all("original_filename" not in f for f in result["files"])
            assert len(list(uploads_dir.iterdir())) == 2


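# The auto-rename behaviour covered by TestUploadDuplicateFilenames can be
# sketched as follows. This is an illustrative stand-in, NOT the shared
# manager's actual implementation (the helper name is hypothetical): given a
# set of names already taken, "data.txt" becomes "data_1.txt", then
# "data_2.txt", and so on, preserving the extension.
def _dedupe_filename(name: str, taken: set[str]) -> str:
    """Return name, or name with an incrementing _N suffix if already taken."""
    if name not in taken:
        return name
    stem, dot, ext = name.rpartition(".")
    base, suffix = (stem, dot + ext) if dot else (name, "")
    counter = 1
    while f"{base}_{counter}{suffix}" in taken:
        counter += 1
    return f"{base}_{counter}{suffix}"
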
class TestBugArtifactPrefixMatchTooLoose:
    """Regression: get_artifact must reject paths like ``mnt/user-data-evil/...``.

    Previously ``startswith("mnt/user-data")`` matched ``"mnt/user-data-evil"``
    because it was a string prefix, not a path-segment check.
    """

    def test_non_canonical_prefix_rejected(self, client):
        """Paths that share a string prefix but differ at segment boundary are rejected."""
        with pytest.raises(ValueError, match="must start with"):
            client.get_artifact("t1", "mnt/user-data-evil/secret.txt")

    def test_exact_prefix_without_subpath_accepted(self, client):
        """Bare 'mnt/user-data' is accepted (will later fail as directory, not at prefix)."""
        with tempfile.TemporaryDirectory() as tmp:
            paths = Paths(base_dir=tmp)
            paths.sandbox_user_data_dir("t1").mkdir(parents=True)

            with patch("deerflow.client.get_paths", return_value=paths):
                # Accepted at prefix check, but fails because it's a directory.
                with pytest.raises(ValueError, match="not a file"):
                    client.get_artifact("t1", "mnt/user-data")


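# The segment-boundary check this regression guards can be sketched as follows
# (an illustrative helper, not the client's actual code): a bare startswith()
# would accept "mnt/user-data-evil", so the comparison must only match the
# exact prefix or the prefix followed by a path separator.
def _has_path_prefix(path: str, prefix: str = "mnt/user-data") -> bool:
    """True if path equals prefix or sits under it at a segment boundary."""
    return path == prefix or path.startswith(prefix + "/")
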
class TestBugListUploadsDeadCode:
    """Regression: list_uploads works even when called on a fresh thread
    (directory does not exist yet — returns empty without creating it).
    """

    def test_list_uploads_on_fresh_thread(self, client):
        """list_uploads on a thread that never had uploads returns empty list."""
        with tempfile.TemporaryDirectory() as tmp:
            non_existent = Path(tmp) / "does-not-exist" / "uploads"
            assert not non_existent.exists()

            mock_paths = MagicMock()
            mock_paths.sandbox_uploads_dir.return_value = non_existent

            with patch("deerflow.uploads.manager.get_paths", return_value=mock_paths):
                result = client.list_uploads("thread-fresh")

            # Read path should NOT create the directory
            assert not non_existent.exists()
            assert result == {"files": [], "count": 0}


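# The read-path contract tested above (list without mkdir) can be sketched as
# follows. This is an illustrative helper, NOT the manager's actual code: a
# read operation checks for the directory and returns empty rather than
# creating it as a side effect.
def _list_upload_names(uploads_dir: Path) -> list[str]:
    """Return sorted filenames, without creating uploads_dir if it is absent."""
    if not uploads_dir.is_dir():
        return []
    return sorted(p.name for p in uploads_dir.iterdir() if p.is_file())
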
class TestBugAgentInvalidationInconsistency:
    """Regression: update_skill and update_mcp_config must reset both
    _agent and _agent_config_key, just like reset_agent() does.
    """

    def test_update_mcp_resets_config_key(self, client):
        """After update_mcp_config, both _agent and _agent_config_key are None."""
        client._agent = MagicMock()
        client._agent_config_key = ("model", True, False, False)

        current_config = MagicMock()
        current_config.skills = {}
        reloaded = MagicMock()
        reloaded.mcp_servers = {}

        with tempfile.TemporaryDirectory() as tmp:
            config_file = Path(tmp) / "ext.json"
            config_file.write_text("{}")

            with (
                patch("deerflow.client.ExtensionsConfig.resolve_config_path", return_value=config_file),
                patch("deerflow.client.get_extensions_config", return_value=current_config),
                patch("deerflow.client.reload_extensions_config", return_value=reloaded),
            ):
                client.update_mcp_config({})

        assert client._agent is None
        assert client._agent_config_key is None

    def test_update_skill_resets_config_key(self, client):
        """After update_skill, both _agent and _agent_config_key are None."""
        client._agent = MagicMock()
        client._agent_config_key = ("model", True, False, False)

        skill = MagicMock()
        skill.name = "s1"
        updated = MagicMock()
        updated.name = "s1"
        updated.description = "d"
        updated.license = "MIT"
        updated.category = "c"
        updated.enabled = False

        ext_config = MagicMock()
        ext_config.mcp_servers = {}
        ext_config.skills = {}

        with tempfile.TemporaryDirectory() as tmp:
            config_file = Path(tmp) / "ext.json"
            config_file.write_text("{}")

            with (
                patch("deerflow.skills.loader.load_skills", side_effect=[[skill], [updated]]),
                patch("deerflow.client.ExtensionsConfig.resolve_config_path", return_value=config_file),
                patch("deerflow.client.get_extensions_config", return_value=ext_config),
                patch("deerflow.client.reload_extensions_config"),
            ):
                client.update_skill("s1", enabled=False)

        assert client._agent is None
        assert client._agent_config_key is None