deer-flow/scripts/config-upgrade.sh
DanielWalnut 76803b826f refactor: split backend into harness (deerflow.*) and app (app.*) (#1131)
* refactor: extract shared utils to break harness→app cross-layer imports

Move _validate_skill_frontmatter to src/skills/validation.py and
CONVERTIBLE_EXTENSIONS + convert_file_to_markdown to src/utils/file_conversion.py.
This eliminates the two reverse dependencies from client.py (harness layer)
into gateway/routers/ (app layer), preparing for the harness/app package split.

Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>

* refactor: split backend/src into harness (deerflow.*) and app (app.*)

Physically split the monolithic backend/src/ package into two layers:

- **Harness** (`packages/harness/deerflow/`): publishable agent framework
  package with import prefix `deerflow.*`. Contains agents, sandbox, tools,
  models, MCP, skills, config, and all core infrastructure.

- **App** (`app/`): unpublished application code with import prefix `app.*`.
  Contains gateway (FastAPI REST API) and channels (IM integrations).

Key changes:
- Move 13 harness modules to packages/harness/deerflow/ via git mv
- Move gateway + channels to app/ via git mv
- Rename all imports: src.* → deerflow.* (harness) / app.* (app layer)
- Set up uv workspace with deerflow-harness as workspace member
- Update langgraph.json, config.example.yaml, all scripts, Docker files
- Add build-system (hatchling) to harness pyproject.toml
- Add PYTHONPATH=. to gateway startup commands for app.* resolution
- Update ruff.toml with known-first-party for import sorting
- Update all documentation to reflect new directory structure

Boundary rule enforced: harness code never imports from app.
All 429 tests pass. Lint clean.

Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>

* chore: add harness→app boundary check test and update docs

Add test_harness_boundary.py that scans all Python files in
packages/harness/deerflow/ and fails if any `from app.*` or
`import app.*` statement is found. This enforces the architectural
rule that the harness layer never depends on the app layer.
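A minimal sketch of such a boundary check (the regex and helper names are assumptions; the actual `test_harness_boundary.py` may differ):

```python
import re
from pathlib import Path

# Matches `import app`, `import app.x`, `from app import x`, `from app.x import y`
# at the start of a line, but not e.g. `from appx import y`.
FORBIDDEN = re.compile(
    r"^\s*(from\s+app(\.\w+)*\s+import\b|import\s+app(\.\w+)*\b)",
    re.MULTILINE,
)

def find_violations(harness_root: Path) -> list[str]:
    """Return 'file:line' entries for any harness file importing from app.*."""
    violations = []
    for py_file in harness_root.rglob("*.py"):
        text = py_file.read_text(encoding="utf-8")
        for match in FORBIDDEN.finditer(text):
            line_no = text.count("\n", 0, match.start()) + 1
            violations.append(f"{py_file}:{line_no}")
    return violations

def test_harness_never_imports_app():
    violations = find_violations(Path("packages/harness/deerflow"))
    assert not violations, f"harness->app imports found: {violations}"
```

A regex scan like this is deliberately dumb: it flags imports even inside strings or comments, which is acceptable for an architectural guardrail where false positives are cheap and false negatives are expensive.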

Update CLAUDE.md to document the harness/app split architecture,
import conventions, and the boundary enforcement test.

Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>

* feat: add config versioning with auto-upgrade on startup

When config.example.yaml schema changes, developers' local config.yaml
files can silently become outdated. This adds a config_version field and
auto-upgrade mechanism so breaking changes (like src.* → deerflow.*
renames) are applied automatically before services start.

- Add config_version: 1 to config.example.yaml
- Add startup version check warning in AppConfig.from_file()
- Add scripts/config-upgrade.sh with migration registry for value replacements
- Add `make config-upgrade` target
- Auto-run config-upgrade in serve.sh and start-daemon.sh before starting services
- Add config error hints in service failure messages
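The startup check in `AppConfig.from_file()` might look roughly like this (a sketch; the constant and function names here are assumptions, not the actual `app_config.py` API):

```python
import logging

logger = logging.getLogger(__name__)

# Kept in sync with the `config_version` field in config.example.yaml.
EXPECTED_CONFIG_VERSION = 1

def check_config_version(config: dict) -> bool:
    """Warn (but do not fail) when config.yaml lags behind the example schema.

    Returns True when the config is up to date.
    """
    # Coerce to int: YAML may store the version as a string (e.g. `config_version: "1"`).
    found = int(config.get("config_version", 0) or 0)
    if found < EXPECTED_CONFIG_VERSION:
        logger.warning(
            "config.yaml is at version %s but version %s is expected; "
            "run `make config-upgrade` to migrate.",
            found, EXPECTED_CONFIG_VERSION,
        )
        return False
    return True
```

Warning instead of raising keeps services bootable with an old-but-compatible config, while the auto-run of `config-upgrade.sh` in `serve.sh` handles the actual migration.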

Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>

* fix comments

* fix: update src.* import in test_sandbox_tools_security to deerflow.*

Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>

* fix: handle empty config and search parent dirs for config.example.yaml

Address Copilot review comments on PR #1131:
- Guard against yaml.safe_load() returning None for empty config files
- Search parent directories for config.example.yaml instead of only
  looking next to config.yaml, fixing detection in common setups

Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>

* fix: correct skills root path depth and config_version type coercion

- loader.py: fix get_skills_root_path() to use 5 parent levels (was 3);
  after the harness split, the file lives at packages/harness/deerflow/skills/,
  so parent×3 resolved to backend/packages/harness/ instead of backend/
- app_config.py: coerce config_version to int() before comparison in
  _check_config_version() to prevent TypeError when YAML stores value
  as string (e.g. config_version: "1")
- tests: add regression tests for both fixes
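The path-depth fix comes down to counting `Path.parents` levels from the file's new location (a sketch; the real `get_skills_root_path()` in loader.py may resolve further from there):

```python
from pathlib import Path

# Before the split:  backend/src/skills/loader.py
#   parents[0]=skills, parents[1]=src, parents[2]=backend  -> 3 levels up
# After the split:   backend/packages/harness/deerflow/skills/loader.py
#   parents[0]=skills ... parents[4]=backend               -> 5 levels up
# With only 3 levels, the old code landed on backend/packages/harness/.

def skills_backend_root(loader_file: Path) -> Path:
    """Resolve the backend root from loader.py's post-split location."""
    return loader_file.parents[4]
```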

Co-Authored-By: Claude Sonnet 4.6 <noreply@anthropic.com>

* fix: update test imports from src.* to deerflow.*/app.* after harness refactor

Co-Authored-By: Claude Sonnet 4.6 <noreply@anthropic.com>

---------

Co-authored-by: Claude Opus 4.6 <noreply@anthropic.com>
2026-03-14 22:55:52 +08:00

147 lines
4.8 KiB
Bash
Executable File

#!/usr/bin/env bash
#
# config-upgrade.sh - Upgrade config.yaml to match config.example.yaml
#
# 1. Runs version-specific migrations (value replacements, renames, etc.)
# 2. Merges missing fields from the example into the user config
# 3. Backs up config.yaml to config.yaml.bak before modifying.
set -e
REPO_ROOT="$(cd "$(dirname "${BASH_SOURCE[0]}")/.." && pwd)"
EXAMPLE="$REPO_ROOT/config.example.yaml"

# Resolve config.yaml location: env var > backend/ > repo root
if [ -n "$DEER_FLOW_CONFIG_PATH" ] && [ -f "$DEER_FLOW_CONFIG_PATH" ]; then
  CONFIG="$DEER_FLOW_CONFIG_PATH"
elif [ -f "$REPO_ROOT/backend/config.yaml" ]; then
  CONFIG="$REPO_ROOT/backend/config.yaml"
elif [ -f "$REPO_ROOT/config.yaml" ]; then
  CONFIG="$REPO_ROOT/config.yaml"
else
  CONFIG=""
fi

if [ ! -f "$EXAMPLE" ]; then
  echo "✗ config.example.yaml not found at $EXAMPLE"
  exit 1
fi

if [ -z "$CONFIG" ]; then
  echo "No config.yaml found — creating from example..."
  cp "$EXAMPLE" "$REPO_ROOT/config.yaml"
  echo "✓ config.yaml created. Please review and set your API keys."
  exit 0
fi

# Use inline Python to do migrations + recursive merge with PyYAML
cd "$REPO_ROOT/backend" && uv run python3 -c "
import sys, shutil, copy, re
from pathlib import Path

import yaml

config_path = Path('$CONFIG')
example_path = Path('$EXAMPLE')

with open(config_path, encoding='utf-8') as f:
    raw_text = f.read()
user = yaml.safe_load(raw_text) or {}

with open(example_path, encoding='utf-8') as f:
    example = yaml.safe_load(f) or {}

# Coerce to int: YAML may store the version as a string (e.g. config_version: '1'),
# which would make the >= comparison below raise TypeError.
user_version = int(user.get('config_version', 0) or 0)
example_version = int(example.get('config_version', 0) or 0)

if user_version >= example_version:
    print(f'✓ config.yaml is already up to date (version {user_version}).')
    sys.exit(0)

print(f'Upgrading config.yaml: version {user_version} → {example_version}')
print()

# ── Migrations ───────────────────────────────────────────────────────────
# Each migration targets a specific version upgrade.
# 'replacements': list of (old_string, new_string) applied to the raw YAML text.
# This handles value changes that a dict merge cannot catch.
MIGRATIONS = {
    1: {
        'description': 'Rename src.* module paths to deerflow.*',
        'replacements': [
            ('src.community.', 'deerflow.community.'),
            ('src.sandbox.', 'deerflow.sandbox.'),
            ('src.models.', 'deerflow.models.'),
            ('src.tools.', 'deerflow.tools.'),
        ],
    },
    # Future migrations go here:
    # 2: {
    #     'description': '...',
    #     'replacements': [('old', 'new')],
    # },
}

# Apply migrations in order for versions (user_version, example_version]
migrated = []
for version in range(user_version + 1, example_version + 1):
    migration = MIGRATIONS.get(version)
    if not migration:
        continue
    desc = migration.get('description', f'Migration to v{version}')
    for old, new in migration.get('replacements', []):
        if old in raw_text:
            raw_text = raw_text.replace(old, new)
            migrated.append(f'{old} → {new}')

# Re-parse after text migrations
user = yaml.safe_load(raw_text) or {}

if migrated:
    print(f'Applied {len(migrated)} migration(s):')
    for m in migrated:
        print(f'  ~ {m}')
    print()

# ── Merge missing fields ─────────────────────────────────────────────────
added = []

def merge(target, source, path=''):
    \"\"\"Recursively merge source into target, adding missing keys only.\"\"\"
    for key, value in source.items():
        key_path = f'{path}.{key}' if path else key
        if key not in target:
            target[key] = copy.deepcopy(value)
            added.append(key_path)
        elif isinstance(value, dict) and isinstance(target[key], dict):
            merge(target[key], value, key_path)

merge(user, example)

# Always update config_version
user['config_version'] = example_version

# ── Write ─────────────────────────────────────────────────────────────────
backup = config_path.with_suffix('.yaml.bak')
shutil.copy2(config_path, backup)
print(f'Backed up to {backup.name}')

with open(config_path, 'w', encoding='utf-8') as f:
    yaml.dump(user, f, default_flow_style=False, allow_unicode=True, sort_keys=False)

if added:
    print(f'Added {len(added)} new field(s):')
    for a in added:
        print(f'  + {a}')

if not migrated and not added:
    print('No changes needed (version bumped only).')

print()
print(f'✓ config.yaml upgraded to version {example_version}.')
print('  Please review the changes and set any new required values.')
"