feat: add configurable log level and token usage tracking (#1301)

* feat: add configurable log level and token usage tracking

- Add `log_level` config to control deerflow module log level, synced
  to LangGraph Server via serve.sh `--server-log-level`
- Add `token_usage.enabled` config with TokenUsageMiddleware that logs
  input/output/total tokens per LLM call from usage_metadata
- Add .omc/ to .gitignore
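The middleware described above could look roughly like the following. This is a minimal sketch, not the committed implementation: the class name `TokenUsageMiddleware` and the `usage_metadata` field come from the commit message, but the hook name `after_model_call`, the logger name, and the response shape are assumptions.

```python
import logging

logger = logging.getLogger("deerflow.token_usage")


class TokenUsageMiddleware:
    """Hypothetical sketch: log token usage from an LLM response's
    usage_metadata after each model call (enabled via token_usage.enabled)."""

    def __init__(self, enabled: bool = False):
        self.enabled = enabled

    def after_model_call(self, response) -> None:
        # The feature has its own toggle, so logging at info level is fine.
        if not self.enabled:
            return
        # usage_metadata is assumed to be a dict of token counts, as on
        # LangChain chat model responses.
        usage = getattr(response, "usage_metadata", None) or {}
        logger.info(
            "token usage: input=%s output=%s total=%s",
            usage.get("input_tokens"),
            usage.get("output_tokens"),
            usage.get("total_tokens"),
        )
```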

* fix: use info level for token usage logs since feature has its own toggle

* fix: sort imports to pass lint check

---------

Co-authored-by: greatmengqi <chenmengqi.0376@bytedance.com>
Co-authored-by: Willem Jiang <willem.jiang@gmail.com>
Author: greatmengqi
Date: 2026-03-25 08:13:26 +08:00
Committed by: GitHub
Parent: 77b8ef79ca
Commit: 16ed797e0e
7 changed files with 74 additions and 3 deletions

@@ -14,6 +14,20 @@
# Run `make config-upgrade` to merge new fields into your local config.yaml.
config_version: 3
# ============================================================================
# Logging
# ============================================================================
# Log level for deerflow modules (debug/info/warning/error)
log_level: info
# ============================================================================
# Token Usage Tracking
# ============================================================================
# Track LLM token usage per model call (input/output/total tokens)
# Logs at info level via TokenUsageMiddleware
token_usage:
enabled: false
# ============================================================================
# Models Configuration
# ============================================================================
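The two new config keys above could be consumed along these lines. A minimal sketch under stated assumptions: the config dict mirrors the YAML excerpt, but the loader and exact wiring into deerflow are not shown in this diff.

```python
import logging

# Hypothetical dict as parsed from config.yaml (shape taken from the diff above)
config = {
    "log_level": "info",
    "token_usage": {"enabled": False},
}

# Map the config string (debug/info/warning/error) onto a stdlib logging level
# and apply it to the deerflow module logger; fall back to INFO if unknown.
level = getattr(logging, config.get("log_level", "info").upper(), logging.INFO)
logging.getLogger("deerflow").setLevel(level)

# The token-usage toggle defaults to off, matching `enabled: false` above.
token_usage_enabled = config.get("token_usage", {}).get("enabled", False)
```

The same `log_level` value would also be forwarded to the LangGraph Server process by serve.sh via its `--server-log-level` flag, so both sides stay in sync.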