# Memory System Improvements - Summary
## Sync Note (2026-03-10)

This summary is synchronized with the `main` branch implementation.
TF-IDF/context-aware retrieval is **planned**, not merged yet.
## Implemented
- Accurate token counting with `tiktoken` in memory injection.
- Facts are injected into `<memory>` prompt content.
- Facts are ordered by confidence and bounded by `max_injection_tokens`.
## Planned (Not Yet Merged)
- TF-IDF cosine similarity recall based on recent conversation context.
- `current_context` parameter for `format_memory_for_injection`.
- Weighted ranking (`similarity` + `confidence`).
- Runtime extraction/injection flow for context-aware fact selection.
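To make the planned ranking concrete, here is one possible shape of a weighted `similarity` + `confidence` score. None of this is merged; `rank_facts`, `cosine_similarity`, and the `alpha` weight are all hypothetical, and plain term-frequency cosine stands in for the planned TF-IDF weighting (the ranking mechanics are the same).

```python
import math
from collections import Counter


def _tf(text: str) -> Counter:
    # Bag-of-words term frequencies over lowercased whitespace tokens.
    return Counter(text.lower().split())


def cosine_similarity(a: str, b: str) -> float:
    # Term-frequency cosine; the planned version would use TF-IDF weights.
    ta, tb = _tf(a), _tf(b)
    dot = sum(ta[t] * tb[t] for t in ta)
    na = math.sqrt(sum(v * v for v in ta.values()))
    nb = math.sqrt(sum(v * v for v in tb.values()))
    return dot / (na * nb) if na and nb else 0.0


def rank_facts(facts: list[dict], current_context: str, alpha: float = 0.5) -> list[dict]:
    # Weighted score: alpha * similarity + (1 - alpha) * confidence,
    # i.e. facts relevant to the recent conversation rise above merely
    # high-confidence but off-topic facts.
    def score(fact: dict) -> float:
        sim = cosine_similarity(str(fact.get("content", "")), current_context)
        return alpha * sim + (1 - alpha) * float(fact.get("confidence", 0.0))

    return sorted(facts, key=score, reverse=True)
```

A `current_context` parameter like the one listed above would feed the second argument of `rank_facts`.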
## Why This Sync Was Needed
Earlier docs described TF-IDF behavior as already implemented, which did not match the code in `main`.
This mismatch is tracked in issue `#1059`.
## Current API Shape
```python
from typing import Any

def format_memory_for_injection(memory_data: dict[str, Any], max_tokens: int = 2000) -> str: ...
```

No `current_context` argument is currently available in `main`.
## Verification Pointers
- Implementation: `backend/src/agents/memory/prompt.py`
- Prompt assembly: `backend/src/agents/lead_agent/prompt.py`
- Regression tests: `backend/tests/test_memory_prompt_injection.py`