
🦌 DeerFlow - v2

Originated from Open Source, give back to Open Source.

A LangGraph-based AI agent backend with sandbox execution capabilities.

Quick Start

Option 1: Docker Development (Recommended)

The fastest way to get started with a consistent environment:

  1. Configure the application:

    cp config.example.yaml config.yaml
    # Edit config.yaml and set your API keys
    
  2. Initialize and start:

    make docker-init  # First time only
    make docker-dev   # Start all services
    
  3. Access: http://localhost:2026

See CONTRIBUTING.md for detailed Docker development guide.

Option 2: Local Development

If you prefer running services locally:

  1. Check prerequisites:

    make check  # Verifies Node.js 22+, pnpm, uv, nginx
    
  2. Configure and install:

    cp config.example.yaml config.yaml
    make install
    
  3. Start services:

    make dev
    
  4. Access: http://localhost:2026

See CONTRIBUTING.md for detailed local development guide.
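Both setup paths begin by copying config.example.yaml to config.yaml and filling in API keys. As a purely illustrative sketch, a minimal config.yaml might look like the following; the key names below are hypothetical placeholders, not the project's actual schema, so consult config.example.yaml for the real structure:

```yaml
# Illustrative sketch only -- key names are hypothetical.
# See config.example.yaml in the repository for the actual schema.
llm:
  api_key: "YOUR_API_KEY_HERE"   # set your model provider key
  model: "your-model-name"       # model identifier used by the agents
server:
  port: 2026                     # matches the http://localhost:2026 access URL
```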

Features

  • 🤖 LangGraph-based Agents - Multi-agent orchestration with sophisticated workflows
  • 🔧 Model Context Protocol (MCP) - Extensible tool integration
  • 🎯 Skills System - Reusable agent capabilities
  • 🛡️ Sandbox Execution - Safe code execution environment
  • 🌐 Unified API Gateway - Single entry point with nginx reverse proxy
  • 🔄 Hot Reload - Fast development iteration
  • 📊 Real-time Streaming - Server-Sent Events (SSE) support
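To make the streaming feature concrete, the sketch below shows how Server-Sent Events arrive on the wire and how a client might split the stream into events. This is a generic SSE parser for illustration, assuming nothing about DeerFlow's actual event names or payload shapes:

```python
def parse_sse(stream_text: str) -> list[dict]:
    """Split raw SSE text into events with optional 'event' and 'data' fields.

    Per the SSE format, fields are "name: value" lines and a blank line
    terminates each event. Multiple data: lines in one event are joined
    with newlines.
    """
    events = []
    current = {}
    for line in stream_text.splitlines():
        if line == "":  # blank line marks the end of an event
            if current:
                events.append(current)
                current = {}
        elif line.startswith("event:"):
            current["event"] = line[len("event:"):].strip()
        elif line.startswith("data:"):
            chunk = line[len("data:"):].strip()
            prev = current.get("data")
            current["data"] = chunk if prev is None else prev + "\n" + chunk
    if current:  # flush a trailing event with no final blank line
        events.append(current)
    return events

# Example stream with two events (event names here are made up):
raw = "event: message\ndata: Hello\n\nevent: done\ndata: [DONE]\n\n"
print(parse_sse(raw))
```

In practice a client would feed lines from the HTTP response into the same state machine incrementally rather than buffering the whole stream.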

Documentation

Contributing

We welcome contributions! Please see CONTRIBUTING.md for development setup, workflow, and guidelines.

License

This project is open source and available under the MIT License.

Acknowledgments

DeerFlow is built upon the incredible work of the open-source community. We are deeply grateful to all the projects and contributors whose efforts have made DeerFlow possible. Truly, we stand on the shoulders of giants.

We would like to extend our sincere appreciation to the following projects for their invaluable contributions:

  • LangChain: Their exceptional framework powers our LLM interactions and chains, enabling seamless integration and functionality.
  • LangGraph: Their innovative approach to multi-agent orchestration has been instrumental in enabling DeerFlow's sophisticated workflows.

These projects exemplify the transformative power of open-source collaboration, and we are proud to build upon their foundations.

Key Contributors

A heartfelt thank you goes out to the core authors of DeerFlow, whose vision, passion, and dedication have brought this project to life:

Your unwavering commitment and expertise have been the driving force behind DeerFlow's success. We are honored to have you at the helm of this journey.

Star History

Star History Chart
