ruitanglin a4268cb6d3 feat(frontend): unify citation logic and prevent half-finished citations
- Add SafeCitationContent as the single component for citation-aware body
  rendering: it combines useParsedCitations + shouldShowCitationLoading,
  showing a loading state until citations are complete, then rendering the
  body with createCitationMarkdownComponents. Supports optional
  remarkPlugins, rehypePlugins, isHuman, img.

- Refactor MessageListItem: assistant message body now uses
  SafeCitationContent only; remove duplicate useParsedCitations,
  shouldShowCitationLoading, createCitationMarkdownComponents and
  CitationsLoadingIndicator logic. Human messages keep plain
  AIElementMessageResponse (no citation parsing).

- Use SafeCitationContent for clarification, present-files (message-list),
  thinking steps and write_file loading (message-group), and subtask result
  (subtask-card). The artifact markdown preview keeps the same guard
  (shouldShowCitationLoading) with ArtifactFilePreview.

- Unify loading condition: shouldShowCitationLoading(rawContent,
  cleanContent, isLoading) is the single source of truth. Show loading when
  (isLoading && hasCitationsBlock(rawContent)) or when
  (hasCitationsBlock(rawContent) && hasUnreplacedCitationRefs(cleanContent)),
  so Pro/Ultra modes also show "loading citations" and half-finished
  [cite-N] references never appear.
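
  The unified guard described above can be sketched as follows. The helper
  names and semantics are taken from this commit message; the actual
  implementation may differ:

  ```typescript
  // Sketch of the unified loading guard, assuming the helper semantics
  // described in this commit message; not the actual source.
  function hasCitationsBlock(raw: string): boolean {
    // Treat any opening <citations tag as "this message carries citations".
    return raw.includes("<citations");
  }

  function hasUnreplacedCitationRefs(clean: string): boolean {
    // Placeholder refs like [cite-3] mean replacement has not finished.
    return /\[cite-\d+\]/.test(clean);
  }

  function shouldShowCitationLoading(
    rawContent: string,
    cleanContent: string,
    isLoading: boolean,
  ): boolean {
    return (
      (isLoading && hasCitationsBlock(rawContent)) ||
      (hasCitationsBlock(rawContent) && hasUnreplacedCitationRefs(cleanContent))
    );
  }
  ```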

- message-group write_file: replace hasCitationsBlock + threadIsLoading
  with shouldShowCitationLoading(fileContent, cleanContent,
  threadIsLoading && isLast) for consistency.

- citations/utils: parse incomplete <citations> during streaming;
  remove isCitationsBlockIncomplete; keep hasUnreplacedCitationRefs
  internal; document display rule in file header.
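
  Tolerating an unclosed <citations> block during streaming can be
  illustrated with a minimal sketch; the regex and tag name are assumptions
  based on this message, not the real citations/utils code:

  ```typescript
  // Hypothetical sketch: extract the citations payload even while the
  // closing </citations> tag has not arrived yet in the stream.
  function extractCitationsBlock(raw: string): string | null {
    // Lazily match up to </citations>, or to end-of-input if still open.
    const m = raw.match(/<citations>([\s\S]*?)(?:<\/citations>|$)/);
    return m ? m[1] : null;
  }
  ```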

Co-authored-by: Cursor <cursoragent@cursor.com>


🦌 DeerFlow - v2

Originated from Open Source, give back to Open Source.

A LangGraph-based AI agent backend with sandbox execution capabilities.

Quick Start

Option 1: Docker Development

The fastest way to get started with a consistent environment:

  1. Configure the application:

    cp config.example.yaml config.yaml
    # Edit config.yaml and set your API keys
    
  2. Initialize and start:

    make docker-init  # First time only
    make docker-dev   # Start all services
    
  3. Access: http://localhost:2026

See CONTRIBUTING.md for the detailed Docker development guide.

Option 2: Local Development

If you prefer running services locally:

  1. Check prerequisites:

    make check  # Verifies Node.js 22+, pnpm, uv, nginx
    
  2. Configure and install:

    cp config.example.yaml config.yaml
    make install
    
  3. (Optional) Pre-pull sandbox image:

    # Recommended if using Docker/Container-based sandbox
    make setup-sandbox
    
  4. Start services:

    make dev
    
  5. Access: http://localhost:2026

See CONTRIBUTING.md for the detailed local development guide.

Features

  • 🤖 LangGraph-based Agents - Multi-agent orchestration with sophisticated workflows
  • 🧠 Persistent Memory - LLM-powered context retention across conversations with automatic fact extraction
  • 🔧 Model Context Protocol (MCP) - Extensible tool integration
  • 🎯 Skills System - Reusable agent capabilities
  • 🛡️ Sandbox Execution - Safe code execution environment
  • 🌐 Unified API Gateway - Single entry point with nginx reverse proxy
  • 🔄 Hot Reload - Fast development iteration
  • 📊 Real-time Streaming - Server-Sent Events (SSE) support
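
For the real-time streaming feature above, a minimal, framework-agnostic parser sketch may help; it follows the standard SSE wire format, not DeerFlow's specific event schema (which is an assumption left undocumented here):

```typescript
// Minimal Server-Sent Events parser: splits a text chunk into messages.
// This follows the generic SSE format (event:/data: lines, blank-line
// delimited), independent of DeerFlow's actual event payloads.
function parseSSE(chunk: string): { event: string; data: string }[] {
  const messages: { event: string; data: string }[] = [];
  for (const block of chunk.split("\n\n")) {
    let event = "message"; // SSE default event name
    const dataLines: string[] = [];
    for (const line of block.split("\n")) {
      if (line.startsWith("event:")) event = line.slice(6).trim();
      else if (line.startsWith("data:")) dataLines.push(line.slice(5).trimStart());
    }
    // A message without any data: line is ignored per the SSE spec.
    if (dataLines.length) messages.push({ event, data: dataLines.join("\n") });
  }
  return messages;
}
```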


Contributing

We welcome contributions! Please see CONTRIBUTING.md for development setup, workflow, and guidelines.

License

This project is open source and available under the MIT License.

Acknowledgments

DeerFlow is built upon the incredible work of the open-source community. We are deeply grateful to all the projects and contributors whose efforts have made DeerFlow possible. Truly, we stand on the shoulders of giants.

We would like to extend our sincere appreciation to the following projects for their invaluable contributions:

  • LangChain: Their exceptional framework powers our LLM interactions and chains, enabling seamless integration and functionality.
  • LangGraph: Their innovative approach to multi-agent orchestration has been instrumental in enabling DeerFlow's sophisticated workflows.

These projects exemplify the transformative power of open-source collaboration, and we are proud to build upon their foundations.

Key Contributors

A heartfelt thank you goes out to the core authors of DeerFlow, whose vision, passion, and dedication have brought this project to life:

Your unwavering commitment and expertise have been the driving force behind DeerFlow's success. We are honored to have you at the helm of this journey.

Star History

Star History Chart
