mirror of
https://gitee.com/wanwujie/deer-flow
synced 2026-04-03 06:12:14 +08:00
* Add MiniMax as an OpenAI-compatible model provider

  MiniMax offers high-performance LLMs (M2.5, M2.5-highspeed) with 204K
  context windows. This commit adds MiniMax as a selectable provider in the
  configuration system.

  Changes:
  - Add MiniMax to SUPPORTED_MODELS with model definitions
  - Add MiniMax provider configuration in conf/config.yaml
  - Update documentation with MiniMax setup instructions

  Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>

* Update README to remove MiniMax API details

  Removed mention of MiniMax API usage and configuration examples.

---------

Co-authored-by: octo-patch <octo-patch@users.noreply.github.com>
Co-authored-by: Claude Opus 4.6 <noreply@anthropic.com>
Co-authored-by: Willem Jiang <willem.jiang@gmail.com>
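The commit registers MiniMax in a SUPPORTED_MODELS table so it can be selected like any other OpenAI-compatible provider. A minimal sketch of what such a registry and lookup might look like is below; the dict layout, the `resolve_model` helper, and the non-MiniMax provider entries are illustrative assumptions, not deer-flow's actual code.

```python
# Hypothetical provider registry, keyed by provider name.
# Only the MiniMax model names (M2.5, M2.5-highspeed) come from the commit;
# everything else here is an illustrative assumption.
SUPPORTED_MODELS = {
    "openai": ["gpt-4o", "gpt-4o-mini"],        # pre-existing provider (example)
    "minimax": ["M2.5", "M2.5-highspeed"],      # new entries added by this commit
}

def resolve_model(provider: str, model: str) -> str:
    """Validate a provider/model pair against the registry before use."""
    models = SUPPORTED_MODELS.get(provider)
    if models is None or model not in models:
        raise ValueError(f"Unsupported model {model!r} for provider {provider!r}")
    return model
```

In this shape, adding a new OpenAI-compatible provider is a one-line registry change plus a matching `base_url`/API-key entry in conf/config.yaml, which is consistent with the small diff the commit describes.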