
      Minimax

      Browse models provided by Minimax (Terms of Service)

      2 models

      [Chart: Tokens processed]

      • DeepSeek: R1

        DeepSeek R1 is here: Performance on par with OpenAI o1, but open-sourced and with fully open reasoning tokens. It's 671B parameters in size, with 37B active in an inference pass. Fully open-source model & technical report. MIT licensed: Distill & commercialize freely!

        by deepseek | 164K context | $0.55/M input tokens | $2.19/M output tokens
      • MiniMax: MiniMax-01

        MiniMax-01 combines MiniMax-Text-01 for text generation and MiniMax-VL-01 for image understanding. It has 456 billion parameters, with 45.9 billion activated per inference pass, and can handle a context of up to 4 million tokens. The text model adopts a hybrid architecture combining Lightning Attention, Softmax Attention, and Mixture-of-Experts (MoE). The image model adopts the “ViT-MLP-LLM” framework and is trained on top of the text model. To read more about the release, see: https://www.minimaxi.com/en/news/minimax-01-series-2

        by minimax | 1M context | $0.20/M input tokens | $1.10/M output tokens
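
A minimal sketch of how the per-million-token prices listed above translate into a per-request cost estimate. The prices are copied from the listing; the model slugs (e.g. `deepseek/deepseek-r1`, `minimax/minimax-01`) are assumed OpenRouter-style identifiers, and the token counts in the example are hypothetical.

```python
# Hedged cost-estimation sketch based on the listed $/M token prices.
# Model slugs are assumed identifiers, not confirmed by the listing above.
PRICES_PER_M = {
    # model slug: (input $/M tokens, output $/M tokens)
    "deepseek/deepseek-r1": (0.55, 2.19),
    "minimax/minimax-01": (0.20, 1.10),
}

def estimate_cost(model: str, input_tokens: int, output_tokens: int) -> float:
    """Return the estimated USD cost for a single request."""
    in_price, out_price = PRICES_PER_M[model]
    return (input_tokens / 1_000_000) * in_price + (output_tokens / 1_000_000) * out_price

if __name__ == "__main__":
    # Hypothetical example: a 20K-token prompt with a 2K-token completion on MiniMax-01
    print(f"${estimate_cost('minimax/minimax-01', 20_000, 2_000):.4f}")  # -> $0.0062
```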