    Mistral: Ministral 8B

    mistralai/ministral-8b

    Created Oct 17, 2024 · 131,072 context
    $0.10/M input tokens · $0.10/M output tokens
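
    A quick way to sanity-check spend at these rates is to price a request directly from its token counts. The sketch below assumes the listed $0.10 per million tokens for both input and output; the token counts in the example are illustrative.

    ```python
    # Rough per-request cost at the listed rates: $0.10 per 1M input tokens
    # and $0.10 per 1M output tokens.
    INPUT_PRICE_PER_M = 0.10   # USD per 1,000,000 input tokens
    OUTPUT_PRICE_PER_M = 0.10  # USD per 1,000,000 output tokens

    def request_cost(input_tokens: int, output_tokens: int) -> float:
        """Estimated USD cost of a single request."""
        return (input_tokens * INPUT_PRICE_PER_M
                + output_tokens * OUTPUT_PRICE_PER_M) / 1_000_000

    # Example: a 2,000-token prompt with a 500-token completion.
    print(f"${request_cost(2_000, 500):.6f}")  # -> $0.000250
    ```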

    Ministral 8B is an 8B parameter model featuring a unique interleaved sliding-window attention pattern for faster, memory-efficient inference. Designed for edge use cases, it supports up to 128k context length and excels in knowledge and reasoning tasks. It outperforms peers in the sub-10B category, making it perfect for low-latency, privacy-first applications.
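
    As a rough illustration of why a sliding-window pattern saves memory: each query position attends only to a fixed window of recent tokens instead of the full prefix, so the attention mask (and KV cache pressure) grows with the window size rather than the sequence length. The sketch below builds a plain local causal mask; the window size and the exact interleaving schedule across layers are assumptions, not details published on this page.

    ```python
    import numpy as np

    def sliding_window_mask(seq_len: int, window: int) -> np.ndarray:
        """Boolean mask where query i may attend to keys j with i - window < j <= i,
        i.e. itself plus the previous window - 1 tokens (causal and local)."""
        i = np.arange(seq_len)[:, None]
        j = np.arange(seq_len)[None, :]
        return (j <= i) & (j > i - window)

    # Tiny example: 6 tokens, window of 3. Each row has at most 3 ones.
    print(sliding_window_mask(6, 3).astype(int))
    ```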

    Providers for Ministral 8B

    OpenRouter routes requests to the best providers that are able to handle your prompt size and parameters, with fallbacks to maximize uptime.
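
    As a concrete example, the request below calls this model through OpenRouter's OpenAI-compatible chat completions endpoint; provider selection and fallbacks are handled server-side. The environment variable name for the API key is an assumption made for this sketch.

    ```python
    import os
    import requests

    # Chat completion against mistralai/ministral-8b via OpenRouter.
    # OPENROUTER_API_KEY is an assumed environment variable name.
    resp = requests.post(
        "https://openrouter.ai/api/v1/chat/completions",
        headers={"Authorization": f"Bearer {os.environ['OPENROUTER_API_KEY']}"},
        json={
            "model": "mistralai/ministral-8b",
            "messages": [
                {"role": "user", "content": "Explain sliding-window attention in one sentence."}
            ],
        },
        timeout=60,
    )
    resp.raise_for_status()
    print(resp.json()["choices"][0]["message"]["content"])
    ```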