
    Mistral: Ministral 8B

    mistralai/ministral-8b

    Created Oct 17, 2024 · 131,072 token context
    $0.10/M input tokens · $0.10/M output tokens

    Ministral 8B is an 8B-parameter model featuring an interleaved sliding-window attention pattern for faster, memory-efficient inference. Designed for edge use cases, it supports up to 128k context length and excels at knowledge and reasoning tasks. It outperforms peers in the sub-10B category, making it well suited to low-latency, privacy-first applications.

    Providers for Ministral 8B

    OpenRouter routes requests to the best providers that are able to handle your prompt size and parameters, with fallbacks to maximize uptime.
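
    If you want to steer that routing yourself, you can include provider preferences in the request body. The sketch below is a rough illustration in Python using the requests library and an OPENROUTER_API_KEY environment variable; the "provider" options shown (order, allow_fallbacks) follow OpenRouter's provider-routing settings, but treat the exact field names as an assumption and confirm them against the provider routing docs.

        import os
        import requests

        # Illustrative only: ask OpenRouter to prefer a given provider for
        # Ministral 8B and allow fallbacks to others if it is unavailable.
        # The "provider" field names are assumptions; check the provider
        # routing docs for the authoritative options.
        response = requests.post(
            "https://openrouter.ai/api/v1/chat/completions",
            headers={"Authorization": f"Bearer {os.environ['OPENROUTER_API_KEY']}"},
            json={
                "model": "mistralai/ministral-8b",
                "messages": [
                    {"role": "user", "content": "Summarize sliding-window attention in one sentence."}
                ],
                "provider": {
                    "order": ["Mistral"],     # providers to try first
                    "allow_fallbacks": True,  # let OpenRouter fall back if needed
                },
            },
            timeout=60,
        )
        response.raise_for_status()
        print(response.json()["choices"][0]["message"]["content"])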

    Performance for Ministral 8B

    Compare different providers across OpenRouter

    Apps using Ministral 8B

    Top public apps this week using this model

    Recent activity on Ministral 8B

    Total usage per day on OpenRouter

    Uptime stats for Ministral 8B

    Uptime across all providers

    Sample code and API for Ministral 8B

    OpenRouter normalizes requests and responses across providers for you.

    OpenRouter provides an OpenAI-compatible completions API for 400+ models and providers, which you can call directly or through the OpenAI SDK. Several third-party SDKs are also available.

    In the examples below, the OpenRouter-specific headers are optional. Setting them allows your app to appear on the OpenRouter leaderboards.
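
    Below is a minimal sketch using the OpenAI Python SDK pointed at OpenRouter's base URL. The HTTP-Referer and X-Title values are placeholders for your own app; they are the optional attribution headers mentioned above, and the request works the same if you omit them.

        import os
        from openai import OpenAI

        # Use the OpenAI SDK against OpenRouter's OpenAI-compatible endpoint.
        client = OpenAI(
            base_url="https://openrouter.ai/api/v1",
            api_key=os.environ["OPENROUTER_API_KEY"],
        )

        completion = client.chat.completions.create(
            model="mistralai/ministral-8b",
            messages=[
                {"role": "user", "content": "Explain interleaved sliding-window attention in two sentences."}
            ],
            # Optional OpenRouter headers: set these (with your own URL and app
            # name) to have your app appear on the OpenRouter leaderboards.
            extra_headers={
                "HTTP-Referer": "https://example.com",  # placeholder site URL
                "X-Title": "My App",                    # placeholder app name
            },
        )

        print(completion.choices[0].message.content)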

    Using third-party SDKs

    For information about using third-party SDKs and frameworks with OpenRouter, please see our frameworks documentation.

    See the Request docs for all possible request fields, and the Parameters docs for explanations of specific sampling parameters.
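
    As a quick illustration of where those sampling parameters go, the sketch below sends a direct HTTP request with temperature, top_p, and max_tokens set in the request body. The values are arbitrary, and the Python requests library plus the OPENROUTER_API_KEY environment variable are conventions of this example rather than requirements of the API.

        import os
        import requests

        # Common sampling parameters sit alongside the model and messages in
        # the JSON body; see the Parameters docs for their exact semantics.
        response = requests.post(
            "https://openrouter.ai/api/v1/chat/completions",
            headers={"Authorization": f"Bearer {os.environ['OPENROUTER_API_KEY']}"},
            json={
                "model": "mistralai/ministral-8b",
                "messages": [
                    {"role": "user", "content": "Write a haiku about edge inference."}
                ],
                "temperature": 0.7,  # sampling randomness (illustrative value)
                "top_p": 0.9,        # nucleus sampling cutoff (illustrative value)
                "max_tokens": 128,   # cap on generated tokens (illustrative value)
            },
            timeout=60,
        )
        response.raise_for_status()
        print(response.json()["choices"][0]["message"]["content"])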