Mistral: Ministral 8B

mistralai/ministral-8b

Created Oct 17, 2024 · 128,000 context
$0.10/M input tokens · $0.10/M output tokens

Ministral 8B is an 8B-parameter model featuring an interleaved sliding-window attention pattern for faster, memory-efficient inference. Designed for edge use cases, it supports up to 128k context length and excels in knowledge and reasoning tasks. It outperforms peers in the sub-10B category, making it a strong fit for low-latency, privacy-first applications.
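To illustrate the attention pattern mentioned above, here is a minimal sketch of a causal sliding-window mask. This is not Mistral's actual implementation; it only shows the core idea that each query position attends to at most the previous `window` tokens (itself included), which bounds memory cost regardless of sequence length. In interleaved variants, window sizes alternate across layers.

```python
def sliding_window_mask(seq_len, window):
    """Return a seq_len x seq_len boolean mask; True = attention allowed.

    Query position q may attend to key positions k with q - window < k <= q:
    causal (no future tokens) and limited to the last `window` positions.
    """
    return [
        [q - window < k <= q for k in range(seq_len)]
        for q in range(seq_len)
    ]

mask = sliding_window_mask(6, 3)
# With window=3, position 5 attends only to positions 3, 4, and 5.
```

Because each row of the mask has at most `window` True entries, the key/value cache for such a layer can be kept to a fixed size instead of growing with the full context.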

Recent activity on Ministral 8B

Tokens processed per day

[Chart: daily tokens processed, Jan 29 – Apr 29, 0–220M]