Ministral 3B
mistralai/ministral-3b
Created Oct 17 · 128,000 context
$0.04/M input tokens · $0.04/M output tokens
Ministral 3B is a 3B-parameter model optimized for on-device and edge computing. It excels at knowledge, commonsense reasoning, and function-calling, outperforming larger models such as Mistral 7B on most benchmarks. With support for up to a 128k context length, it is well suited to orchestrating agentic workflows and handling specialist tasks with efficient inference.
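Since the card highlights function-calling and agentic use, here is a minimal sketch of exercising that capability through an OpenAI-compatible endpoint, assumed here to be OpenRouter's `https://openrouter.ai/api/v1`. The `OPENROUTER_API_KEY` environment variable and the `get_weather` tool are illustrative assumptions, not part of this listing.

```python
# Sketch: function-calling with Ministral 3B via an assumed OpenAI-compatible
# endpoint (OpenRouter). Requires the `openai` Python package (>=1.0).
import os
from openai import OpenAI

client = OpenAI(
    base_url="https://openrouter.ai/api/v1",   # assumed OpenAI-compatible base URL
    api_key=os.environ["OPENROUTER_API_KEY"],  # assumed env var holding your key
)

# A hypothetical tool definition; the model decides whether to call it.
tools = [{
    "type": "function",
    "function": {
        "name": "get_weather",
        "description": "Look up the current weather for a city.",
        "parameters": {
            "type": "object",
            "properties": {"city": {"type": "string"}},
            "required": ["city"],
        },
    },
}]

response = client.chat.completions.create(
    model="mistralai/ministral-3b",
    messages=[{"role": "user", "content": "What's the weather in Paris right now?"}],
    tools=tools,
)

# If the model chose to call the tool, the call name and arguments appear here.
print(response.choices[0].message.tool_calls)
```

In an agentic loop, the returned tool call would be executed locally and its result appended to `messages` for a follow-up request; the model's small footprint and long context make that loop cheap to run on constrained hardware.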