AI21: Jamba Mini 1.6

ai21/jamba-1.6-mini

Created Mar 13, 2025 · 256,000 context
$0.20/M input tokens · $0.40/M output tokens

AI21 Jamba Mini 1.6 is a hybrid foundation model combining State Space Models (Mamba) with Transformer attention mechanisms. With 12 billion active parameters (52 billion total), this model excels in extremely long-context tasks (up to 256K tokens) and achieves superior inference efficiency, outperforming comparable open models on tasks such as retrieval-augmented generation (RAG) and grounded question answering. Jamba Mini 1.6 supports multilingual tasks across English, Spanish, French, Portuguese, Italian, Dutch, German, Arabic, and Hebrew, along with structured JSON output and tool-use capabilities.
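Since the model is served through OpenRouter's chat-completions endpoint, a request for it can be sketched as follows. This is a minimal illustration, assuming the standard OpenRouter `POST /api/v1/chat/completions` interface; the `response_format` field requests the structured JSON output mode mentioned above, and the placeholder API key is hypothetical.

```python
import json

OPENROUTER_URL = "https://openrouter.ai/api/v1/chat/completions"
MODEL = "ai21/jamba-1.6-mini"

def build_request(prompt: str, json_output: bool = False) -> dict:
    """Build a chat-completions payload for the OpenRouter API."""
    payload = {
        "model": MODEL,
        "messages": [{"role": "user", "content": prompt}],
    }
    if json_output:
        # Jamba Mini 1.6 supports structured JSON output
        payload["response_format"] = {"type": "json_object"}
    return payload

# Sending the request requires an OpenRouter API key, e.g.:
# import requests
# resp = requests.post(
#     OPENROUTER_URL,
#     headers={"Authorization": "Bearer <OPENROUTER_API_KEY>"},
#     json=build_request("Summarize this document.", json_output=True),
# )
# print(resp.json()["choices"][0]["message"]["content"])

payload = build_request("List three colors as JSON.", json_output=True)
print(json.dumps(payload, indent=2))
```

The 256K-token context window means very long documents can be passed directly in the `messages` content without chunking, which is where the RAG and grounded-QA strengths noted above come into play.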

Usage of this model is subject to the Jamba Open Model License.
