Mistral: Mixtral 8x22B (base)

mistralai/mixtral-8x22b

Created Apr 10, 2024 · 65,536-token context
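
The slug above (`mistralai/mixtral-8x22b`) is the model identifier used with OpenRouter's OpenAI-compatible chat-completions endpoint. A minimal sketch of a request, assuming a placeholder API key and illustrative parameters:

```python
import requests

API_KEY = "sk-or-..."  # placeholder; substitute your own OpenRouter key

response = requests.post(
    "https://openrouter.ai/api/v1/chat/completions",
    headers={"Authorization": f"Bearer {API_KEY}"},
    json={
        "model": "mistralai/mixtral-8x22b",
        # Base model: the prompt is continued as raw text,
        # not followed as an instruction.
        "messages": [{"role": "user", "content": "The Mixtral architecture works by"}],
        "max_tokens": 64,  # illustrative value
    },
    timeout=60,
)
response.raise_for_status()
print(response.json()["choices"][0]["message"]["content"])
```

Because this is the base model rather than an instruct tune, the prompt is treated as text to continue rather than an instruction to follow.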

Mixtral 8x22B is a large-scale sparse mixture-of-experts (MoE) language model from Mistral AI. It consists of 8 experts of 22 billion parameters each; for every token, a router selects 2 of the 8 experts, so roughly 39B of the model's 141B total parameters are active at a time.
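
To make the routing concrete, here is a toy NumPy sketch of top-2 expert selection. The sizes, random weights, and ReLU expert MLPs are illustrative stand-ins, not Mixtral's actual architecture; only the select-2-of-8-then-mix pattern reflects the description above.

```python
import numpy as np

rng = np.random.default_rng(0)

NUM_EXPERTS, TOP_K = 8, 2   # 8 experts, 2 active per token
D_MODEL, D_FF = 16, 64      # toy sizes; the real model is far larger

# Toy expert MLPs and a router, randomly initialized for illustration.
experts_w1 = rng.normal(size=(NUM_EXPERTS, D_MODEL, D_FF))
experts_w2 = rng.normal(size=(NUM_EXPERTS, D_FF, D_MODEL))
router_w = rng.normal(size=(D_MODEL, NUM_EXPERTS))

def moe_layer(x: np.ndarray) -> np.ndarray:
    """Route one token vector x through its top-2 experts and mix the outputs."""
    logits = x @ router_w                # score every expert for this token
    top_k = np.argsort(logits)[-TOP_K:]  # indices of the 2 highest-scoring experts
    weights = np.exp(logits[top_k] - logits[top_k].max())
    weights /= weights.sum()             # softmax over only the selected experts
    out = np.zeros_like(x)
    for w, e in zip(weights, top_k):
        hidden = np.maximum(x @ experts_w1[e], 0.0)  # toy ReLU MLP expert
        out += w * (hidden @ experts_w2[e])
    return out

token = rng.normal(size=D_MODEL)
print(moe_layer(token).shape)  # (16,) -- same shape as the input token vector
```

Only the 2 selected experts' weights touch each token, which is what keeps the per-token compute far below the full parameter count.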

It was released via a torrent link posted on X (formerly Twitter).

#moe