
# Mixtral 8x22B (base)


## Model Information

- **Slug:** mixtral-8x22b
- **Aliases:** mixtral-8x22b, mixtral8x22b
- **Organization:** Mistral ([https://mistral.ai/](https://mistral.ai/))

Mixtral 8x22B is a large-scale sparse mixture-of-experts language model from Mistral AI. It combines 8 experts of 22 billion parameters each, routing each token through 2 experts at a time. It was released via [X](https://twitter.com/MistralAI/status/1777869263778291896). #moe
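A minimal sketch of the top-2 expert routing described above, in plain NumPy; the dimensions, linear gate, and random "experts" are illustrative assumptions, not Mistral's implementation:

```python
import numpy as np

def top2_moe_layer(x, gate_w, experts):
    """Route a token vector x through the 2 highest-scoring experts.

    x       : (d,) token hidden state
    gate_w  : (d, num_experts) router weights (assumed linear gate)
    experts : list of callables, one per expert FFN
    """
    logits = x @ gate_w                               # one router score per expert
    top2 = np.argsort(logits)[-2:]                    # indices of the 2 best experts
    weights = np.exp(logits[top2] - logits[top2].max())
    weights /= weights.sum()                          # softmax over the selected pair
    # Output is the gate-weighted sum of the two active experts only;
    # the other experts are never evaluated for this token.
    return sum(w * experts[i](x) for w, i in zip(weights, top2))

# Toy usage: 8 "experts" that are just random linear maps.
rng = np.random.default_rng(0)
d, num_experts = 16, 8
experts = [
    (lambda W: (lambda x: x @ W))(rng.normal(size=(d, d)))
    for _ in range(num_experts)
]
gate_w = rng.normal(size=(d, num_experts))
y = top2_moe_layer(rng.normal(size=d), gate_w, experts)
print(y.shape)  # (16,)
```

Because only 2 of the 8 expert FFNs run per token, the active parameter count per token is far below the full model size.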

## Available at 1 Provider

| Provider | Model Name | Original Model | Input ($/1M) | Output ($/1M) | Free |
|---|---|---|---|---|---|
| Fireworks AI | Mixtral Moe 8x22B | mixtral-8x22b | $1.20 | - | - |
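Fireworks AI serves models through an OpenAI-compatible API, so a completion call might look like the sketch below. The base URL follows Fireworks' documented pattern, but the exact model identifier (`accounts/fireworks/models/mixtral-8x22b`) is an assumption and should be checked against the provider's catalog:

```python
from openai import OpenAI

# Fireworks AI exposes an OpenAI-compatible endpoint; in a real script
# the API key would come from the environment, not a literal string.
client = OpenAI(
    base_url="https://api.fireworks.ai/inference/v1",
    api_key="YOUR_FIREWORKS_API_KEY",
)

# Mixtral 8x22B is a base (non-instruct) model, so the plain
# completions endpoint is the natural fit.
response = client.completions.create(
    model="accounts/fireworks/models/mixtral-8x22b",  # assumed identifier
    prompt="The Mixtral 8x22B architecture works by",
    max_tokens=64,
)
print(response.choices[0].text)
```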