Mistral • mixtral-8x22b
| Slug | mixtral-8x22b |
|---|---|
| Aliases | mixtral-8x22b, mixtral8x22b |

| Name | Mistral |
|---|---|
| Website | https://mistral.ai/ |
Mixtral 8x22B is a large-scale mixture-of-experts language model from Mistral AI. It consists of 8 experts of roughly 22 billion parameters each, with each token routed through 2 experts at a time. It was announced via [X](https://twitter.com/MistralAI/status/1777869263778291896). #moe
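The "2 of 8 experts per token" design can be illustrated with a toy top-2 routing layer. This is a minimal sketch of the general sparse mixture-of-experts idea, not Mixtral's actual implementation; the dimensions, weight initialization, and helper names here are all illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

NUM_EXPERTS = 8  # Mixtral 8x22B has 8 experts per MoE layer
TOP_K = 2        # each token uses 2 experts at a time
D_MODEL = 16     # toy hidden size; the real model is far larger

# Toy single-matrix "experts" standing in for full feed-forward blocks.
experts = [rng.standard_normal((D_MODEL, D_MODEL)) * 0.02
           for _ in range(NUM_EXPERTS)]
router = rng.standard_normal((D_MODEL, NUM_EXPERTS)) * 0.02


def moe_layer(x: np.ndarray) -> np.ndarray:
    """Route each token to its top-2 experts and mix their outputs."""
    logits = x @ router                            # (tokens, experts)
    top = np.argsort(logits, axis=-1)[:, -TOP_K:]  # top-2 expert indices
    out = np.zeros_like(x)
    for t in range(x.shape[0]):
        sel = logits[t, top[t]]
        weights = np.exp(sel - sel.max())
        weights /= weights.sum()                   # softmax over chosen experts
        for w, e in zip(weights, top[t]):
            out[t] += w * (x[t] @ experts[e])      # weighted expert outputs
    return out


tokens = rng.standard_normal((4, D_MODEL))
y = moe_layer(tokens)
print(y.shape)  # (4, 16)
```

Because only 2 of the 8 experts run per token, the layer's compute cost scales with the active parameters rather than the total parameter count, which is the main appeal of the MoE design.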
| Provider | Model Name | Original Model | Input ($/1M) | Output ($/1M) | Free |
|---|---|---|---|---|---|
| Fireworks AI | Mixtral MoE 8x22B | mixtral-8x22b | $1.20 | - | |