Model Information
| Slug | mixtral-8x22b |
|---|---|
Organization
| Name | Mistral |
|---|---|
| Website | https://mistral.ai/ |
Model Description
Mixtral 8x22B is a large-scale sparse mixture-of-experts (MoE) language model from Mistral AI. It consists of 8 experts of 22 billion parameters each, and each token is routed to 2 experts at a time.
It was released via [X](https://twitter.com/MistralAI/status/1777869263778291896).
#moe
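As a rough illustration of the routing described above, here is a minimal sketch of top-2 mixture-of-experts selection. The gate shape, the simple linear experts, and the names (`moe_layer`, `gate_w`) are illustrative assumptions, not Mixtral's actual implementation.

```python
import numpy as np

def moe_layer(x, gate_w, experts, top_k=2):
    # Router scores one logit per expert for this token.
    logits = gate_w @ x
    # Keep only the top-k experts (k=2 in Mixtral 8x22B).
    top = np.argsort(logits)[-top_k:]
    # Softmax over the selected experts' logits gives the mixing weights.
    weights = np.exp(logits[top] - logits[top].max())
    weights /= weights.sum()
    # Only the chosen experts run, so per-token compute scales with
    # top_k, not with the total number of experts.
    return sum(w * experts[i](x) for w, i in zip(weights, top))

# Toy usage: 8 experts, 2 active per token, mirroring the description above.
rng = np.random.default_rng(0)
d, n_experts = 16, 8
experts = [lambda x, W=rng.standard_normal((d, d)): W @ x
           for _ in range(n_experts)]
gate_w = rng.standard_normal((n_experts, d))
y = moe_layer(rng.standard_normal(d), gate_w, experts)
print(y.shape)  # (16,)
```

Because only 2 of the 8 experts run per token, inference cost is far below that of a dense model with the same total parameter count.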
Available at 2 Providers
| Provider | Type | Model Name | Original Model | Input ($/1M) | Output ($/1M) | Free |
|---|---|---|---|---|---|---|
| Fireworks AI | | Mixtral Moe 8x22B | mixtral-8x22b | $1.20 | $1.20 | |
| Writingmate | Chat, Code | Mistral: Mixtral 8x22B (base) | mistralai/mixtral-8x22b | - | - | |
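For reference, a hedged sketch of querying the Fireworks AI listing above through an OpenAI-compatible client. The base URL and the fully qualified model path follow Fireworks' usual conventions and are assumptions, not values taken from this page; verify both against the provider's documentation.

```python
# Minimal sketch, assuming Fireworks AI exposes this model through its
# OpenAI-compatible endpoint. The base_url and the
# "accounts/fireworks/models/..." path are assumptions, not taken from
# the table above.
from openai import OpenAI

client = OpenAI(
    base_url="https://api.fireworks.ai/inference/v1",  # assumed endpoint
    api_key="YOUR_FIREWORKS_API_KEY",
)
resp = client.chat.completions.create(
    model="accounts/fireworks/models/mixtral-8x22b",   # assumed model path
    messages=[{"role": "user", "content": "Summarize Mixtral 8x22B in one line."}],
)
print(resp.choices[0].message.content)
```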