Mixtral 8x7B Instruct

Model Information
Slug: mixtral-8x7b-instruct
Release Date: December 10, 2023
Organization: Mistral

Model Description
Mixtral 8x7B Instruct is a pretrained generative Sparse Mixture of Experts model by Mistral AI, fine-tuned for chat and instruction following. It incorporates 8 experts (feed-forward networks) per layer, for a total of 47 billion parameters. #moe
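To illustrate the Sparse Mixture of Experts idea behind Mixtral's "8 experts, 2 active per token" design, here is a minimal sketch of top-2 gating. This is an assumption-laden toy (plain Python, scalar "experts"), not Mistral's implementation:

```python
import math

# Toy sketch of sparse MoE routing: 8 experts, top-2 active per token.
# Not Mistral's code; expert functions here are arbitrary placeholders.
NUM_EXPERTS = 8
TOP_K = 2

def softmax(xs):
    m = max(xs)
    exps = [math.exp(x - m) for x in xs]
    s = sum(exps)
    return [e / s for e in exps]

def route(router_logits):
    """Select the top-2 experts for one token and renormalize their gate weights."""
    top = sorted(range(NUM_EXPERTS), key=lambda i: router_logits[i], reverse=True)[:TOP_K]
    gates = softmax([router_logits[i] for i in top])
    return list(zip(top, gates))

def moe_layer(token, router_logits, experts):
    """Output is the gate-weighted sum of only the selected experts' outputs;
    the other 6 experts are never evaluated, which is why 47B total parameters
    cost far less compute per token than a dense model of the same size."""
    return sum(g * experts[i](token) for i, g in route(router_logits))
```

Only the selected experts run, so active parameters per token (~13B for Mixtral) stay well below the 47B total.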
Available at 9 Providers

| Provider | Type | Model Name | Original Model | Input ($/1M) | Output ($/1M) |
|---|---|---|---|---|---|
| Blackbox AI | Code | Mistral: Mixtral 8x7B Instruct | mistralai/mixtral-8x7b-instruct | $0.08 | $0.24 |
| Fireworks AI | - | Mixtral MoE 8x7B Instruct | mixtral-8x7b-instruct | $0.50 | $0.50 |
| OpenRouter | Chat, Code | Mixtral 8x7B Instruct | mistralai/mixtral-8x7b-instruct | $0.54 | $0.54 |
| Kilo Code | Code | Mistral: Mixtral 8x7B Instruct | mistralai/mixtral-8x7b-instruct | $0.54 | $0.54 |
| WaveSpeed AI | Chat, Code | mixtral-8x7b-instruct | mistralai/mixtral-8x7b-instruct | $0.59 | $0.59 |
| ValorGPT | - | Mixtral 8x7B Instruct | mistralai-mixtral-8x7b-instruct | - | - |
| Yupp | Chat | Mixtral 8x7B Instruct (OpenRouter) | mistralai/mixtral-8x7b-instruct | - | - |
| LangDB | - | mixtral-8x7b-instruct | mixtral-8x7b-instruct | - | - |
| Writingmate | Chat, Code | Mistral: Mixtral 8x7B Instruct | mistralai/mixtral-8x7b-instruct | - | - |
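To compare the providers that publish prices, a small helper can turn the table's per-1M-token rates into a per-request cost estimate. The provider names and rates below are copied from the table above; everything else is illustrative:

```python
# Per-1M-token rates (USD) for providers with published pricing, from the table above.
PRICES = {  # provider: (input $/1M tokens, output $/1M tokens)
    "Blackbox AI": (0.08, 0.24),
    "Fireworks AI": (0.50, 0.50),
    "OpenRouter": (0.54, 0.54),
    "Kilo Code": (0.54, 0.54),
    "WaveSpeed AI": (0.59, 0.59),
}

def cost(provider, input_tokens, output_tokens):
    """Estimated USD cost of one request at the listed rates."""
    inp_rate, out_rate = PRICES[provider]
    return (input_tokens * inp_rate + output_tokens * out_rate) / 1_000_000

# Example: 1M input tokens + 500k output tokens on Blackbox AI:
# 1.0 * $0.08 + 0.5 * $0.24 = $0.20
```

Note the output-token rate dominates for generation-heavy workloads, which is why Blackbox AI's asymmetric $0.08/$0.24 pricing can still beat the flat $0.50-$0.59 rates.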