# Ministral 8B

Ministral 8B is an 8B-parameter model featuring a unique interleaved sliding-window attention pattern for faster, memory-efficient inference. Designed for edge use cases, it supports up to 128k context length and excels in knowledge and reasoning tasks. It outperforms peers in the sub-10B category, making it well suited for low-latency, privacy-first applications.

## Model Information

- **Organization**: [Mistral](/llm.txt)
- **Slug**: ministral-8b
- **Available at Providers**: 7

## Providers

| Provider | Name | $ Input (per 1M) | $ Output (per 1M) | Free | Link |
|----------|------|------------------|-------------------|------|------|
| [ValorGPT](/llm/valorgpt.txt) | Ministral 8B | | | | [View](https://www.valorgpt.com/models/mistralai-ministral-8b) |
| [Vercel AI Gateway](/llm/vercel.txt) | Ministral 8B | 0.10 | 0.10 | | |
| [LangDB](/llm/langdb.txt) | ministral-8b | | | | [View](https://langdb.ai/app/models) |
| [Blackbox AI](/llm/blackboxai.txt) | Mistral: Ministral 8B | 0.10 | 0.10 | | |
| [Okara](/llm/okara.txt) | Ministral 8B | | | | [View](https://okara.ai/ai-models/ministral-8b) |
| [WaveSpeed AI](/llm/wavespeed.txt) | ministral-8b | 0.11 | 0.11 | | |
| [Writingmate](/llm/writingmate.txt) | Mistral: Ministral 8B | | | | [View](https://writingmate.ai/models/mistralai/ministral-8b) |

---

[← Back to all providers](/llm.txt)