# Ministral 3B

Ministral 3B is a 3B-parameter model optimized for on-device and edge computing. It excels at knowledge, commonsense reasoning, and function calling, outperforming larger models such as Mistral 7B on most benchmarks. With support for up to 128k context length, it is well suited to orchestrating agentic workflows and specialist tasks with efficient inference.

## Model Information

- **Organization**: [Azure](/llm.txt)
- **Slug**: ministral-3b
- **Available at Providers**: 11

## Providers

| Provider | Name | Input ($/1M tokens) | Output ($/1M tokens) | Free | Link |
|----------|------|---------------------|----------------------|------|------|
| [GitHub Models](/llm/githubmodels.txt) | Ministral 3B | 0.00 | 0.00 | Yes | |
| [Azure OpenAI](/llm/azure.txt) | Ministral 3B | 0.04 | 0.04 | | |
| [Azure AI Services](/llm/azurecognitiveservices.txt) | Ministral 3B | 0.04 | 0.04 | | |
| [ValorGPT](/llm/valorgpt.txt) | Ministral 3B | | | | [View](https://www.valorgpt.com/models/mistralai-ministral-3b) |
| [Vercel AI Gateway](/llm/vercel.txt) | Ministral 3B | 0.04 | 0.04 | | |
| [Glama](/llm/glama.txt) | ministral-3b-2410 | 0.04 | 0.04 | | [View](https://glama.ai/gateway/models/ministral-3b-2410) |
| [LangDB](/llm/langdb.txt) | ministral-3b | | | | [View](https://langdb.ai/app/models) |
| [Blackbox AI](/llm/blackboxai.txt) | Mistral: Ministral 3B | 0.04 | 0.04 | | |
| [Okara](/llm/okara.txt) | Ministral 3B | | | | [View](https://okara.ai/ai-models/ministral-3b) |
| [WaveSpeed AI](/llm/wavespeed.txt) | ministral-3b | 0.04 | 0.04 | | |
| [Writingmate](/llm/writingmate.txt) | Mistral: Ministral 3B | | | | [View](https://writingmate.ai/models/mistralai/ministral-3b) |

---

[← Back to all providers](/llm.txt)
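At the $0.04 per 1M token rates most providers list above, per-request cost is simple arithmetic. A minimal sketch (the rates are taken from the table; the example token counts are hypothetical):

```python
# Rates from the provider table above: $0.04 per 1M input tokens
# and $0.04 per 1M output tokens (the GitHub Models tier is free).
INPUT_RATE = 0.04 / 1_000_000   # dollars per input token
OUTPUT_RATE = 0.04 / 1_000_000  # dollars per output token

def estimate_cost(input_tokens: int, output_tokens: int) -> float:
    """Estimated cost in dollars for a single request."""
    return input_tokens * INPUT_RATE + output_tokens * OUTPUT_RATE

# Example: a 100k-token prompt (within the 128k context window)
# producing a 2k-token reply.
print(f"${estimate_cost(100_000, 2_000):.6f}")  # → $0.004080
```

Even a near-context-window prompt stays well under a cent, which is the kind of margin that makes the model practical for high-volume agentic workflows.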