# GPT-5 mini

GPT-5 Mini is a compact version of GPT-5, designed to handle lighter-weight reasoning tasks. It provides the same instruction-following and safety-tuning benefits as GPT-5, but with reduced latency and cost. GPT-5 Mini is the successor to OpenAI's o4-mini model.

## Model Information

- **Organization**: [OpenAI](/llm.txt)
- **Slug**: gpt-5-mini
- **Available at Providers**: 53
- **Release Date**: August 7, 2025

### Benchmark Scores

- Weekly: 0.84
- AIME 2025: 0.911
- HLE: 0.167
- GPQA: 0.823

## Providers

| Provider | Name | Input $ (per 1M tokens) | Output $ (per 1M tokens) | Free | Link |
|----------|------|-------------------------|--------------------------|------|------|
| [AIHubMix](/llm/aihubmix.txt) | gpt-5-mini | 0.25 | 2.00 | | [View](https://aihubmix.com/model/gpt-5-mini) |
| [AIMLAPI](/llm/aimlapi.txt) | GPT-5 mini | 0.33 | 2.60 | | |
| [FastRouter](/llm/fastrouter.txt) | OpenAI: GPT-5-Mini | 0.25 | 2.00 | | [View](https://fastrouter.ai/models/openai/gpt-5-mini) |
| [Helicone](/llm/helicone.txt) | OpenAI GPT-5 Mini | 0.25 | 2.00 | | [View](https://www.helicone.ai/model/gpt-5-mini) |
| [Helicone](/llm/helicone.txt) | OpenAI GPT-5 Mini | 0.25 | 2.00 | | [View](https://www.helicone.ai/model/gpt-5-mini-2025-08-07) |
| [Mammouth AI](/llm/mammouth.txt) | gpt-5-mini | 0.25 | 2.00 | | |
| [Firmware](/llm/firmware.txt) | GPT-5 Mini | 0.25 | 2.00 | | |
| [GitHub Copilot](/llm/githubcopilot.txt) | GPT-5-mini | 0.00 | 0.00 | Yes | |
| [Abacus](/llm/abacus.txt) | GPT-5 Mini | 0.25 | 2.00 | | |
| [Azure OpenAI](/llm/azure.txt) | GPT-5 Mini | 0.25 | 2.00 | | |
| [OpenAI](/llm/openai.txt) | GPT-5 Mini | 0.25 | 2.00 | | |
| [SAP AI Core](/llm/sapaicore.txt) | gpt-5-mini | 0.25 | 2.00 | | |
| [Azure AI Services](/llm/azurecognitiveservices.txt) | GPT-5 Mini | 0.25 | 2.00 | | |
| [Nano-GPT](/llm/nanogpt.txt) | GPT 5 Mini | | | | |
| [OpenRouter](/llm/openrouter.txt) | GPT-5 Mini | 0.25 | 2.00 | | [View](https://openrouter.ai/openai/gpt-5-mini-2025-08-07) |
| [Poe](/llm/poe.txt) | GPT-5-mini | 0.22 | 1.80 | | [View](https://poe.com/gpt-5-mini/api) |
| [Replicate](/llm/replicate.txt) | gpt-5-mini | | | | |
| [Requesty](/llm/requesty.txt) | | 0.25 | 2.00 | | |
| [Requesty](/llm/requesty.txt) | | 0.45 | 3.60 | | |
| [Requesty](/llm/requesty.txt) | | 0.13 | 1.00 | | |
| [ValorGPT](/llm/valorgpt.txt) | GPT-5 Mini | | | | [View](https://www.valorgpt.com/models/openai-gpt-5-mini) |
| [Vercel AI Gateway](/llm/vercel.txt) | GPT-5 mini | 0.25 | 2.00 | | |
| [Vivgrid](/llm/vivgrid.txt) | | 0.25 | 2.00 | | |
| [Yupp](/llm/yupp.txt) | GPT-5 Mini High Search | | | | |
| [Yupp](/llm/yupp.txt) | GPT-5 Mini | | | | |
| [ZenMUX](/llm/zenmux.txt) | OpenAI: GPT-5 Mini | 0.25 | 2.00 | | |
| [Routeway](/llm/routeway.txt) | OpenAI: GPT-5 Mini | 0.13 | 1.00 | | [View](https://routeway.ai/models) |
| [Glama](/llm/glama.txt) | gpt-5-mini-2025-08-07 | 0.25 | 2.00 | | [View](https://glama.ai/gateway/models/gpt-5-mini-2025-08-07) |
| [LangDB](/llm/langdb.txt) | gpt-5-mini | | | | [View](https://langdb.ai/app/models) |
| [Zed](/llm/zed.txt) | GPT-5 mini | 0.28 | 2.20 | | [View](https://zed.dev/docs/ai/models) |
| [JetBrains AI](/llm/jetbrains.txt) | GPT-5 mini | | | | [View](https://www.jetbrains.com/help/ai-assistant/supported-llms.html) |
| [302.AI](/llm/302ai.txt) | gpt-5-mini | 0.25 | 6.00 | | [View](https://302ai-en.apifox.cn/api-207705102) |
| [Kilo Code](/llm/kilocode.txt) | OpenAI: GPT-5 Mini | 0.25 | 2.00 | | |
| [Roo Code](/llm/roocode.txt) | GPT-5 mini | 0.25 | 2.00 | | [View](https://roocode.com) |
| [RedPill](/llm/redpill.txt) | OpenAI: GPT-5 Mini | 0.25 | 2.00 | | |
| [302.AI](/llm/302ai.txt) | gpt-5-mini-2025-08-07 | 0.25 | 6.00 | | [View](https://302ai-en.apifox.cn/api-207705102) |
| [Perplexity AI](/llm/perplexity.txt) | GPT-5 Mini | 0.25 | 2.00 | | [View](https://docs.perplexity.ai/docs/grounded-llm/responses/models) |
| [Jiekou.AI](/llm/jiekou.txt) | gpt-5-mini | 0.23 | 1.80 | | |
| [Blackbox AI](/llm/blackboxai.txt) | blackboxai/openai/gpt-5-mini | | | | |
| [CometAPI](/llm/cometapi.txt) | GPT-5 mini | 0.20 | 1.60 | | |
| [QiHang](/llm/qihangai.txt) | GPT-5-Mini | 0.04 | 0.29 | | |
| [ApiYI](/llm/apiyi.txt) | gpt-5-mini | | | | |
| [ApiYI](/llm/apiyi.txt) | gpt-5-mini-2025-08-07 | | | | |
| [WaveSpeed AI](/llm/wavespeed.txt) | gpt-5-mini | 0.28 | 2.20 | | |
| [MegaNova](/llm/meganova.txt) | GPT-5-Mini | 0.20 | 1.60 | | [View](https://console.meganova.ai/serverless/openai/gpt-5-mini) |
| [Writingmate](/llm/writingmate.txt) | OpenAI: GPT-5 Mini | | | | [View](https://writingmate.ai/models/openai/gpt-5-mini) |
| [LLM Stats](/llm/llmstats.txt) | GPT-5 mini | | | | |

---

[← Back to all providers](/llm.txt)
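To make the per-1M-token units concrete, here is a minimal sketch of how a request's cost can be estimated from the table's rates. The function name and the example token counts are illustrative; the default rates are taken from the OpenAI row ($0.25 input / $2.00 output per 1M tokens), and actual provider billing may differ (e.g. cached-input or batch discounts are not modeled).

```python
def estimate_cost_usd(input_tokens: int, output_tokens: int,
                      input_rate: float = 0.25,
                      output_rate: float = 2.00) -> float:
    """Estimate request cost in USD.

    Rates are expressed in dollars per 1M tokens, matching the
    providers table above. Defaults use the OpenAI row; pass a
    provider's own rates to compare (e.g. 0.13 / 1.00 for Routeway).
    """
    return (input_tokens * input_rate + output_tokens * output_rate) / 1_000_000

# A 10,000-token prompt with a 2,000-token completion at the default rates:
# 10_000 * 0.25 / 1e6 + 2_000 * 2.00 / 1e6 = 0.0025 + 0.0040 = 0.0065 USD
```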