# o4-mini

OpenAI o4-mini is a compact reasoning model in the o-series, optimized for fast, cost-efficient performance while retaining strong multimodal and agentic capabilities. It supports tool use and posts competitive reasoning and coding results on benchmarks such as AIME (99.5% with Python) and SWE-bench, outperforming its predecessor o3-mini and approaching o3 in some domains. Despite its smaller size, o4-mini achieves high accuracy on STEM tasks, visual problem solving (e.g., MathVista, MMMU), and code editing. It is especially well suited to high-throughput scenarios where latency or cost is critical. Thanks to its efficient architecture and refined reinforcement-learning training, o4-mini can chain tools, generate structured outputs, and solve multi-step tasks with minimal delay, often in under a minute.

## Model Information

- **Organization**: [OpenAI](/llm.txt)
- **Slug**: o4-mini
- **Available at Providers**: 45
- **Release Date**: April 16, 2025

### Benchmark Scores

- AIME 2025: 0.927
- HLE: 0.147
- GPQA: 0.814
- SWE Bench: 0.681
- BrowseComp: 0.515

## Providers

| Provider | Name | $ Input (per 1M) | $ Output (per 1M) | Free | Link |
|----------|------|-----------------|------------------|------|------|
| [AIHubMix](/llm/aihubmix.txt) | o4-mini | 1.10 | 4.40 | | [View](https://aihubmix.com/model/o4-mini) |
| [AIMLAPI](/llm/aimlapi.txt) | o4-mini | | | | |
| [FastRouter](/llm/fastrouter.txt) | OpenAI: o4 Mini | 1.10 | 4.40 | | [View](https://fastrouter.ai/models/openai/o4-mini) |
| [Helicone](/llm/helicone.txt) | OpenAI o4 Mini | 1.10 | 4.40 | | [View](https://www.helicone.ai/model/o4-mini) |
| [Abacus](/llm/abacus.txt) | o4-mini | 1.10 | 4.40 | | |
| [GitHub Models](/llm/githubmodels.txt) | OpenAI o4-mini | 0.00 | 0.00 | Yes | |
| [Azure OpenAI](/llm/azure.txt) | o4-mini | 1.10 | 4.40 | | |
| [Cloudflare AI Gateway](/llm/cloudflareaigateway.txt) | o4-mini | 1.10 | 4.40 | | |
| [OpenAI](/llm/openai.txt) | o4-mini | 1.10 | 4.40 | | |
| [Azure AI Services](/llm/azurecognitiveservices.txt) | o4-mini | 1.10 | 4.40 | | |
| [Monica AI](/llm/monica.txt) | o4-mini | 0.55 | 2.20 | | |
| [Nano-GPT](/llm/nanogpt.txt) | OpenAI o4-mini | | | | |
| [OpenRouter](/llm/openrouter.txt) | o4 Mini | 1.10 | 4.40 | | [View](https://openrouter.ai/openai/o4-mini-2025-04-16) |
| [Poe](/llm/poe.txt) | o4-mini | 0.99 | 4.00 | | [View](https://poe.com/o4-mini/api) |
| [Replicate](/llm/replicate.txt) | o4-mini | | | | |
| [Requesty](/llm/requesty.txt) | | 1.10 | 4.40 | | |
| [Requesty](/llm/requesty.txt) | | 0.55 | 2.20 | | |
| [ValorGPT](/llm/valorgpt.txt) | o4 Mini | | | | [View](https://www.valorgpt.com/models/openai-o4-mini) |
| [Vercel AI Gateway](/llm/vercel.txt) | o4-mini | 1.10 | 4.40 | | |
| [Yupp](/llm/yupp.txt) | OpenAI o4-mini Search | | | | |
| [Yupp](/llm/yupp.txt) | OpenAI o4-mini (OpenRouter) | | | | |
| [ZenMUX](/llm/zenmux.txt) | OpenAI: o4 Mini | 1.10 | 4.40 | | |
| [Glama](/llm/glama.txt) | o4-mini-2025-04-16 | 1.10 | 4.40 | | [View](https://glama.ai/gateway/models/o4-mini-2025-04-16) |
| [LangDB](/llm/langdb.txt) | o4-mini | | | | [View](https://langdb.ai/app/models) |
| [JetBrains AI](/llm/jetbrains.txt) | o4-mini | | | | [View](https://www.jetbrains.com/help/ai-assistant/supported-llms.html) |
| [Kilo Code](/llm/kilocode.txt) | OpenAI: o4 Mini | 1.10 | 4.40 | | |
| [RedPill](/llm/redpill.txt) | OpenAI: o4 Mini | 1.10 | 4.40 | | |
| [Blackbox AI](/llm/blackboxai.txt) | OpenAI: o4 Mini | 1.10 | 4.40 | | |
| [Arena AI](/llm/arenaai.txt) | | | | | |
| [Jiekou.AI](/llm/jiekou.txt) | o4-mini | 1.10 | 4.40 | | |
| [CometAPI](/llm/cometapi.txt) | o4-mini | 0.88 | 3.52 | | |
| [ApiYI](/llm/apiyi.txt) | o4-mini-2025-04-16 | | | | |
| [ApiYI](/llm/apiyi.txt) | o4-mini | | | | |
| [WaveSpeed AI](/llm/wavespeed.txt) | o4-mini | 1.21 | 4.84 | | |
| [Writingmate](/llm/writingmate.txt) | OpenAI: o4 Mini | | | | [View](https://writingmate.ai/models/openai/o4-mini) |

---

[← Back to all providers](/llm.txt)
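The table's prices are per one million tokens, billed separately for input and output. A minimal sketch of how to turn those rates into a per-request cost estimate; the helper name and the example token counts are illustrative, while the 1.10/4.40 rates are OpenAI's from the table above:

```python
def request_cost_usd(input_tokens: int, output_tokens: int,
                     input_per_m: float, output_per_m: float) -> float:
    """Estimate the cost of one request given per-1M-token rates."""
    return (input_tokens * input_per_m + output_tokens * output_per_m) / 1_000_000

# o4-mini at OpenAI's listed rates: $1.10 input, $4.40 output per 1M tokens.
cost = request_cost_usd(input_tokens=20_000, output_tokens=5_000,
                        input_per_m=1.10, output_per_m=4.40)
print(f"${cost:.4f}")  # 20k in + 5k out at these rates comes to $0.0440
```

Note that reasoning models also bill their hidden reasoning tokens as output, so actual costs can exceed an estimate based on visible completion length alone.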