# DeepSeek R1 Distill Qwen 32B

DeepSeek R1 Distill Qwen 32B is a distilled large language model based on [Qwen 2.5 32B](https://huggingface.co/Qwen/Qwen2.5-32B), fine-tuned on outputs from [DeepSeek R1](/deepseek/deepseek-r1). It outperforms OpenAI's o1-mini across various benchmarks, achieving new state-of-the-art results for dense models.

Other benchmark results include:

- AIME 2024 pass@1: 72.6
- MATH-500 pass@1: 94.3
- CodeForces rating: 1691

Distillation from DeepSeek R1's outputs gives the model performance comparable to larger frontier models.

## Model Information

- **Organization**: [DeepSeek](/llm.txt)
- **Slug**: deepseek-r1-distill-qwen-32b
- **Available at Providers**: 25
- **Release Date**: January 20, 2025

### Benchmark Scores

- GPQA: 0.621

## Providers

| Provider | Name | $ Input (per 1M) | $ Output (per 1M) | Free | Link |
|----------|------|------------------|-------------------|------|------|
| [AIHubMix](/llm/aihubmix.txt) | DeepSeek-R1-Distill-Qwen-32B | 0.28 | 0.84 | | [View](https://aihubmix.com/model/DeepSeek-R1-Distill-Qwen-32B) |
| [AIHubMix](/llm/aihubmix.txt) | DeepSeek-R1-Distill-Qwen-32B | 0.20 | 0.20 | | [View](https://aihubmix.com/model/deepseek-ai/DeepSeek-R1-Distill-Qwen-32B) |
| [FastRouter](/llm/fastrouter.txt) | DeepSeek: R1 Distill Qwen 32B | 0.27 | 0.27 | | [View](https://fastrouter.ai/models/deepseek-ai/DeepSeek-R1-Distill-Qwen-32B) |
| [Vultr](/llm/vultr.txt) | DeepSeek R1 Distill Qwen 32B | 0.20 | 0.20 | | |
| [Alibaba (China)](/llm/alibabacn.txt) | DeepSeek R1 Distill Qwen 32B | 0.29 | 0.86 | | |
| [SiliconFlow (China)](/llm/siliconflowcn.txt) | deepseek-ai/DeepSeek-R1-Distill-Qwen-32B | 0.18 | 0.18 | | |
| [SiliconFlow](/llm/siliconflow.txt) | DeepSeek-R1-Distill-Qwen-32B | 0.18 | 0.18 | | [View](https://www.siliconflow.com/models/deepseek-r1-distill-qwen-32b) |
| [Cloudflare AI Gateway](/llm/cloudflareaigateway.txt) | DeepSeek R1 Distill Qwen 32B | 0.50 | 4.88 | | |
| [OpenRouter](/llm/openrouter.txt) | R1 Distill Qwen 32B | 0.29 | 0.29 | | [View](https://openrouter.ai/deepseek/deepseek-r1-distill-qwen-32b) |
| [Requesty](/llm/requesty.txt) | | 0.30 | 0.30 | | |
| [ValorGPT](/llm/valorgpt.txt) | R1 Distill Qwen 32B | | | | [View](https://www.valorgpt.com/models/deepseek-deepseek-r1-distill-qwen-32b) |
| [Yupp](/llm/yupp.txt) | DeepSeek-R1 Distill Qwen 32B (OpenRouter) | | | | |
| [Routeway](/llm/routeway.txt) | DeepSeek: R1 Distill Qwen 32B (Free) | 0.00 | 0.00 | Yes | [View](https://routeway.ai/models) |
| [Routeway](/llm/routeway.txt) | DeepSeek: R1 Distill Qwen 32B | 0.30 | 2.93 | | [View](https://routeway.ai/models) |
| [Glama](/llm/glama.txt) | deepseek-r1-distill-qwen-32b | 0.50 | 4.90 | | [View](https://glama.ai/gateway/models/deepseek-r1-distill-qwen-32b) |
| [LangDB](/llm/langdb.txt) | deepseek-r1-distill-qwen-32b | | | | [View](https://langdb.ai/app/models) |
| [Kilo Code](/llm/kilocode.txt) | DeepSeek: R1 Distill Qwen 32B | 0.29 | 0.29 | | |
| [GMI Cloud](/llm/gmi.txt) | DeepSeek R1 Distill Qwen 32B | 0.50 | 0.90 | | |
| [Nvidia](/llm/nvidia.txt) | deepseek-r1-distill-qwen-32b | | | | [View](https://build.nvidia.com/deepseek-ai/deepseek-r1-distill-qwen-32b) |
| [Baidu AI Studio](/llm/baidu.txt) | | | | | [View](https://aistudio.baidu.com) |
| [Cloudflare Workers AI](/llm/cloudflareworkersai.txt) | DeepSeek R1 Distill Qwen 32B | 0.50 | 4.88 | | |
| [Blackbox AI](/llm/blackboxai.txt) | DeepSeek: R1 Distill Qwen 32B | 0.07 | 0.15 | | |
| [ApiYI](/llm/apiyi.txt) | deepseek-r1-distill-qwen-32b | | | | |
| [WaveSpeed AI](/llm/wavespeed.txt) | deepseek-r1-distill-qwen-32b | 0.32 | 0.32 | | |
| [Writingmate](/llm/writingmate.txt) | DeepSeek: R1 Distill Qwen 32B | | | | [View](https://writingmate.ai/models/deepseek/deepseek-r1-distill-qwen-32b) |

---

[← Back to all providers](/llm.txt)
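The prices above are quoted in USD per 1M tokens, billed separately for input (prompt) and output (completion) tokens. A minimal sketch of how a request's cost works out under that scheme (`estimate_cost` is an illustrative helper, not part of any provider's SDK):

```python
def estimate_cost(input_tokens: int, output_tokens: int,
                  usd_per_m_input: float, usd_per_m_output: float) -> float:
    """Estimate request cost in USD from per-1M-token prices.

    Illustrative only: real providers may round, add minimum charges,
    or bill cached/reasoning tokens differently.
    """
    return (input_tokens * usd_per_m_input
            + output_tokens * usd_per_m_output) / 1_000_000


# Example: a 10,000-token prompt with a 2,000-token completion
# at OpenRouter's listed $0.29 / $0.29 rates.
cost = estimate_cost(10_000, 2_000, 0.29, 0.29)
print(f"${cost:.5f}")  # $0.00348
```

The same call with Cloudflare AI Gateway's asymmetric rates (0.50 in / 4.88 out) shows why output-heavy reasoning workloads are dominated by the output price.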