# deepseek-v3.2

DeepSeek-V3.2 is a large language model designed to combine high computational efficiency with strong reasoning and agentic tool-use performance. It introduces DeepSeek Sparse Attention (DSA), a fine-grained sparse attention mechanism that reduces training and inference cost while preserving quality in long-context scenarios. A scalable reinforcement learning post-training framework further improves reasoning, with reported performance in the GPT-5 class and gold-medal results on the 2025 IMO and IOI. V3.2 also uses a large-scale agentic task synthesis pipeline to better integrate reasoning into tool-use settings, boosting compliance and generalization in interactive environments.

Users can control reasoning behaviour with the `enabled` boolean of the `reasoning` request parameter. [Learn more in our docs](https://openrouter.ai/docs/use-cases/reasoning-tokens#enable-reasoning-with-default-config)

## Model Information

- **Organization**: [DeepSeek](/llm.txt)
- **Slug**: deepseek-v3-2
- **Available at Providers**: 44

## Providers

| Provider | Name | $ Input (per 1M) | $ Output (per 1M) | Free | Link |
|----------|------|-----------------|------------------|------|------|
| [AIHubMix](/llm/aihubmix.txt) | deepseek-v3.2-fast | 1.10 | 3.29 | | [View](https://aihubmix.com/model/deepseek-v3.2-fast) |
| [AIHubMix](/llm/aihubmix.txt) | deepseek-v3.2 | 0.30 | 0.45 | | [View](https://aihubmix.com/model/deepseek-v3.2) |
| [AIHubMix](/llm/aihubmix.txt) | deepseek-v3.2-think | 0.30 | 0.45 | | [View](https://aihubmix.com/model/deepseek-v3.2-think) |
| [FastRouter](/llm/fastrouter.txt) | DeepSeek: DeepSeek V3.2 | 0.27 | 0.40 | | [View](https://fastrouter.ai/models/deepseek/deepseek-v3.2) |
| [Fireworks AI](/llm/fireworks.txt) | Deepseek v3.2 | 0.56 | 1.68 | | |
| [Helicone](/llm/helicone.txt) | DeepSeek V3.2 | 0.26 | 0.40 | | [View](https://www.helicone.ai/model/deepseek-v3.2) |
| [Mammouth AI](/llm/mammouth.txt) | deepseek-v3.2 | 0.30 | 0.45 | | |
| [Abacus](/llm/abacus.txt) | DeepSeek V3.2 | 0.27 | 0.40 | | |
| [Novita AI](/llm/novita.txt) | deepseek-v3.2 | 0.27 | 0.40 | | |
| [Azure OpenAI](/llm/azure.txt) | DeepSeek-V3.2 | 0.58 | 1.68 | | |
| [Baseten](/llm/baseten.txt) | DeepSeek V3.2 | 0.30 | 0.45 | | |
| [SiliconFlow](/llm/siliconflow.txt) | DeepSeek-V3.2 | 0.27 | 0.42 | | [View](https://www.siliconflow.com/models/deepseek-v3-2) |
| [Hugging Face](/llm/huggingface.txt) | DeepSeek-V3.2 | 0.28 | 0.40 | | |
| [iFlow](/llm/iflowcn.txt) | DeepSeek-V3.2-Exp | 0.00 | 0.00 | Yes | |
| [Azure AI Services](/llm/azurecognitiveservices.txt) | DeepSeek-V3.2 | 0.58 | 1.68 | | |
| [Nano-GPT](/llm/nanogpt.txt) | DeepSeek V3.2 | | | | |
| [Nano-GPT](/llm/nanogpt.txt) | DeepSeek V3.2 Thinking | | | | |
| [Nano-GPT](/llm/nanogpt.txt) | DeepSeek V3.2 TEE | | | | |
| [Ollama Cloud](/llm/ollama.txt) | deepseek-v3.2 | | | | [View](https://ollama.com/library/deepseek-v3.2) |
| [OpenRouter](/llm/openrouter.txt) | DeepSeek V3.2 | 0.25 | 0.40 | | [View](https://openrouter.ai/deepseek/deepseek-v3.2-20251201) |
| [Poe](/llm/poe.txt) | DeepSeek-V3.2 | 0.27 | 0.40 | | [View](https://poe.com/deepseek-v3.2/api) |
| [Synthetic.new](/llm/synthetic.txt) | deepseek-ai/DeepSeek-V3.2 | 0.56 | 1.68 | | |
| [ValorGPT](/llm/valorgpt.txt) | DeepSeek V3.2 | | | | [View](https://www.valorgpt.com/models/deepseek-deepseek-v3.2) |
| [Venice](/llm/venice.txt) | DeepSeek V3.2 | 0.40 | 1.00 | | |
| [Vercel AI Gateway](/llm/vercel.txt) | DeepSeek V3.2 | 0.26 | 0.38 | | |
| [Vivgrid](/llm/vivgrid.txt) | | 0.28 | 0.42 | | |
| [Yupp](/llm/yupp.txt) | DeepSeek V3.2 Chat (OpenRouter) | | | | |
| [Yupp](/llm/yupp.txt) | DeepSeek V3.2 (Sambanova) | | | | |
| [ZenMUX](/llm/zenmux.txt) | DeepSeek: DeepSeek V3.2 | 0.28 | 0.43 | | |
| [Atlas Cloud](/llm/atlascloud.txt) | DeepSeek V3.2 | 0.26 | 0.38 | | [View](https://www.atlascloud.ai/models/deepseek-ai/deepseek-v3.2) |
| [DeepInfra](/llm/deepinfra.txt) | DeepSeek-V3.2 | 0.26 | 0.38 | | |
| [Nebius Token Factory](/llm/nebius.txt) | DeepSeek-V3.2 | 0.30 | 0.45 | | [View](https://huggingface.co/deepseek-ai/DeepSeek-V3.2) |
| [Routeway](/llm/routeway.txt) | DeepSeek: DeepSeek V3.2 | 0.25 | 0.39 | | [View](https://routeway.ai/models) |
| [Nvidia](/llm/nvidia.txt) | deepseek-v3.2 | 0.00 | 0.00 | Yes | [View](https://build.nvidia.com/deepseek-ai/deepseek-v3.2) |
| [Yupp](/llm/yupp.txt) | DeepSeek V3.2 Chat (Baseten) | | | | |
| [302.AI](/llm/302ai.txt) | deepseek-v3.2 | 0.29 | 0.43 | | [View](https://302ai-en.apifox.cn/api-207705121) |
| [Kilo Code](/llm/kilocode.txt) | DeepSeek: DeepSeek V3.2 | 0.25 | 0.38 | | |
| [GMI Cloud](/llm/gmi.txt) | DeepSeek-V3.2 | 0.28 | 0.40 | | |
| [Parasail](/llm/parasail.txt) | Deepseek V32 | 0.28 | 0.45 | | [View](https://www.saas.parasail.io/pricing) |
| [RedPill](/llm/redpill.txt) | DeepSeek: DeepSeek V3.2 | 0.27 | 0.40 | | |
| [SambaNova AI](/llm/sambanova.txt) | DeepSeek-V3.2 | 3.00 | 4.50 | | |
| [IO.NET](/llm/ionet.txt) | DeepSeek-V3.2 | 0.25 | 0.38 | | |
| [Google Vertex AI](/llm/googlevertex.txt) | Deepseek V32 | | | | |
| [Baidu AI Studio](/llm/baidu.txt) | | | | | [View](https://aistudio.baidu.com) |

---

[← Back to all providers](/llm.txt)
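
The `reasoning` / `enabled` toggle described above can be sketched as a chat-completions request body. This is a minimal sketch only: the model slug `deepseek/deepseek-v3.2` and the exact shape of the `reasoning` object are assumptions here, so check the reasoning-tokens docs linked above for the current schema before relying on it.

```python
import json

# Hypothetical OpenRouter chat-completions request body (not sent anywhere).
# The slug and reasoning schema below are assumptions, not confirmed values.
payload = {
    "model": "deepseek/deepseek-v3.2",
    "messages": [
        {"role": "user", "content": "Prove that the square root of 2 is irrational."}
    ],
    # Toggle reasoning with the `enabled` boolean; set False to suppress
    # reasoning tokens when you want cheaper, faster responses.
    "reasoning": {"enabled": True},
}

body = json.dumps(payload)
```

To actually send it, POST `body` to the provider's chat-completions endpoint with your API key in an `Authorization: Bearer ...` header.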