# DeepSeek-R1

DeepSeek R1 is here: Performance on par with [OpenAI o1](/openai/o1), but open-sourced and with fully open reasoning tokens. It's 671B parameters in size, with 37B active per inference pass. Fully open-source model & [technical report](https://api-docs.deepseek.com/news/news250120). MIT licensed: Distill & commercialize freely!

## Model Information

- **Organization**: [DeepSeek](/llm.txt)
- **Slug**: deepseek-r1
- **Available at Providers**: 41
- **Release Date**: January 20, 2025

## Providers

| Provider | Name | $ Input (per 1M) | $ Output (per 1M) | Free | Link |
|----------|------|-----------------|------------------|------|------|
| [AIHubMix](/llm/aihubmix.txt) | DeepSeek-R1 | 0.40 | 2.00 | | [View](https://aihubmix.com/model/DeepSeek-R1) |
| [FastRouter](/llm/fastrouter.txt) | DeepSeek: R1 | 0.50 | 2.18 | | [View](https://fastrouter.ai/models/deepseek-ai/DeepSeek-R1) |
| [Abacus](/llm/abacus.txt) | DeepSeek R1 | 3.00 | 7.00 | | |
| [Alibaba (China)](/llm/alibabacn.txt) | DeepSeek R1 | 0.57 | 2.29 | | |
| [GitHub Models](/llm/githubmodels.txt) | DeepSeek-R1 | 0.00 | 0.00 | Yes | |
| [Together AI](/llm/togetherai.txt) | DeepSeek R1-0528 | 3.00 | 7.00 | | |
| [Azure OpenAI](/llm/azure.txt) | DeepSeek-R1 | 1.35 | 5.40 | | |
| [SiliconFlow](/llm/siliconflow.txt) | DeepSeek-R1 | 0.50 | 2.18 | | [View](https://www.siliconflow.com/models/deepseek-r1) |
| [iFlow](/llm/iflowcn.txt) | DeepSeek-R1 | 0.00 | 0.00 | Yes | |
| [Azure AI Services](/llm/azurecognitiveservices.txt) | DeepSeek-R1 | 1.35 | 5.40 | | |
| [Nano-GPT](/llm/nanogpt.txt) | DeepSeek R1 | | | | |
| [OpenRouter](/llm/openrouter.txt) | R1 | 0.70 | 2.50 | | [View](https://openrouter.ai/deepseek/deepseek-r1) |
| [Poe](/llm/poe.txt) | DeepSeek-R1 | 18000.00 | | | [View](https://poe.com/deepseek-r1/api) |
| [Replicate](/llm/replicate.txt) | deepseek-r1 | | | | |
| [Requesty](/llm/requesty.txt) | | 3.00 | 7.00 | | |
| [Requesty](/llm/requesty.txt) | | 4.00 | 4.00 | | |
| [Requesty](/llm/requesty.txt) | | 0.85 | 2.50 | | |
| [ValorGPT](/llm/valorgpt.txt) | R1 | | | | [View](https://www.valorgpt.com/models/deepseek-deepseek-r1) |
| [Vercel AI Gateway](/llm/vercel.txt) | DeepSeek R1 0528 | 1.35 | 5.40 | | |
| [Vivgrid](/llm/vivgrid.txt) | | 1.35 | 5.40 | | |
| [Yupp](/llm/yupp.txt) | DeepSeek-R1 (Together AI) | | | | |
| [Yupp](/llm/yupp.txt) | DeepSeek-R1 (OpenRouter) | | | | |
| [Yupp](/llm/yupp.txt) | DeepSeek-R1 (Sambanova) | | | | |
| [Glama](/llm/glama.txt) | deepseek-r1 | 0.55 | 2.20 | | [View](https://glama.ai/gateway/models/deepseek-r1) |
| [LangDB](/llm/langdb.txt) | DeepSeek-R1 | | | | [View](https://langdb.ai/app/models) |
| [LangDB](/llm/langdb.txt) | deepseek-r1-05-28 | | | | [View](https://langdb.ai/app/models) |
| [LangDB](/llm/langdb.txt) | deepseek-r1 | | | | [View](https://langdb.ai/app/models) |
| [Kilo Code](/llm/kilocode.txt) | DeepSeek: R1 | 0.70 | 2.50 | | |
| [GMI Cloud](/llm/gmi.txt) | DeepSeek R1 | 0.50 | 2.18 | | |
| [Baidu AI Studio](/llm/baidu.txt) | | | | | [View](https://aistudio.baidu.com) |
| [Baidu AI Studio](/llm/baidu.txt) | | | | | [View](https://aistudio.baidu.com) |
| [Inference](/llm/inference.txt) | | | | | |
| [Blackbox AI](/llm/blackboxai.txt) | DeepSeek: R1 | 0.45 | 2.15 | | |
| [Qiniu](/llm/qiniuai.txt) | DeepSeek-R1 | | | | |
| [ApiYI](/llm/apiyi.txt) | deepseek-r1-250528 | | | | |
| [ApiYI](/llm/apiyi.txt) | deepseek-r1 | | | | |
| [WaveSpeed AI](/llm/wavespeed.txt) | deepseek-r1 | 0.77 | 2.75 | | |
| [Routeway](/llm/routeway.txt) | DeepSeek: R1 | 0.37 | 1.56 | | [View](https://routeway.ai/models) |
| [Airforce API](/llm/airforce.txt) | deepseek-r1 | | | | |
| [Writingmate](/llm/writingmate.txt) | DeepSeek: R1 | | | | [View](https://writingmate.ai/models/deepseek/deepseek-r1) |
| [SiliconFlow (China)](/llm/siliconflowcn.txt) | Pro/deepseek-ai/DeepSeek-R1 | 0.50 | 2.18 | | |

---

[← Back to all providers](/llm.txt)
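As a quick aid for comparing the per-1M-token rates above, here is a minimal sketch of the cost arithmetic. The `request_cost` helper and the token counts are hypothetical; the rates in the example are taken from the OpenRouter row ($0.70 input / $2.50 output per 1M tokens). Note that for reasoning models like R1, the open reasoning tokens are typically billed as output tokens, so output counts can dominate the cost.

```python
def request_cost(input_tokens: int, output_tokens: int,
                 input_rate_per_m: float, output_rate_per_m: float) -> float:
    """Return the USD cost of one request, given rates quoted per 1M tokens."""
    return (input_tokens * input_rate_per_m
            + output_tokens * output_rate_per_m) / 1_000_000

# Hypothetical request: 2,000 input tokens and 8,000 output tokens
# (reasoning tokens included in the output count), at OpenRouter's listed rates.
cost = request_cost(2_000, 8_000, 0.70, 2.50)
print(f"${cost:.4f}")  # 0.0014 input + 0.0200 output = $0.0214
```

The same helper works for any row in the table; swap in that provider's input/output rates to compare effective per-request prices.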