# SiliconFlow

SiliconFlow is a lightning-fast AI platform for developers that provides access to 200+ optimized LLMs and multimodal models through simple APIs. The platform lets you deploy, fine-tune, and run models including DeepSeek, Qwen, Llama, GLM, FLUX, and many others. SiliconFlow focuses on fast, cost-effective AI infrastructure with support for text generation, image generation, video generation, audio models, embeddings, and rerankers.

## Provider Information

- **Website**: https://www.siliconflow.com
- **Available Models**: 91

## Models

| Name | Original Name | Input Price ($ per 1M tokens) | Output Price ($ per 1M tokens) | Free | Link |
|------|---------------|-------------------------------|--------------------------------|------|------|
| Ling-mini-2.0 | inclusionAI/Ling-mini-2.0 | 0.07 | 0.28 | | [View](https://www.siliconflow.com/models/ling-mini-2-0) |
| Ling-flash-2.0 | inclusionAI/Ling-flash-2.0 | 0.14 | 0.57 | | [View](https://www.siliconflow.com/models/ling-flash-2-0) |
| Ring-flash-2.0 | inclusionAI/Ring-flash-2.0 | 0.14 | 0.57 | | [View](https://www.siliconflow.com/models/ring-flash-2-0) |
| Kimi-K2-Instruct | moonshotai/Kimi-K2-Instruct | 0.58 | 2.29 | | [View](https://www.siliconflow.com/models/kimi-k2-instruct) |
| Kimi-K2-Instruct-0905 | moonshotai/Kimi-K2-Instruct-0905 | 0.40 | 2.00 | | [View](https://www.siliconflow.com/models/kimi-k2-instruct-0905) |
| Kimi-K2-Thinking | moonshotai/Kimi-K2-Thinking | 0.55 | 2.50 | | [View](https://www.siliconflow.com/models/kimi-k2-thinking) |
| Hunyuan-MT-7B | tencent/Hunyuan-MT-7B | 0.00 | 0.00 | Yes | [View](https://www.siliconflow.com/models/hunyuan-mt-7b) |
| Hunyuan-A13B-Instruct | tencent/Hunyuan-A13B-Instruct | 0.14 | 0.57 | | [View](https://www.siliconflow.com/models/hunyuan-a13b-instruct) |
| GLM-4-32B-0414 | THUDM/GLM-4-32B-0414 | 0.27 | 0.27 | | [View](https://www.siliconflow.com/models/glm-4-32b-0414) |
| GLM-Z1-9B-0414 | THUDM/GLM-Z1-9B-0414 | 0.09 | 0.09 | | [View](https://www.siliconflow.com/models/glm-z1-9b-0414) |
| GLM-4-9B-0414 | THUDM/GLM-4-9B-0414 | 0.09 | 0.09 | | [View](https://www.siliconflow.com/models/glm-4-9b-0414) |
| GLM-Z1-32B-0414 | THUDM/GLM-Z1-32B-0414 | 0.14 | 0.57 | | [View](https://www.siliconflow.com/models/glm-z1-32b-0414) |
| gpt-oss-20b | openai/gpt-oss-20b | | | | |
| gpt-oss-120b | openai/gpt-oss-120b | | | | |
| DeepSeek-V3.1-Nex-N1 | nex-agi/DeepSeek-V3.1-Nex-N1 | 0.27 | 1.00 | | [View](https://www.siliconflow.com/models/deepseek-v3-1-nex-n1) |
| ERNIE-4.5-300B-A47B | baidu/ERNIE-4.5-300B-A47B | 0.28 | 1.10 | | [View](https://www.siliconflow.com/models/ernie-4-5-300b-a47b) |
| Seed-OSS-36B-Instruct | ByteDance-Seed/Seed-OSS-36B-Instruct | 0.21 | 0.57 | | [View](https://www.siliconflow.com/models/seed-oss-36b-instruct) |
| Meta-Llama-3.1-8B-Instruct | meta-llama/Meta-Llama-3.1-8B-Instruct | 0.06 | 0.06 | | [View](https://www.siliconflow.com/models/meta-llama-3-1-8b-instruct) |
| Qwen3-30B-A3B-Thinking-2507 | Qwen/Qwen3-30B-A3B-Thinking-2507 | 0.09 | 0.30 | | [View](https://www.siliconflow.com/models/qwen3-30b-a3b-thinking-2507) |
| Qwen3-VL-30B-A3B-Instruct | Qwen/Qwen3-VL-30B-A3B-Instruct | 0.29 | 1.00 | | [View](https://www.siliconflow.com/models/qwen3-vl-30b-a3b-instruct) |
| Qwen3-14B | Qwen/Qwen3-14B | 0.07 | 0.28 | | [View](https://www.siliconflow.com/models/qwen3-14b) |
| Qwen2.5-VL-32B-Instruct | Qwen/Qwen2.5-VL-32B-Instruct | 0.27 | 0.27 | | [View](https://www.siliconflow.com/models/qwen2-5-vl-32b-instruct) |
| Qwen3-Omni-30B-A3B-Captioner | Qwen/Qwen3-Omni-30B-A3B-Captioner | 0.10 | 0.40 | | [View](https://www.siliconflow.com/models/qwen3-omni-30b-a3b-captioner) |
| Qwen3-8B | Qwen/Qwen3-8B | 0.06 | 0.06 | | [View](https://www.siliconflow.com/models/qwen3-8b) |
| Qwen3-Omni-30B-A3B-Instruct | Qwen/Qwen3-Omni-30B-A3B-Instruct | 0.10 | 0.40 | | [View](https://www.siliconflow.com/models/qwen3-omni-30b-a3b-instruct) |
| Qwen3-VL-8B-Thinking | Qwen/Qwen3-VL-8B-Thinking | 0.18 | 2.00 | | [View](https://www.siliconflow.com/models/qwen3-vl-8b-thinking) |
| Qwen3-235B-A22B-Instruct-2507 | Qwen/Qwen3-235B-A22B-Instruct-2507 | 0.09 | 0.60 | | [View](https://www.siliconflow.com/models/qwen3-235b-a22b-instruct-2507) |
| Qwen2.5-Coder-32B-Instruct | Qwen/Qwen2.5-Coder-32B-Instruct | 0.18 | 0.18 | | [View](https://www.siliconflow.com/models/qwen2-5-coder-32b-instruct) |
| Qwen2.5-32B-Instruct | Qwen/Qwen2.5-32B-Instruct | 0.18 | 0.18 | | [View](https://www.siliconflow.com/models/qwen2-5-32b-instruct) |
| Qwen2.5-72B-Instruct-128K | Qwen/Qwen2.5-72B-Instruct-128K | 0.59 | 0.59 | | [View](https://www.siliconflow.com/models/qwen2-5-72b-instruct-128k) |
| Qwen2.5-72B-Instruct | Qwen/Qwen2.5-72B-Instruct | 0.59 | 0.59 | | [View](https://www.siliconflow.com/models/qwen2-5-72b-instruct) |
| Qwen3-Coder-30B-A3B-Instruct | Qwen/Qwen3-Coder-30B-A3B-Instruct | 0.07 | 0.28 | | [View](https://www.siliconflow.com/models/qwen3-coder-30b-a3b-instruct) |
| Qwen2.5-7B-Instruct | Qwen/Qwen2.5-7B-Instruct | 0.05 | 0.05 | | [View](https://www.siliconflow.com/models/qwen2-5-7b-instruct) |
| Qwen3-235B-A22B | Qwen/Qwen3-235B-A22B | 0.35 | 1.42 | | |
| Qwen2.5-VL-72B-Instruct | Qwen/Qwen2.5-VL-72B-Instruct | 0.59 | 0.59 | | [View](https://www.siliconflow.com/models/qwen2-5-vl-72b-instruct) |
| QwQ-32B | Qwen/QwQ-32B | 0.15 | 0.58 | | [View](https://www.siliconflow.com/models/qwq-32b) |
| Qwen2.5-VL-7B-Instruct | Qwen/Qwen2.5-VL-7B-Instruct | 0.05 | 0.05 | | [View](https://www.siliconflow.com/models/qwen2-5-vl-7b-instruct) |
| Qwen3-32B | Qwen/Qwen3-32B | 0.14 | 0.57 | | [View](https://www.siliconflow.com/models/qwen3-32b) |
| Qwen3-VL-8B-Instruct | Qwen/Qwen3-VL-8B-Instruct | 0.18 | 0.68 | | [View](https://www.siliconflow.com/models/qwen3-vl-8b-instruct) |
| Qwen3-VL-235B-A22B-Instruct | Qwen/Qwen3-VL-235B-A22B-Instruct | 0.30 | 1.50 | | [View](https://www.siliconflow.com/models/qwen3-vl-235b-a22b-instruct) |
| Qwen3-Coder-480B-A35B-Instruct | Qwen/Qwen3-Coder-480B-A35B-Instruct | 0.25 | 1.00 | | [View](https://www.siliconflow.com/models/qwen3-coder-480b-a35b-instruct) |
| Qwen3-VL-235B-A22B-Thinking | Qwen/Qwen3-VL-235B-A22B-Thinking | 0.45 | 3.50 | | [View](https://www.siliconflow.com/models/qwen3-vl-235b-a22b-thinking) |
| Qwen3-30B-A3B-Instruct-2507 | Qwen/Qwen3-30B-A3B-Instruct-2507 | 0.09 | 0.30 | | [View](https://www.siliconflow.com/models/qwen3-30b-a3b-instruct-2507) |
| Qwen3-VL-30B-A3B-Thinking | Qwen/Qwen3-VL-30B-A3B-Thinking | 0.29 | 1.00 | | [View](https://www.siliconflow.com/models/qwen3-vl-30b-a3b-thinking) |
| Qwen3-VL-32B-Thinking | Qwen/Qwen3-VL-32B-Thinking | 0.20 | 1.50 | | [View](https://www.siliconflow.com/models/qwen3-vl-32b-thinking) |
| Qwen3-235B-A22B-Thinking-2507 | Qwen/Qwen3-235B-A22B-Thinking-2507 | 0.13 | 0.60 | | [View](https://www.siliconflow.com/models/qwen3-235b-a22b-thinking-2507) |
| Qwen3-Omni-30B-A3B-Thinking | Qwen/Qwen3-Omni-30B-A3B-Thinking | 0.10 | 0.40 | | [View](https://www.siliconflow.com/models/qwen3-omni-30b-a3b-thinking) |
| Qwen3-VL-32B-Instruct | Qwen/Qwen3-VL-32B-Instruct | 0.20 | 0.60 | | [View](https://www.siliconflow.com/models/qwen3-vl-32b-instruct) |
| Qwen3-Next-80B-A3B-Instruct | Qwen/Qwen3-Next-80B-A3B-Instruct | 0.14 | 1.40 | | [View](https://www.siliconflow.com/models/qwen3-next-80b-a3b-instruct) |
| Qwen2.5-14B-Instruct | Qwen/Qwen2.5-14B-Instruct | 0.10 | 0.10 | | [View](https://www.siliconflow.com/models/qwen2-5-14b-instruct) |
| Qwen3-Next-80B-A3B-Thinking | Qwen/Qwen3-Next-80B-A3B-Thinking | 0.14 | 0.57 | | [View](https://www.siliconflow.com/models/qwen3-next-80b-a3b-thinking) |
| GLM-4.6V | zai-org/GLM-4.6V | 0.30 | 0.90 | | [View](https://www.siliconflow.com/models/glm-4-6v) |
| GLM-4.5 | zai-org/GLM-4.5 | 0.40 | 2.00 | | |
| GLM-4.7 | zai-org/GLM-4.7 | 0.42 | 2.20 | | [View](https://www.siliconflow.com/models/glm-4-7) |
| GLM-4.6 | zai-org/GLM-4.6 | 0.39 | 1.90 | | [View](https://www.siliconflow.com/models/glm-4-6) |
| GLM-4.5V | zai-org/GLM-4.5V | 0.14 | 0.86 | | [View](https://www.siliconflow.com/models/glm-4-5v) |
| GLM-4.5-Air | zai-org/GLM-4.5-Air | 0.14 | 0.86 | | [View](https://www.siliconflow.com/models/glm-4-5-air) |
| DeepSeek-R1 | deepseek-ai/DeepSeek-R1 | 0.50 | 2.18 | | [View](https://www.siliconflow.com/models/deepseek-r1) |
| DeepSeek-R1-Distill-Qwen-32B | deepseek-ai/DeepSeek-R1-Distill-Qwen-32B | 0.18 | 0.18 | | [View](https://www.siliconflow.com/models/deepseek-r1-distill-qwen-32b) |
| deepseek-vl2 | deepseek-ai/deepseek-vl2 | | | | |
| DeepSeek-R1-Distill-Qwen-14B | deepseek-ai/DeepSeek-R1-Distill-Qwen-14B | 0.10 | 0.10 | | [View](https://www.siliconflow.com/models/deepseek-r1-distill-qwen-14b) |
| DeepSeek-V3.2-Exp | deepseek-ai/DeepSeek-V3.2-Exp | 0.27 | 0.41 | | [View](https://www.siliconflow.com/models/deepseek-v3-2-exp) |
| DeepSeek-V3.1-Terminus | deepseek-ai/DeepSeek-V3.1-Terminus | 0.27 | 1.00 | | [View](https://www.siliconflow.com/models/deepseek-v3-1-terminus) |
| DeepSeek-V3.2 | deepseek-ai/DeepSeek-V3.2 | 0.27 | 0.42 | | [View](https://www.siliconflow.com/models/deepseek-v3-2) |
| DeepSeek-V3 | deepseek-ai/DeepSeek-V3 | 0.25 | 1.00 | | [View](https://www.siliconflow.com/models/deepseek-v3) |
| DeepSeek-V3.1 | deepseek-ai/DeepSeek-V3.1 | 0.27 | 1.00 | | [View](https://www.siliconflow.com/models/deepseek-v3-1) |
| MiniMax-M2.1 | MiniMaxAI/MiniMax-M2.1 | 0.29 | 1.20 | | [View](https://www.siliconflow.com/models/minimax-m2-1) |
| Z-Image-Turbo | Tongyi-MAI/Z-Image-Turbo | 0.20 | 1.00 | | [View](https://www.siliconflow.com/models/z-image-turbo) |
| FLUX.2-flex | black-forest-labs/FLUX.2-flex | | | | |
| FLUX.2-pro | black-forest-labs/FLUX.2-pro | | | | |
| Qwen-Image | Qwen/Qwen-Image | 0.20 | 1.00 | | [View](https://www.siliconflow.com/models/qwen-image) |
| Qwen-Image-Edit | Qwen/Qwen-Image-Edit | 0.20 | 1.00 | | [View](https://www.siliconflow.com/models/qwen-image-edit) |
| Qwen3-Reranker-8B | Qwen/Qwen3-Reranker-8B | 0.20 | 1.00 | | [View](https://www.siliconflow.com/models/qwen3-reranker-8b) |
| Qwen3-Embedding-8B | Qwen/Qwen3-Embedding-8B | 0.20 | 1.00 | | [View](https://www.siliconflow.com/models/qwen3-embedding-8b) |
| Qwen3-Reranker-4B | Qwen/Qwen3-Reranker-4B | 0.20 | 1.00 | | [View](https://www.siliconflow.com/models/qwen3-reranker-4b) |
| Qwen3-Embedding-4B | Qwen/Qwen3-Embedding-4B | 0.20 | 1.00 | | [View](https://www.siliconflow.com/models/qwen3-embedding-4b) |
| Qwen3-Reranker-0.6B | Qwen/Qwen3-Reranker-0.6B | 0.20 | 1.00 | | [View](https://www.siliconflow.com/models/qwen3-reranker-0-6b) |
| Qwen3-Embedding-0.6B | Qwen/Qwen3-Embedding-0.6B | 0.20 | 1.00 | | [View](https://www.siliconflow.com/models/qwen3-embedding-0-6b) |
| FLUX.1-Kontext-pro | black-forest-labs/FLUX.1-Kontext-pro | | | | |
| FLUX.1-Kontext-max | black-forest-labs/FLUX.1-Kontext-max | | | | |
| FLUX-1.1-pro-Ultra | black-forest-labs/FLUX-1.1-pro-Ultra | | | | |
| FLUX-1.1-pro | black-forest-labs/FLUX-1.1-pro | | | | |
| FLUX.1-Kontext-dev | black-forest-labs/FLUX.1-Kontext-dev | 0.20 | 1.00 | | [View](https://www.siliconflow.com/models/flux-1-kontext-dev) |
| FLUX.1-schnell | black-forest-labs/FLUX.1-schnell | 0.20 | 1.00 | | [View](https://www.siliconflow.com/models/flux-1-schnell) |
| FLUX.1-dev | black-forest-labs/FLUX.1-dev | 0.20 | 1.00 | | [View](https://www.siliconflow.com/models/flux-1-dev) |
| GLM-4.5 | z-ai/GLM-4.5 | | | | |
| GLM-4.5-Air | z-ai/GLM-4.5-Air | 0.14 | 0.86 | | [View](https://www.siliconflow.com/models/glm-4-5-air) |
| Kimi-K2.5 | moonshotai/Kimi-K2.5 | 0.55 | 3.00 | | [View](https://www.siliconflow.com/models/kimi-k2-5) |
| Step-3.5-Flash | stepfun-ai/Step-3.5-Flash | 0.10 | 0.30 | | [View](https://www.siliconflow.com/models/step-3-5-flash) |
| GLM-5 | zai-org/GLM-5 | 0.75 | 2.55 | | [View](https://www.siliconflow.com/models/glm-5) |
| MiniMax-M2.5 | MiniMaxAI/MiniMax-M2.5 | 0.20 | 1.00 | | [View](https://www.siliconflow.com/models/minimax-m2-5) |

---

[← Back to all providers](/llm.txt)
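
## Usage Sketch

As a rough illustration of how a model from the table above can be called, here is a minimal Python sketch that builds an OpenAI-style chat completion request. Note the assumptions: this page does not document the API itself, so the base URL `https://api.siliconflow.com/v1`, the `SILICONFLOW_API_KEY` environment variable, and the OpenAI-compatible request shape are conventions assumed here, not taken from this listing. Models are addressed by the "Original Name" column (e.g. `Qwen/Qwen3-8B`).

```python
import json
import os
import urllib.request

# Assumed OpenAI-compatible base URL; check SiliconFlow's API docs for the
# authoritative endpoint before using this in practice.
API_BASE = "https://api.siliconflow.com/v1"


def build_chat_request(model: str, prompt: str) -> dict:
    """Build an OpenAI-style chat completion payload.

    `model` should be an "Original Name" value from the table above.
    """
    return {
        "model": model,
        "messages": [{"role": "user", "content": prompt}],
        "max_tokens": 128,
    }


def send_chat_request(payload: dict, api_key: str) -> dict:
    """POST the payload to the assumed chat completions endpoint."""
    req = urllib.request.Request(
        f"{API_BASE}/chat/completions",
        data=json.dumps(payload).encode("utf-8"),
        headers={
            "Authorization": f"Bearer {api_key}",
            "Content-Type": "application/json",
        },
    )
    with urllib.request.urlopen(req) as resp:
        return json.load(resp)


payload = build_chat_request("Qwen/Qwen3-8B", "Say hello in one sentence.")
# Only send when a key is configured; otherwise just inspect the payload.
if os.environ.get("SILICONFLOW_API_KEY"):
    print(send_chat_request(payload, os.environ["SILICONFLOW_API_KEY"]))
```

Per-token pricing applies to the `messages` input and the generated output separately, which is why the table lists distinct input and output prices per million tokens.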