# ZenMUX

ZenMUX is an enterprise-grade AI model aggregation platform that serves as a unified API router for multiple AI providers. It supports 75+ providers with automatic failover, intelligent routing, error compensation mechanisms, and transparent cross-provider pricing. ZenMUX aims to reduce hallucination risk through intelligent routing, offers context-aware routing for different scenarios and long-context workloads, and provides provider insurance as a safety net for AI model reliability. The platform is built for enterprise use, with a focus on stability and rapid model deployment, and supports both local models and cloud-based providers.

## Provider Information

- **Website**: 
- **Available Models**: 102

## Models

| Name | Original Name | Input Price ($ per 1M tokens) | Output Price ($ per 1M tokens) | Free | Link |
|------|---------------|-------------------------------|--------------------------------|------|------|
| Z.AI: GLM 4.7 Flash (Free) | z-ai/glm-4.7-flash-free | 0.00 | 0.00 | Yes | |
| Z.AI: GLM 4.7 FlashX | z-ai/glm-4.7-flashx | 0.07 | 0.43 | | |
| DeepSeek: DeepSeek-V3.2 (Non-thinking Mode) | deepseek/deepseek-chat | 0.28 | 0.42 | | |
| DeepSeek: DeepSeek-V3.2 (Thinking Mode) | deepseek/deepseek-reasoner | 0.28 | 0.42 | | |
| Xiaomi: MiMo-V2-Flash Free | xiaomi/mimo-v2-flash-free | 0.00 | 0.00 | Yes | |
| Xiaomi: MiMo-V2-Flash | xiaomi/mimo-v2-flash | 0.10 | 0.30 | | |
| StepFun: Step-3 | stepfun/step-3 | 0.21 | 0.57 | | |
| Mistral: Mistral Large 3 | mistralai/mistral-large-2512 | 0.50 | 1.50 | | |
| Meta: Llama 4 Scout Instruct | meta/llama-4-scout-17b-16e-instruct | 0.08 | 0.40 | | |
| DeepSeek: DeepSeek-V3.2-Exp | deepseek/deepseek-v3.2-exp | 0.22 | 0.33 | | |
| Anthropic: Claude Opus 4.5 | anthropic/claude-opus-4.5 | 5.00 | 25.00 | | |
| DeepSeek: DeepSeek V3.2 | deepseek/deepseek-v3.2 | 0.28 | 0.43 | | |
| Meta: Llama 3.3 70B Instruct | meta/llama-3.3-70b-instruct | 0.60 | 1.20 | | |
| xAI: Grok 4.1 Fast | x-ai/grok-4.1-fast | 0.20 | 0.50 | | |
| Google: Gemini 3 Pro Image (Nano Banana Pro) | google/gemini-3-pro-image-preview | 2.00 | 12.00 | | |
| Google: Gemini 3 Pro Preview | google/gemini-3-pro-preview | 2.00 | 12.00 | | |
| Baidu: ERNIE-X1.1-Preview | baidu/ernie-x1.1-preview | 0.14 | 0.56 | | |
| Baidu: ERNIE 5.0 | baidu/ernie-5.0-thinking-preview | 0.84 | 3.37 | | |
| Qwen: Qwen3 Max Thinking Preview | qwen/qwen3-max-preview | 1.20 | 6.00 | | |
| MiniMax: MiniMax M2.1 | minimax/minimax-m2.1 | 0.30 | 1.20 | | |
| MiniMax: MiniMax M2 | minimax/minimax-m2 | 0.30 | 1.20 | | |
| Z.AI: GLM 4.6V FlashX | z-ai/glm-4.6v-flash | 0.02 | 0.21 | | |
| Z.AI: GLM 4.6V | z-ai/glm-4.6v | 0.14 | 0.42 | | |
| Z.AI: GLM 4.6 | z-ai/glm-4.6 | 0.35 | 1.54 | | |
| Z.AI: GLM 4.7 | z-ai/glm-4.7 | 0.28 | 1.14 | | |
| Z.AI: GLM 4.5 | z-ai/glm-4.5 | 0.35 | 1.54 | | |
| Z.AI: GLM 4.5 Air | z-ai/glm-4.5-air | 0.11 | 0.56 | | |
| Z.AI: GLM 4.6V Flash (Free) | z-ai/glm-4.6v-flash-free | 0.00 | 0.00 | Yes | |
| MoonshotAI: Kimi K2 0905 | moonshotai/kimi-k2-0905 | 0.60 | 2.50 | | |
| MoonshotAI: Kimi K2 Thinking | moonshotai/kimi-k2-thinking | 0.60 | 2.50 | | |
| MoonshotAI: Kimi K2 Thinking Turbo | moonshotai/kimi-k2-thinking-turbo | 1.15 | 8.00 | | |
| Qwen: Qwen3-Max-Thinking | qwen/qwen3-max | 1.20 | 6.00 | | |
| Qwen: Qwen3-VL-Plus | qwen/qwen3-vl-plus | 0.20 | 1.60 | | |
| Qwen: Qwen3-Coder-Plus | qwen/qwen3-coder-plus | 1.00 | 5.00 | | |
| xAI: Grok 4 | x-ai/grok-4 | 3.00 | 15.00 | | |
| xAI: Grok Code Fast 1 | x-ai/grok-code-fast-1 | 0.20 | 1.50 | | |
| xAI: Grok 4 Fast | x-ai/grok-4-fast | 0.20 | 0.50 | | |
| xAI: Grok 4 Fast Non Reasoning | x-ai/grok-4-fast-non-reasoning | 0.20 | 0.50 | | |
| xAI: Grok 4.1 Fast Non Reasoning | x-ai/grok-4.1-fast-non-reasoning | 0.20 | 0.50 | | |
| MoonshotAI: Kimi K2 0711 | moonshotai/kimi-k2-0711 | 0.56 | 2.23 | | |
| OpenAI: GPT-5 Chat | openai/gpt-5-chat | 1.25 | 10.00 | | |
| OpenAI: GPT-4.1 Mini | openai/gpt-4.1-mini | 0.40 | 1.60 | | |
| OpenAI: GPT-4o | openai/gpt-4o | 2.50 | 10.00 | | |
| OpenAI: GPT-4.1 Nano | openai/gpt-4.1-nano | 0.10 | 0.40 | | |
| OpenAI: o4 Mini | openai/o4-mini | 1.10 | 4.40 | | |
| OpenAI: GPT-4o-mini | openai/gpt-4o-mini | 0.15 | 0.60 | | |
| OpenAI: GPT-4.1 | openai/gpt-4.1 | 2.00 | 8.00 | | |
| OpenAI: GPT-5.2 Pro | openai/gpt-5.2-pro | 21.00 | 168.00 | | |
| OpenAI: GPT-5.2 | openai/gpt-5.2 | 1.75 | 14.00 | | |
| OpenAI: GPT-5 Nano | openai/gpt-5-nano | 0.05 | 0.40 | | |
| OpenAI: GPT-5.2 Chat | openai/gpt-5.2-chat | 1.75 | 14.00 | | |
| OpenAI: GPT-5 Mini | openai/gpt-5-mini | 0.25 | 2.00 | | |
| OpenAI: GPT-5.1 | openai/gpt-5.1 | 1.25 | 10.00 | | |
| OpenAI: GPT-5.1 Chat | openai/gpt-5.1-chat | 1.25 | 10.00 | | |
| OpenAI: GPT-5.1-Codex-Mini | openai/gpt-5.1-codex-mini | 0.25 | 2.00 | | |
| OpenAI: GPT-5.2-Codex | openai/gpt-5.2-codex | 1.75 | 14.00 | | |
| OpenAI: GPT-5 | openai/gpt-5 | 1.25 | 10.00 | | |
| OpenAI: GPT-5 Pro | openai/gpt-5-pro | 15.00 | 120.00 | | |
| OpenAI: GPT-5 Codex | openai/gpt-5-codex | 1.25 | 10.00 | | |
| OpenAI: GPT-5.1-Codex | openai/gpt-5.1-codex | 1.25 | 10.00 | | |
| Anthropic: Claude Opus 4 | anthropic/claude-opus-4 | 15.00 | 75.00 | | |
| Anthropic: Claude Opus 4.1 | anthropic/claude-opus-4.1 | 15.00 | 75.00 | | |
| Anthropic: Claude 3.5 Haiku | anthropic/claude-3.5-haiku | 0.80 | 4.00 | | |
| Anthropic: Claude 3.7 Sonnet | anthropic/claude-3.7-sonnet | 3.00 | 15.00 | | |
| Anthropic: Claude Sonnet 4.5 | anthropic/claude-sonnet-4.5 | 3.00 | 15.00 | | |
| Anthropic: Claude Haiku 4.5 | anthropic/claude-haiku-4.5 | 1.00 | 5.00 | | |
| Anthropic: Claude Sonnet 4 | anthropic/claude-sonnet-4 | 3.00 | 15.00 | | |
| Google: Gemini 3 Flash Preview | google/gemini-3-flash-preview | 0.50 | 3.00 | | |
| Google: Gemini 2.0 Flash | google/gemini-2.0-flash | 0.15 | 0.60 | | |
| Google: Gemini 2.0 Flash Lite | google/gemini-2.0-flash-lite-001 | 0.08 | 0.30 | | |
| Google: Gemini 2.5 Flash | google/gemini-2.5-flash | 0.30 | 2.50 | | |
| Google: Gemini 2.5 Flash Lite | google/gemini-2.5-flash-lite | 0.10 | 0.40 | | |
| Google: Gemini 2.5 Pro | google/gemini-2.5-pro | 1.25 | 10.00 | | |
| Qwen: Qwen3-Coder | qwen/qwen3-coder | 1.25 | 5.01 | | |
| inclusionAI: Ring-mini-2.0 | inclusionai/ring-mini-2.0 | 0.07 | 0.70 | | |
| DeepSeek: R1 0528 | deepseek/deepseek-r1-0528 | 0.56 | 2.23 | | |
| inclusionAI: Ling-mini-2.0 | inclusionai/ling-mini-2.0 | 0.07 | 0.28 | | |
| inclusionAI: Ling-flash-2.0 | inclusionai/ling-flash-2.0 | 0.28 | 2.80 | | |
| inclusionAI: Ring-flash-2.0 | inclusionai/ring-flash-2.0 | 0.28 | 2.80 | | |
| inclusionAI: Ring-1T | inclusionai/ring-1t | 0.56 | 2.24 | | |
| inclusionAI: LLaDA2-flash-CAP | inclusionai/llada2.0-flash-cap | 0.28 | 2.85 | | |
| inclusionAI: Ling-1T | inclusionai/ling-1t | 0.56 | 2.24 | | |
| Qwen: Qwen3 235B A22B Thinking 2507 | qwen/qwen3-235b-a22b-thinking-2507 | 0.28 | 2.78 | | |
| DeepSeek: DeepSeek V3.1 | deepseek/deepseek-chat-v3.1 | 0.28 | 1.11 | | |
| Qwen: Qwen3 14B | qwen/qwen3-14b | 0.14 | 1.40 | | |
| Qwen: Qwen3 235B A22B Instruct 2507 | qwen/qwen3-235b-a22b-2507 | 0.28 | 1.11 | | |
| Google: Gemma 3 12B | google/gemma-3-12b-it | 0.02 | 0.10 | | |
| MiniMax: MiniMax M2-her | minimax/minimax-m2-her | 0.30 | 1.20 | | |
| MoonshotAI: Kimi K2.5 | moonshotai/kimi-k2.5 | 0.58 | 3.02 | | |
| StepFun: Step 3.5 Flash | stepfun/step-3.5-flash | 0.10 | 0.30 | | |
| StepFun: Step 3.5 Flash (Free) | stepfun/step-3.5-flash-free | 0.00 | 0.00 | Yes | |
| Anthropic: Claude Opus 4.6 | anthropic/claude-opus-4.6 | 5.00 | 25.00 | | |
| OpenAI: GPT-Image-1.5 | openai/gpt-image-1.5 | 5.00 | 10.00 | | |
| Google: Gemini 2.5 Flash Image (Nano Banana) | google/gemini-2.5-flash-image | 0.30 | 2.50 | | |
| Z.AI: GLM 5 | z-ai/glm-5 | 0.58 | 2.60 | | |
| MiniMax: MiniMax M2.5 | minimax/minimax-m2.5 | 0.30 | 1.20 | | |
| Qwen: Qwen3.5-Plus | qwen/qwen3.5-plus | 0.40 | 2.40 | | |
| Anthropic: Claude Sonnet 4.6 | anthropic/claude-sonnet-4.6 | 3.00 | 15.00 | | |
| Google: Gemini 3.1 Pro Preview | google/gemini-3.1-pro-preview | 2.00 | 12.00 | | |
| OpenAI: GPT-5.3-Codex | openai/gpt-5.3-codex | 1.75 | 14.00 | | |
| Qwen: Qwen3.5-flash | qwen/qwen3.5-flash | 0.10 | 0.40 | | |
| Google: Gemini 3.1 Flash Image Preview | google/gemini-3.1-flash-image-preview | 0.25 | 1.50 | | |

---

[← Back to all providers](/llm.txt)
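The per-1M-token prices in the table convert to a per-request cost with simple arithmetic: `cost = (input_tokens / 1,000,000) × input_price + (output_tokens / 1,000,000) × output_price`. A minimal sketch of that calculation (the `estimate_cost` helper is illustrative, not part of any ZenMUX API; prices are copied from the table and actual billing may differ):

```python
def estimate_cost(input_tokens: int, output_tokens: int,
                  input_price_per_1m: float, output_price_per_1m: float) -> float:
    """Return the USD cost of one request at the given per-1M-token rates."""
    return (input_tokens / 1_000_000) * input_price_per_1m \
         + (output_tokens / 1_000_000) * output_price_per_1m

# Example: anthropic/claude-sonnet-4.5 at $3.00 input / $15.00 output per 1M tokens.
# 200k input tokens -> $0.60; 50k output tokens -> $0.75; total $1.35.
cost = estimate_cost(200_000, 50_000, 3.00, 15.00)
print(f"${cost:.2f}")  # → $1.35
```

The same formula applies to every row; free-tier models (marked "Yes") simply have both rates at 0.00.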