# GLM-5

GLM-5 is Z.ai’s flagship open-source foundation model engineered for complex systems design and long-horizon agent workflows. Built for expert developers, it delivers production-grade performance on large-scale programming tasks, rivaling leading closed-source models. With advanced agentic planning, deep backend reasoning, and iterative self-correction, GLM-5 moves beyond code generation to full-system construction and autonomous execution.

## Model Information

- **Organization**: [Z.ai](/llm.txt)
- **Slug**: glm-5
- **Available at Providers**: 50
- **Release Date**: February 11, 2026

### Benchmark Scores

- Weekly: 1.22
- SWE-bench: 0.778
- BrowseComp: 0.759

## Providers

| Provider | Name | $ Input (per 1M tokens) | $ Output (per 1M tokens) | Free | Link |
|----------|------|-------------------------|--------------------------|------|------|
| [RedPill](/llm/redpill.txt) | Z.AI: GLM 5 | 1.20 | 3.50 | | |
| [Z.AI](/llm/zai.txt) | GLM-5 | 1.00 | 0.20 | | |
| [Kilo Code](/llm/kilocode.txt) | Z.ai: GLM 5 | 0.80 | 2.56 | | [View](https://kilo.ai/models/z-ai/glm-5) |
| [Nano-GPT](/llm/nanogpt.txt) | GLM 5 Original | | | | |
| [Novita AI](/llm/novita.txt) | glm-5 | 1.00 | 3.20 | | |
| [OpenRouter](/llm/openrouter.txt) | GLM 5 | 0.80 | 2.56 | | [View](https://openrouter.ai/z-ai/glm-5-20260211) |
| [Requesty](/llm/requesty.txt) | | 1.00 | 3.20 | | |
| [Vercel AI Gateway](/llm/vercel.txt) | GLM 5 | 1.00 | 3.20 | | |
| [Yupp](/llm/yupp.txt) | GLM 5 (Z.ai) | | | | |
| [Zhipu AI](/llm/zhipuai.txt) | GLM-5 | 1.00 | 3.20 | | |
| [Arena AI](/llm/arenaai.txt) | | | | | |
| [DeepInfra](/llm/deepinfra.txt) | GLM-5 | 0.80 | 2.56 | | |
| [Kilo Code](/llm/kilocode.txt) | Z.ai: GLM 5 (free) | 0.00 | 0.00 | Yes | [View](https://kilo.ai/models/z-ai/glm-5) |
| [Hugging Face](/llm/huggingface.txt) | GLM-5 | 1.00 | 3.20 | | |
| [Friendli](/llm/friendli.txt) | GLM 5 | 1.00 | 3.20 | | |
| [Nano-GPT](/llm/nanogpt.txt) | GLM 5 | | | | |
| [Nano-GPT](/llm/nanogpt.txt) | GLM 5 Thinking | | | | |
| [Nano-GPT](/llm/nanogpt.txt) | GLM 5 TEE | | | | |
| [Ollama Cloud](/llm/ollama.txt) | glm-5 | | | | [View](https://ollama.com/library/glm-5) |
| [302.AI](/llm/302ai.txt) | glm-5 | 0.60 | 2.60 | | [View](https://302ai-en.apifox.cn/207705115e0) |
| [Yupp](/llm/yupp.txt) | GLM 5 (Novita) | | | | |
| [Yupp](/llm/yupp.txt) | GLM 5 (OpenRouter) | | | | |
| [ZenMUX](/llm/zenmux.txt) | Z.AI: GLM 5 | 0.58 | 2.60 | | |
| [Modal](/llm/modal.txt) | GLM-5 | 0.00 | 0.00 | Yes | [View](https://modal.com/glm-5-endpoint) |
| [LLM Stats](/llm/llmstats.txt) | GLM-5 | | | | |
| [Parasail](/llm/parasail.txt) | Glm 5 | 1.00 | 3.20 | | [View](https://www.saas.parasail.io/pricing) |
| [Roo Code](/llm/roocode.txt) | GLM 5 | 1.00 | 3.20 | | [View](https://roocode.com) |
| [Venice](/llm/venice.txt) | GLM 5 | 1.00 | 3.20 | | |
| [Poe](/llm/poe.txt) | GLM-5 | | | | [View](https://poe.com/glm-5/api) |
| [SiliconFlow](/llm/siliconflow.txt) | GLM-5 | 1.00 | 3.20 | | [View](https://www.siliconflow.com./models/glm-5) |
| [AIHubMix](/llm/aihubmix.txt) | glm-5 | 0.88 | 2.82 | | [View](https://aihubmix.com/model/glm-5) |
| [AIHubMix](/llm/aihubmix.txt) | coding-glm-5-free | 0.00 | 0.00 | Yes | [View](https://aihubmix.com/model/coding-glm-5-free) |
| [AIHubMix](/llm/aihubmix.txt) | cc-glm-5 | 0.06 | 0.22 | | [View](https://aihubmix.com/model/cc-glm-5) |
| [AIHubMix](/llm/aihubmix.txt) | coding-glm-5 | 0.06 | 0.22 | | [View](https://aihubmix.com/model/coding-glm-5) |
| [Routeway](/llm/routeway.txt) | Z.ai: GLM 5 | 0.88 | 2.82 | | [View](https://routeway.ai/models) |
| [Okara](/llm/okara.txt) | GLM 5 | | | | [View](https://okara.ai/ai-models/glm-5) |
| [CommonStack](/llm/commonstack.txt) | GLM-5 | 1.00 | 3.20 | | |
| [RouterLink](/llm/routerlink.txt) | GLM 5 | | | | |
| [AIMLAPI](/llm/aimlapi.txt) | GLM 5 | 1.30 | 4.16 | | |
| [NetMind](/llm/netmind.txt) | GLM-5 | 1.00 | 3.20 | | |
| [Glama](/llm/glama.txt) | glm-5 | 131.00 | 1.00 | | [View](https://glama.ai/gateway/models/glm-5) |
| [Together AI](/llm/togetherai.txt) | GLM-5-FP4 | 1.00 | 3.20 | | |
| [302.AI](/llm/302ai.txt) | Pro/zai-org/GLM-5 | 0.57 | 2.58 | | [View](https://302ai-en.apifox.cn/api-252564719) |
| [Blackbox AI](/llm/blackboxai.txt) | blackboxai/z-ai/glm-5 | | | | |
| [Yupp](/llm/yupp.txt) | GLM 5 (Together AI) | | | | |
| [CometAPI](/llm/cometapi.txt) | GLM 5 | 0.67 | 2.69 | | |
| [SiliconFlow (China)](/llm/siliconflowcn.txt) | Pro/zai-org/GLM-5 | 1.00 | 3.20 | | |
| [Google Vertex AI](/llm/googlevertex.txt) | Glm 5 | | | | |
| [Nvidia](/llm/nvidia.txt) | glm5 | | | | [View](https://build.nvidia.com/z-ai/glm5) |
| [MegaNova](/llm/meganova.txt) | GLM-5 | 0.80 | 2.56 | | [View](https://console.meganova.ai/serverless/zai-org/GLM-5) |
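Many of the gateways in the table above serve GLM-5 behind an OpenAI-compatible chat-completions API, so switching providers usually comes down to changing the base URL, API key, and model slug. Below is a minimal sketch of calling the model through OpenRouter; the slug `z-ai/glm-5` is an assumption inferred from the OpenRouter link above, the key is a placeholder, and the prompt is illustrative, so confirm the exact identifier and pricing on the provider's model page.

```python
# Minimal sketch: querying GLM-5 via OpenRouter's OpenAI-compatible endpoint.
# Assumptions: model slug "z-ai/glm-5" (inferred from the link above) and a
# placeholder API key; other gateways use their own base URLs and slugs.
from openai import OpenAI

client = OpenAI(
    base_url="https://openrouter.ai/api/v1",  # OpenRouter's OpenAI-compatible API
    api_key="YOUR_OPENROUTER_API_KEY",        # placeholder; supply your own key
)

response = client.chat.completions.create(
    model="z-ai/glm-5",  # assumed slug; check the provider's model page
    messages=[
        {"role": "user", "content": "Outline a service architecture for a URL shortener."},
    ],
)

print(response.choices[0].message.content)
```

The same pattern should carry over to the other OpenAI-compatible providers in the table by swapping in their base URL, key, and model identifier.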
---

[← Back to all providers](/llm.txt)