
Cerebras


Provider Information
Slug cerebras
Updated 3 hours ago
Links
Website https://www.cerebras.ai/

Cerebras Systems is known for its Wafer-Scale Engine (WSE), the largest chip ever built, designed for massively parallel processing. Its CS-2 systems use the WSE for both training and inference, providing high-performance AI compute and specialized LLM inference services. Cerebras offers inference through Cerebras Cloud as well as on-premises hardware, with optimized serving for models such as Meta's Llama family. The company focuses on delivering exceptional inference performance for large language models, particularly for enterprises that need high throughput and low latency in production AI deployments.

Models (3)

Organization               Model Name                      Provider Model                    Input ($/1M)   Output ($/1M)   Free   Weekly
Zhipu AI                   GLM-4.7                         zai-glm-4.7                       $0.00          $0.00
OpenAI                     GPT OSS 120B                    gpt-oss-120b                      $0.25          $0.69
Alibaba Cloud / Qwen Team  Qwen3-235B-A22B-Instruct-2507   qwen-3-235b-a22b-instruct-2507    $0.60          $1.20
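
The "Provider Model" column lists the identifiers typically passed as the model parameter when calling the provider. As a minimal sketch (not taken from this page), assuming Cerebras Cloud exposes an OpenAI-compatible chat completions endpoint at https://api.cerebras.ai/v1, that the API key is read from a CEREBRAS_API_KEY environment variable, and that the listed gpt-oss-120b identifier is accepted as a model ID, a request might look like this:

```python
# Hypothetical sketch: calling an OpenAI-compatible chat completions endpoint.
# Assumptions not confirmed by this page: the base URL, the environment
# variable name, and that "gpt-oss-120b" is the model ID the API expects.
import os
from openai import OpenAI

client = OpenAI(
    base_url="https://api.cerebras.ai/v1",
    api_key=os.environ["CEREBRAS_API_KEY"],
)

response = client.chat.completions.create(
    model="gpt-oss-120b",
    messages=[{"role": "user", "content": "Summarize what a wafer-scale engine is."}],
    max_tokens=256,
)

print(response.choices[0].message.content)
```

The same pattern would apply to the other listed identifiers, subject to the provider's actual model naming; consult the Cerebras documentation for the authoritative endpoint and model list.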