# Abacus

## Provider Information

- **Website**:
- **Available Models**: 51

## Models

| Name | Original Name | Input Price ($/1M tokens) | Output Price ($/1M tokens) | Free | Link |
|------|---------------|---------------------------|----------------------------|------|------|
| GPT-4.1 Nano | gpt-4.1-nano | 0.10 | 0.40 | | |
| Grok 4 Fast (Non-Reasoning) | grok-4-fast-non-reasoning | 0.20 | 0.50 | | |
| Gemini 2.0 Flash | gemini-2.0-flash-001 | 0.10 | 0.40 | | |
| Gemini 3 Flash Preview | gemini-3-flash-preview | 0.50 | 3.00 | | |
| Grok Code Fast 1 | grok-code-fast-1 | 0.20 | 1.50 | | |
| Kimi K2 Turbo Preview | kimi-k2-turbo-preview | 0.15 | 8.00 | | |
| Gemini 3 Pro Preview | gemini-3-pro-preview | 2.00 | 12.00 | | |
| Gemini 2.5 Flash | gemini-2.5-flash | 0.30 | 2.50 | | |
| GPT-4.1 Mini | gpt-4.1-mini | 0.40 | 1.60 | | |
| Claude Opus 4.5 | claude-opus-4-5-20251101 | 5.00 | 25.00 | | |
| Claude Sonnet 4.5 | claude-sonnet-4-5-20250929 | 3.00 | 15.00 | | |
| Grok 4 | grok-4-0709 | 3.00 | 15.00 | | |
| o3-mini | o3-mini | 1.10 | 4.40 | | |
| GPT-5.2 Chat Latest | gpt-5.2-chat-latest | 1.50 | 12.00 | | |
| GPT-5.1 | gpt-5.1 | 1.25 | 10.00 | | |
| GPT-5 Nano | gpt-5-nano | 0.05 | 0.40 | | |
| Claude Sonnet 4 | claude-sonnet-4-20250514 | 3.00 | 15.00 | | |
| GPT-4.1 | gpt-4.1 | 2.00 | 8.00 | | |
| o4-mini | o4-mini | 1.10 | 4.40 | | |
| Claude Opus 4 | claude-opus-4-20250514 | 15.00 | 75.00 | | |
| GPT-5 Mini | gpt-5-mini | 0.25 | 2.00 | | |
| o3-pro | o3-pro | 20.00 | 80.00 | | |
| Claude Sonnet 3.7 | claude-3-7-sonnet-20250219 | 3.00 | 15.00 | | |
| Gemini 2.5 Pro | gemini-2.5-pro | 1.25 | 10.00 | | |
| GPT-4o (2024-11-20) | gpt-4o-2024-11-20 | 2.50 | 10.00 | | |
| o3 | o3 | 2.00 | 8.00 | | |
| GPT-4o Mini | gpt-4o-mini | 0.15 | 0.60 | | |
| Qwen3 Max | qwen3-max | 1.20 | 6.00 | | |
| GPT-5 | gpt-5 | 1.25 | 10.00 | | |
| Grok 4.1 Fast (Non-Reasoning) | grok-4-1-fast-non-reasoning | 0.20 | 0.50 | | |
| Llama 3.3 70B Versatile | llama-3.3-70b-versatile | 0.59 | 0.79 | | |
| Claude Opus 4.1 | claude-opus-4-1-20250805 | 15.00 | 75.00 | | |
| GPT-5.2 | gpt-5.2 | 1.75 | 14.00 | | |
| GPT-5.1 Chat Latest | gpt-5.1-chat-latest | 1.25 | 10.00 | | |
| Claude Haiku 4.5 | claude-haiku-4-5-20251001 | 1.00 | 5.00 | | |
| DeepSeek V3.1 | deepseek/deepseek-v3.1 | 0.14 | 0.28 | | |
| GPT-OSS 120B | openai/gpt-oss-120b | 0.08 | 0.44 | | |
| Llama 3.1 8B Instruct | meta-llama/Meta-Llama-3.1-8B-Instruct | 0.02 | 0.05 | | |
| Llama 4 Maverick 17B 128E Instruct FP8 | meta-llama/Llama-4-Maverick-17B-128E-Instruct-FP8 | 0.14 | 0.59 | | |
| Llama 3.1 70B Instruct | meta-llama/Meta-Llama-3.1-70B-Instruct | 0.40 | 0.40 | | |
| Qwen3 235B A22B Instruct | Qwen/Qwen3-235B-A22B-Instruct-2507 | 0.13 | 0.60 | | |
| Qwen 2.5 72B Instruct | Qwen/Qwen2.5-72B-Instruct | 0.11 | 0.38 | | |
| QwQ 32B | Qwen/QwQ-32B | 0.40 | 0.40 | | |
| Qwen3 32B | Qwen/Qwen3-32B | 0.09 | 0.29 | | |
| Qwen3 Coder 480B A35B Instruct | Qwen/qwen3-coder-480b-a35b-instruct | 0.29 | 1.20 | | |
| GLM-4.7 | zai-org/glm-4.7 | 0.70 | 2.50 | | |
| GLM-4.5 | zai-org/glm-4.5 | 0.60 | 2.20 | | |
| GLM-4.6 | zai-org/glm-4.6 | 0.60 | 2.20 | | |
| DeepSeek R1 | deepseek-ai/DeepSeek-R1 | 3.00 | 7.00 | | |
| DeepSeek V3.1 Terminus | deepseek-ai/DeepSeek-V3.1-Terminus | 0.27 | 1.00 | | |
| DeepSeek V3.2 | deepseek-ai/DeepSeek-V3.2 | 0.27 | 0.40 | | |

---

[← Back to all providers](/llm.txt)
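
**Reading the price columns.** All prices are quoted in dollars per 1M tokens, so a request's cost is each token count divided by 1,000,000 times the matching price. The sketch below is a minimal illustration of that arithmetic, not part of any provider SDK: the `request_cost` helper and the token counts are hypothetical, and only the GPT-4.1 Nano prices ($0.10 input / $0.40 output) come from the table above.

```python
def request_cost(input_tokens: int, output_tokens: int,
                 input_price_per_m: float, output_price_per_m: float) -> float:
    """Dollar cost of one request, given per-1M-token prices."""
    return (input_tokens / 1_000_000) * input_price_per_m \
         + (output_tokens / 1_000_000) * output_price_per_m

# Hypothetical example: 12,000 input tokens and 800 output tokens
# on GPT-4.1 Nano (prices taken from the table above).
cost = request_cost(12_000, 800, input_price_per_m=0.10, output_price_per_m=0.40)
print(f"${cost:.6f}")  # -> $0.001520
```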