inclusionAI: Ring-1T

Model Information
Slug: ring-1t
Organization: InclusionAI
Model Description
Ring-1T is a trillion-parameter sparse mixture-of-experts (MoE) thinking model developed by inclusionAI. It adopts the Ling 2.0 architecture and is trained from the Ling-1T-base foundation model, which has 1 trillion total parameters with 50 billion activated per token, and supports a context window of up to 128K tokens. Building on the preview version released at the end of September, Ring-1T has undergone continued scaling with large-scale reinforcement learning with verifiable rewards (RLVR), further unlocking the natural-language reasoning capabilities of the trillion-parameter foundation model.
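The "1 trillion total / 50 billion activated" split comes from sparse MoE routing: a router scores every expert per token but only the top-k experts actually run. The sketch below is purely illustrative and not Ring-1T's implementation; the expert count and top-k value are hypothetical, chosen only to mirror the roughly 5% activation ratio.

```python
# Illustrative top-k expert routing for a sparse MoE layer (NOT Ring-1T's
# actual code): each token activates only k of the available experts, so
# active parameters are a small fraction of total parameters.

def top_k_route(router_scores, k):
    """Return indices of the k highest-scoring experts for one token."""
    ranked = sorted(range(len(router_scores)),
                    key=lambda i: router_scores[i], reverse=True)
    return sorted(ranked[:k])

# Hypothetical layer size; the real expert count is not stated here.
NUM_EXPERTS = 64
TOP_K = 3  # ~5% of experts, echoing 50B active out of 1T total

scores = [((i * 37) % 97) / 97.0 for i in range(NUM_EXPERTS)]  # stand-in logits
active = top_k_route(scores, TOP_K)
print(f"active experts: {active} "
      f"({TOP_K}/{NUM_EXPERTS} = {TOP_K / NUM_EXPERTS:.1%} per token)")
```

Only the selected experts' weights participate in the forward pass for that token, which is why inference cost tracks activated rather than total parameters.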
Available at 4 Providers

| Provider | Type | Model Name | Original Model | Input ($/1M) | Output ($/1M) |
|----------|------|------------|----------------|--------------|---------------|
| AIHubMix | | Ring-1T | inclusionAI/Ring-1T | $0.55 | $2.19 |
| ZenMUX | | inclusionAI: Ring-1T | inclusionai/ring-1t | $0.56 | $2.24 |
| Bailing | | Ring-1T | Ring-1T | $0.57 | $2.29 |
| Arena AI | Chat | ring-1t | | - | - |
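With per-million-token rates like those above, the cost of a single request is a simple linear sum. A minimal sketch, using the AIHubMix prices from the table ($0.55 input / $2.19 output) and a hypothetical long-context request:

```python
# Estimate the dollar cost of one request at $/1M-token rates.

def request_cost(input_tokens, output_tokens, in_price_per_m, out_price_per_m):
    """Cost in dollars: tokens scaled to millions, times the per-million rate."""
    return (input_tokens / 1_000_000 * in_price_per_m
            + output_tokens / 1_000_000 * out_price_per_m)

# Hypothetical request: 100K input tokens (near the 128K context limit)
# and 4K output tokens, priced at AIHubMix's listed rates.
cost = request_cost(100_000, 4_000, 0.55, 2.19)
print(f"${cost:.4f}")  # input: $0.0550, output: $0.00876
```

Output tokens dominate cost per token (roughly 4x the input rate here), so long reasoning traces from a thinking model weigh more heavily than long prompts.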