# Ollama Cloud

Ollama is a platform that enables users to run large language models locally on their own hardware, providing complete data privacy and security without requiring cloud services or API keys. It offers an OpenAI-compatible RESTful API for easy integration and supports a wide variety of open-source models, including Llama, OLMo, and many others. Ollama enables multimodal support for text chat, PDF integration with RAG (Retrieval Augmented Generation), voice chat, and image-based interactions. The platform eliminates API costs, works offline after the initial model download, and is ideal for privacy-sensitive work and local prototyping.

## Provider Information

- **Website**:
- **Available Models**: 16

## Models

| Name | Original Name | $ Input Price (per 1M) | $ Output Price (per 1M) | Free | Link |
|------|---------------|------------------------|-------------------------|------|------|
| glm-4.6 | glm-4.6 | | | | [View](https://ollama.com/library/glm-4.6) |
| glm-4.7 | glm-4.7 | | | | [View](https://ollama.com/library/glm-4.7) |
| kimi-k2 | kimi-k2:1t | | | | [View](https://ollama.com/library/kimi-k2:1t) |
| kimi-k2-thinking | kimi-k2-thinking | | | | [View](https://ollama.com/library/kimi-k2-thinking) |
| qwen3-coder | qwen3-coder:480b | | | | [View](https://ollama.com/library/qwen3-coder:480b) |
| deepseek-v3.2 | deepseek-v3.2 | | | | [View](https://ollama.com/library/deepseek-v3.2) |
| deepseek-v3.1 | deepseek-v3.1:671b | | | | [View](https://ollama.com/library/deepseek-v3.1:671b) |
| minimax-m2 | minimax-m2 | | | | [View](https://ollama.com/library/minimax-m2) |
| minimax-m2.1 | minimax-m2.1 | | | | [View](https://ollama.com/library/minimax-m2.1) |
| mistral-large-3 | mistral-large-3:675b | | | | [View](https://ollama.com/library/mistral-large-3:675b) |
| gemini-3-flash-preview | gemini-3-flash-preview | | | | [View](https://ollama.com/library/gemini-3-flash-preview) |
| kimi-k2.5 | kimi-k2.5 | | | | [View](https://ollama.com/library/kimi-k2.5) |
| devstral-2 | devstral-2:123b | | | | [View](https://ollama.com/library/devstral-2:123b) |
| qwen3-coder-next | qwen3-coder-next | | | | [View](https://ollama.com/library/qwen3-coder-next) |
| glm-5 | glm-5 | | | | [View](https://ollama.com/library/glm-5) |
| minimax-m2.5 | minimax-m2.5 | | | | [View](https://ollama.com/library/minimax-m2.5) |

---

[← Back to all providers](/llm.txt)
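Because Ollama exposes an OpenAI-compatible REST API (by default at `http://localhost:11434`), the models listed above can be queried with a plain HTTP request once pulled. Below is a minimal sketch using only the Python standard library; it assumes a locally running Ollama server and uses `kimi-k2` purely as an illustrative model name.

```python
import json
import urllib.request

# Default local endpoint for Ollama's OpenAI-compatible chat API.
OLLAMA_URL = "http://localhost:11434/v1/chat/completions"

def build_chat_request(model: str, prompt: str) -> dict:
    """Build an OpenAI-style chat-completions payload for Ollama."""
    return {
        "model": model,
        "messages": [{"role": "user", "content": prompt}],
        "stream": False,  # return a single JSON response instead of a stream
    }

def chat(model: str, prompt: str) -> str:
    """POST the request to a local Ollama server and return the reply text."""
    payload = json.dumps(build_chat_request(model, prompt)).encode("utf-8")
    req = urllib.request.Request(
        OLLAMA_URL, data=payload, headers={"Content-Type": "application/json"}
    )
    with urllib.request.urlopen(req) as resp:
        body = json.load(resp)
    # OpenAI-compatible responses carry the text under choices[0].message.content.
    return body["choices"][0]["message"]["content"]

# Example (requires a running server and a pulled model, e.g. `ollama pull kimi-k2`):
# print(chat("kimi-k2", "Say hello in one short sentence."))
```

The same payload works against any OpenAI-compatible client library by pointing its base URL at the local server, which is what makes drop-in integration possible without an API key.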