# Phi-4-Mini

## Model Information

- **Organization**: [Microsoft](/llm.txt)
- **Slug**: phi-4-mini-instruct
- **Available at Providers**: 5

## Providers

| Provider | Name | $ Input (per 1M) | $ Output (per 1M) | Free | Link |
|----------|------|------------------|-------------------|------|------|
| [Nvidia](/llm/nvidia.txt) | phi-4-mini-instruct | 0.00 | 0.00 | Yes | [View](https://build.nvidia.com/microsoft/phi-4-mini-instruct) |
| [GitHub Models](/llm/githubmodels.txt) | Phi-4-mini-instruct | 0.00 | 0.00 | Yes | |
| [Weights & Biases](/llm/wandb.txt) | Phi-4-mini-instruct | 0.08 | 0.35 | | |
| [Nano-GPT](/llm/nanogpt.txt) | Phi 4 Mini | | | | |
| [Yupp](/llm/yupp.txt) | Phi 4 Mini Instruct (Azure) | | | | |

---

[← Back to all providers](/llm.txt)
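Prices in the table are quoted per 1M tokens, so the cost of a request is `tokens / 1_000_000 × price` summed over input and output. A minimal sketch (the function name and the example token counts are illustrative, not part of any provider API; the rates are the Weights & Biases row above):

```python
def request_cost_usd(input_tokens: int, output_tokens: int,
                     usd_in_per_1m: float, usd_out_per_1m: float) -> float:
    """Cost of one request given per-1M-token prices."""
    return (input_tokens / 1_000_000) * usd_in_per_1m \
         + (output_tokens / 1_000_000) * usd_out_per_1m

# Example: 10k input + 2k output tokens at the Weights & Biases rates
# ($0.08 input / $0.35 output per 1M tokens) from the table above.
cost = request_cost_usd(10_000, 2_000, 0.08, 0.35)
print(f"${cost:.4f}")
```

Providers with 0.00 rates (or a blank price, meaning unlisted) naturally fall out of the same formula as free or unknown cost.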