Model Information
| Slug | jamba-1-5-large |
|---|---|
| Release Date | August 22, 2024 |
| GPQA | 0.369 |
Organization
| Name | AI21 |
|---|---|
| Website | https://www.ai21.com/ |
Model Description
Jamba 1.5 Large is part of AI21's family of open models, designed for speed, efficiency, and output quality.
It features a 256K effective context window, the longest among open models, enabling improved performance on tasks like document summarization and analysis.
Built on a novel SSM-Transformer architecture, it outperforms larger models like Llama 3.1 70B on benchmarks while maintaining resource efficiency.
Read their [announcement](https://www.ai21.com/blog/announcing-jamba-model-family) to learn more.
Available at 1 Provider
| Provider | Type | Model Name | Original Model | Input ($/1M) | Output ($/1M) | Free |
|---|---|---|---|---|---|---|
| Writingmate | Chat, Code | AI21: Jamba 1.5 Large | ai21/jamba-1-5-large | - | - | - |
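The `ai21/jamba-1-5-large` slug in the provider table is the identifier you would pass as the `model` field when calling the provider's API. As a minimal sketch, assuming the provider exposes an OpenAI-compatible chat completions endpoint (the base URL, path, and auth header here are illustrative placeholders, not confirmed by this page):

```python
import json

# Illustrative request payload for an OpenAI-compatible chat completions
# endpoint. Only the model slug comes from the provider table above;
# the endpoint URL and header names are assumptions.
BASE_URL = "https://example-provider.invalid/v1/chat/completions"  # placeholder

payload = {
    "model": "ai21/jamba-1-5-large",  # slug from the provider table
    "messages": [
        {"role": "user", "content": "Summarize this document in one sentence."}
    ],
    "max_tokens": 256,
}

headers = {
    "Authorization": "Bearer YOUR_API_KEY",  # placeholder credential
    "Content-Type": "application/json",
}

# Serialize the request body; an HTTP client such as `requests` would
# POST this to BASE_URL with the headers above.
body = json.dumps(payload)
print(body)
```

The long 256K context window means large documents can often be sent in a single request rather than chunked, subject to the provider's own limits.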