AI21 Labs has announced the availability of its Jamba 1.5 family of large language models in Amazon Bedrock. Both Jamba 1.5 Mini and Jamba 1.5 Large support a 256K-token context window, structured JSON output, function calling, and the ability to digest document objects.
Jamba 1.5 Large is optimised for complex reasoning tasks across all prompt lengths, while Jamba 1.5 Mini is tailored for low-latency processing of long prompts.
Both models offer a 256K-token context length and support multiple languages, including English, Spanish, French, Portuguese, Italian, Dutch, German, Arabic, and Hebrew.
In AI21's own testing, Jamba 1.5 delivered up to 2.5x faster inference on long contexts than similarly sized models.
Users can request access to the models via the Amazon Bedrock console and can experiment with them in the chat and text playgrounds.
Potential use cases for the Jamba 1.5 family of models include paired document analysis, compliance analysis, and question answering for long documents.
AI21 Labs is a leader in building foundation models and AI systems for the enterprise; its collaboration with AWS lets customers leverage LLMs in a secure environment.
These models can help businesses process information, communicate, and learn, ultimately driving innovation in business, industry, and beyond.
Customers can follow the provided code example to access available models using AWS SDKs and build their applications using various programming languages, while AI21's documentation offers further guidance.
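As a minimal sketch of that SDK path, the snippet below builds a chat-style request body for Jamba 1.5 and invokes it through the `bedrock-runtime` client in boto3. The model IDs, the message schema, and the shape of the response payload are assumptions based on AI21's published format; verify them against the current AI21 and Amazon Bedrock documentation before relying on them.

```python
import json

# Model IDs as they appear in the Bedrock model list (assumed; confirm in your Region).
JAMBA_MINI = "ai21.jamba-1-5-mini-v1:0"
JAMBA_LARGE = "ai21.jamba-1-5-large-v1:0"


def build_jamba_request(prompt: str, max_tokens: int = 512,
                        temperature: float = 0.7) -> str:
    """Build the JSON request body for Jamba 1.5 via invoke_model.

    The chat-style "messages" schema here follows AI21's documented format
    (an assumption; check the current docs for your model version).
    """
    body = {
        "messages": [{"role": "user", "content": prompt}],
        "max_tokens": max_tokens,
        "temperature": temperature,
    }
    return json.dumps(body)


def ask_jamba(prompt: str, model_id: str = JAMBA_MINI) -> str:
    """Send a prompt to Amazon Bedrock. Requires AWS credentials and model access."""
    import boto3  # imported here so the payload builder stays dependency-free

    client = boto3.client("bedrock-runtime", region_name="us-east-1")
    response = client.invoke_model(modelId=model_id,
                                   body=build_jamba_request(prompt))
    payload = json.loads(response["body"].read())
    # Response shape assumed to follow AI21's chat-completion format.
    return payload["choices"][0]["message"]["content"]
```

After requesting model access in the Bedrock console, a call such as `ask_jamba("Summarize this contract clause: ...")` would return the model's reply; swapping in `JAMBA_LARGE` targets the larger model for more complex reasoning tasks.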
The Jamba 1.5 models are now generally available in Amazon Bedrock in the US East (N. Virginia) AWS Region.