AI21 Jamba-Instruct launches on Azure AI Models-as-a-Service (2024)

In collaboration with AI21, Microsoft is excited to announce that AI21’s Jamba-Instruct foundation chat completion model is now available as a serverless API through Azure AI’s Models-as-a-Service. This marks a new collaboration between AI21 and Microsoft, and the first time Jamba-Instruct is widely available through a cloud partner.

“We are thrilled to collaborate with Microsoft to introduce Jamba-Instruct to Microsoft Azure customers. This collaboration underscores our shared commitment to empowering enterprises with innovative AI solutions,” said Pankaj Dugar, SVP and GM of North America at AI21. “Microsoft's dedication to putting customers first and making cutting-edge technology accessible aligns perfectly with our mission at AI21. With Jamba-Instruct available on Azure AI, customers now have access to a transformative tool that will revolutionize how they approach language processing and drive unprecedented levels of efficiency and insight.”


Azure AI provides a diverse array of advanced and user-friendly models, enabling customers to select the model that best fits their use case. Over the past eight months, we've expanded our model catalog through partnerships with leading generative AI model providers to release their models, and through Microsoft Research's release of Phi-3. The Azure AI model catalog lets developers select from more than 1,600 foundation models, including LLMs, SLMs, chat completion, and multimodal models from industry leaders like AI21, Cohere, Databricks, Deci AI, Hugging Face, Microsoft Research, Mistral AI, NVIDIA, OpenAI, and Stability AI. This extensive selection ensures that Azure customers can find the most suitable model for their unique use case.

The Jamba-Instruct model

By offering Jamba-Instruct as part of the Models as a Service (MaaS) offering, both AI21 and Microsoft affirm their commitment to giving enterprise developers greater choice in models for building and scaling generative AI applications through popular LLM developer tools such as Azure AI prompt flow, the common Azure AI model inference API, LangChain, and AI21's Azure client.
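
As an illustration, here is a minimal sketch of calling a Jamba-Instruct serverless deployment with the Azure AI Inference client library for Python (the azure-ai-inference package). The endpoint URL, key, prompt, and parameter values are placeholder assumptions; copy the real values from your deployment's details page rather than treating this as a definitive integration.

```python
# Minimal sketch: chat completion against a Jamba-Instruct serverless endpoint
# via the azure-ai-inference Python SDK. Endpoint and key below are placeholders.
from azure.ai.inference import ChatCompletionsClient
from azure.ai.inference.models import SystemMessage, UserMessage
from azure.core.credentials import AzureKeyCredential

client = ChatCompletionsClient(
    endpoint="https://<your-jamba-instruct-deployment>.inference.ai.azure.com",
    credential=AzureKeyCredential("<your-api-key>"),
)

response = client.complete(
    messages=[
        SystemMessage(content="You are a concise assistant for enterprise documents."),
        UserMessage(content="Summarize the key obligations in the contract excerpt above."),
    ],
    temperature=0.4,
    max_tokens=512,
)

print(response.choices[0].message.content)
```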

According to AI21, Jamba-Instruct is its most recent and most powerful model to date. Built as a production-grade Mamba-based model, Jamba-Instruct leverages its hybrid architecture to achieve strong performance, quality, and cost efficiency. In parallel, Azure AI offers a secure and compliant platform for setting up and running AI solutions, along with a wide range of tools and services for creating, refining, and testing AI models. Additionally, Azure AI Content Safety offers filters that screen for harmful content generated by the model, helping developers build safe and trustworthy applications.

Features of Jamba-Instruct

As outlined by AI21, Jamba-Instruct stands out for the following:

  • Unprecedented context window length: Offering a 70K context window, Jamba-Instruct is built to handle long-context tasks effectively. From streamlining document comprehension to enabling more robust and sophisticated retrieval-augmented generation (RAG) mechanisms, Jamba-Instruct’s context window opens new opportunities for powerful GenAI workflows.
  • Superior performance on long context use cases: According to AI21, Jamba-Instruct performs strongly on long-context benchmarks, including question answering (QA) on earnings call transcripts and extracting key insights from lengthy legal documents (a long-document QA sketch follows this list). For results on long-context QA benchmarks for AI21’s Jamba base model, check out the whitepaper.
  • Cost-efficient processing: Due to its unique hybrid architecture, Jamba-Instruct handles lengthy context on a smaller cloud footprint, according to AI21.
  • Value for cost: According to AI21, Jamba-Instruct is highly competitive in common quality benchmarks, and its pricing makes it an intuitive choice for enterprises looking to build and scale GenAI applications while optimizing for cost. Simply put, Jamba-Instruct delivers high value for its cost among its size class.
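
To make the long-context point concrete, the sketch below sends an entire transcript in a single prompt for question answering. The endpoint, key, file name, and question are illustrative assumptions, and the transcript must fit within the model's context window.

```python
# Illustrative long-context QA: send an entire transcript in one prompt.
# Endpoint, key, and "earnings_call_q2.txt" are placeholders.
from azure.ai.inference import ChatCompletionsClient
from azure.ai.inference.models import SystemMessage, UserMessage
from azure.core.credentials import AzureKeyCredential

client = ChatCompletionsClient(
    endpoint="https://<your-jamba-instruct-deployment>.inference.ai.azure.com",
    credential=AzureKeyCredential("<your-api-key>"),
)

with open("earnings_call_q2.txt", encoding="utf-8") as f:
    transcript = f.read()  # must fit within the model's context window

response = client.complete(
    messages=[
        SystemMessage(content="Answer strictly from the provided transcript."),
        UserMessage(content=f"Transcript:\n{transcript}\n\n"
                            "Question: What revenue guidance was given for next quarter?"),
    ],
    temperature=0.0,
    max_tokens=300,
)
print(response.choices[0].message.content)
```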

Accelerating time to production

By utilizing the instruction-tuned Jamba-Instruct on Azure’s platform, organizations can harness the full potential of AI with safe, reliable, and secure usage. As an instruction-tuned model, Jamba-Instruct comes with the built-in safety instructions, chat capabilities, and complex-command comprehension needed to make it ready for immediate enterprise use.

Developers using Jamba-Instruct can work seamlessly with tools in Azure AI Studio, such as Azure AI Content Safety, Azure AI Search, and prompt flow, to put effective AI practices into place. Here are some key advantages that highlight the smooth integration and strong support Jamba-Instruct receives from Azure, Azure AI, and Models as a Service:

  • Enhanced Security and Compliance: Azure places a strong emphasis on data privacy and security, adopting Microsoft's comprehensive security protocols to protect customer data. With Jamba-Instruct on Azure AI Studio, enterprises can operate confidently, knowing their data remains within the secure bounds of the Azure cloud, thereby enhancing privacy and operational efficiency.  
  • Content Safety Integration: Customers can integrate Jamba-Instruct models with content safety features available through Azure AI Content Safety, enabling additional responsible AI practices. This integration facilitates the development of safer AI applications, ensuring that generated or processed content is monitored for compliance and ethical standards (a content-screening sketch follows this list). 
  • Simplified Assessment of LLM flows: Azure AI's prompt flow allows evaluation flows, which help developers measure how well LLM outputs match given standards and goals by computing metrics. This feature is useful for workflows created with Jamba-Instruct; it enables a comprehensive assessment using metrics such as groundedness, which gauges the relevance and accuracy of the model's responses to the input sources when using a retrieval-augmented generation (RAG) pattern.
  • Simplified Deployment and Inference: By deploying AI21 models through MaaS with pay-as-you-go inference APIs, developers can take advantage of the power of Jamba-Instruct without managing underlying infrastructure in their Azure environment. You can view the pricing on Azure Marketplace for Jamba-Instruct based on input and output token consumption.
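
As one possible content safety pattern, the sketch below screens a model response with the Azure AI Content Safety text API (the azure-ai-contentsafety Python package, 1.x) before surfacing it to users. The Content Safety endpoint, key, and severity threshold are placeholder assumptions, not the only way to wire this up.

```python
# Sketch: screen a Jamba-Instruct response with Azure AI Content Safety before use.
# Assumes the azure-ai-contentsafety 1.x SDK; endpoint, key, and the severity
# threshold of 2 are placeholder assumptions.
from azure.ai.contentsafety import ContentSafetyClient
from azure.ai.contentsafety.models import AnalyzeTextOptions
from azure.core.credentials import AzureKeyCredential

safety_client = ContentSafetyClient(
    endpoint="https://<your-content-safety-resource>.cognitiveservices.azure.com",
    credential=AzureKeyCredential("<your-content-safety-key>"),
)

def is_safe(text: str, max_severity: int = 2) -> bool:
    """Return True if no harm category exceeds the chosen severity threshold."""
    result = safety_client.analyze_text(AnalyzeTextOptions(text=text))
    return all((item.severity or 0) <= max_severity for item in result.categories_analysis)

model_output = "...text returned by the Jamba-Instruct deployment..."
if is_safe(model_output):
    print(model_output)
else:
    print("Response withheld by content safety policy.")
```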

These features demonstrate Azure's commitment to offering an environment where organizations can harness the full potential of AI technologies like Jamba-Instruct efficiently and responsibly, driving innovation while maintaining high standards of security and compliance.

With Azure AI’s commitment to data security, privacy, and pay-as-you-go inference APIs, developers can focus on building, knowing that their enterprise data remains secure without needing to dedicate additional time to maintaining complex infrastructure.

Getting started with Jamba-Instruct on Azure

We invite you to innovate using Jamba-Instruct on Azure AI Studio. Companies are building a wide variety of use cases, including automated term sheet generators, product description generators, and virtual health assistants. You can view various other use cases here.

To start building, go to the Azure AI Studio model catalog and select the Jamba-Instruct model. To view documentation on getting started, visit this link. Deploying Jamba-Instruct takes only a few minutes; follow these steps: 

  • Familiarize Yourself: If you're new to Azure AI Studio, start by reviewing this documentation to understand the basics and set up your first project. 
  • Access the Model Catalog: Open the model catalog in AI Studio. 
  • Find the Model: Use the filter to select the AI21 collection or click the “View models” button on the MaaS announcement card. 
  • Select the Model: Open the AI21-Jamba-Instruct model from the list. 
  • Deploy the Model: Click on ‘Deploy’ and choose the Pay-as-you-go (PAYG) deployment option. 
  • Subscribe and Access: Subscribe to the offer to gain access to the model (usage charges apply), then proceed to deploy it. 
  • Explore the Playground: After deployment, you will automatically be redirected to the Playground. Here, you can explore the model's capabilities. 
  • Customize Settings: Adjust the context or inference parameters to fine-tune the model's predictions to your needs. 
  • Access Programmatically: Click on the “View code” button to obtain the API, keys, and a code snippet. This enables you to access and integrate the model programmatically. 
  • Integrate with Tools: Use the provided API in Large Language Model (LLM) tools such as prompt flow, Semantic Kernel, LangChain, or any other tools that support REST API with key-based authentication for making inferences. 
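
For that last step, here is a minimal sketch of calling the deployed endpoint over REST with key-based authentication. The URL, route, and payload shape are assumptions based on an OpenAI-style chat completions format; take the authoritative values and schema from the “View code” pane of your deployment.

```python
# Sketch: call a Jamba-Instruct serverless deployment over REST with key auth.
# The endpoint URL, route, and payload schema below are assumptions; copy the
# exact values from the "View code" pane after deployment.
import requests

ENDPOINT = "https://<your-jamba-instruct-deployment>.<region>.models.ai.azure.com/v1/chat/completions"
API_KEY = "<your-api-key>"

payload = {
    "messages": [
        {"role": "system", "content": "You are a product-description assistant."},
        {"role": "user", "content": "Write a two-sentence description for a noise-cancelling headset."},
    ],
    "temperature": 0.7,
    "max_tokens": 200,
}

resp = requests.post(
    ENDPOINT,
    headers={"Authorization": f"Bearer {API_KEY}", "Content-Type": "application/json"},
    json=payload,
    timeout=60,
)
resp.raise_for_status()
print(resp.json()["choices"][0]["message"]["content"])
```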

FAQs

What does it cost to use the Jamba-Instruct model on Azure?

  • You are billed based on the number of prompt and completion tokens. You can review the pricing in the Marketplace offer details tab when deploying the model, and you can also find it on the Azure Marketplace.
  • Pay-as-you-go inference pricing is $0.0005 per 1K input tokens and $0.0007 per 1K output tokens (see the worked example below).
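
As a worked illustration of the per-token billing above, the snippet below estimates the cost of a single long-context request using the per-1K prices quoted in this FAQ; the token counts are hypothetical and current prices should be confirmed on the Marketplace offer.

```python
# Worked example using the per-1K token prices quoted above; confirm current
# pricing on the Azure Marketplace offer before relying on these numbers.
INPUT_PRICE_PER_1K = 0.0005   # USD per 1K prompt (input) tokens
OUTPUT_PRICE_PER_1K = 0.0007  # USD per 1K completion (output) tokens

prompt_tokens, completion_tokens = 30_000, 1_000  # hypothetical request
cost = (prompt_tokens / 1000) * INPUT_PRICE_PER_1K + (completion_tokens / 1000) * OUTPUT_PRICE_PER_1K
print(f"Estimated cost: ${cost:.4f}")  # 30 * 0.0005 + 1 * 0.0007 = $0.0157
```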

Do I need GPU capacity in my Azure subscription to use Jamba-Instruct?

  • No, you do not need GPU capacity. Jamba-Instruct is offered as an API through Models as a Service.

Is Jamba-Instruct available in Azure Machine Learning Studio?

  • Yes, Jamba-Instruct is available in the model catalog in both Azure AI Studio and Azure Machine Learning Studio.

Jamba-Instruct is listed on the Azure Marketplace. Can I purchase and use Jamba-Instruct directly from Azure Marketplace?

  • Azure Marketplace is our foundation for commercial transactions for models built on or for Azure. The Azure Marketplace enables the purchasing and billing of Jamba-Instruct, while model discoverability occurs in both the Azure Marketplace and the Azure AI model catalog. This means you can search for and find Jamba-Instruct in both places.
  • If you search for Jamba-Instruct in the Azure Marketplace, you can subscribe to the offer before being redirected to the Azure AI model catalog in Azure AI Studio, where you complete the subscription and deploy the model.
  • If you search for Jamba-Instruct in the Azure AI model catalog, you can subscribe and deploy the model from the Azure AI model catalog without starting from the Azure Marketplace. The Azure Marketplace still tracks the underlying commerce flow.

Given that Jamba-Instruct is billed through the Azure Marketplace, does it retire my Azure consumption commitment (aka MACC)?

  • Yes, Jamba-Instruct is an “Azure benefit eligible” Marketplace offer, which indicates MACC eligibility. Learn more about MACC here: https://learn.microsoft.com/en-us/marketplace/azure-consumption-commitment-benefit

Is my inference data shared with AI21? What about safety, privacy, and data processing?

  • No, Microsoft does not share the content from prompts or outputs with AI21. Learn more about data use through model catalog here: Data, privacy, and security for use of models through the Model Catalog in Azure AI Studio

Are there rate limits for the Jamba-Instruct model on Azure?

  • Yes, there are rate limits for the Jamba-Instruct model on Azure. Each deployment has a rate limit of 400,000 tokens per minute and 1,000 API requests per minute. Contact Azure customer support if you have additional questions.

Is the Jamba-Instruct model region specific?

  • Jamba-Instruct model API endpoints can be created in AI Studio projects or Azure Machine Learning workspaces in East US 2 and Sweden Central. If you want to use the model with prompt flow in projects or workspaces in other regions, you can manually add the API endpoint and key as a connection in prompt flow. Essentially, you can call the API from any Azure region once it is created in East US 2 or Sweden Central.

Can I fine-tune the Jamba-Instruct model on Azure?

  • You cannot currently fine-tune the model through Azure AI Studio.

Can I use MaaS models in any Azure subscription types?

  • Customers can use MaaS models in all Azure subscription types with a valid payment method, except for the CSP (Cloud Solution Provider) program. Free or trial Azure subscriptions are not supported.