News & Analysis

AWS Offers Testbed for GenAI Models

At least that’s what it appears to be; only closer scrutiny will tell what benefit Amazon gets out of this effort

Trust Amazon to come up with solutions that make your cloud computing easier. AWS has now set its sights on becoming the preferred platform for enterprises seeking to fine-tune their GenAI models: Custom Model Import, a feature on Bedrock, the company’s enterprise-focused suite of GenAI services, went live today in preview.

The feature allows companies to import their in-house GenAI models and access them as fully managed APIs. Once imported, these models can use the same infrastructure as the other GenAI models in the Bedrock library, which already includes Meta’s Llama 3 and Anthropic’s Claude 3, among others.

“Starting today, Amazon Bedrock adds in preview the capability to import custom weights for supported model architectures (such as Meta Llama 2, Llama 3, and Mistral) and serve the custom model using On-Demand mode. You can import models with weights in Hugging Face safetensors format from Amazon SageMaker and Amazon Simple Storage Service (Amazon S3),” says a company statement.
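To give a feel for what that looks like in practice, here is a minimal sketch using the boto3 Bedrock control-plane client. The role ARN, bucket path, and model names are placeholders, and the exact parameters may differ from your account’s setup; treat it as an illustration of the flow rather than a definitive recipe.

```python
import boto3

# Placeholder values -- swap in your own IAM role, bucket, and names.
ROLE_ARN = "arn:aws:iam::111122223333:role/BedrockModelImportRole"
WEIGHTS_S3_URI = "s3://example-model-bucket/llama3-finetuned/"  # safetensors weights

# Control-plane client used to register the imported model with Bedrock.
bedrock = boto3.client("bedrock", region_name="us-east-1")

# Start an import job that points Bedrock at the weights stored in S3.
job = bedrock.create_model_import_job(
    jobName="llama3-finetuned-import",
    importedModelName="llama3-finetuned",
    roleArn=ROLE_ARN,
    modelDataSource={"s3DataSource": {"s3Uri": WEIGHTS_S3_URI}},
)
print("Import job started:", job["jobArn"])
```

Once the job completes, the imported model gets its own ARN, which should be usable as the model identifier in the regular bedrock-runtime invoke_model call, alongside the catalog models.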

There’s more that customers can get

Customers also get tools to expand their models’ knowledge base, fine-tune them, and implement safeguards against biases. Users can now access Bedrock with custom models such as Code Llama (a code-specialized version of Llama 2 created through further training), and, of course, they can fine-tune their own models as well.

A report published by TechCrunch quotes Vasi Philomin, VP of GenAI at AWS, as saying that AWS customers were fine-tuning or building their own models outside of Bedrock using other tools. Amazon’s latest effort is to give them the facility, and the capabilities, to do so within Bedrock.

“This Custom Model Import capability allows them to bring their own proprietary models to Bedrock and see them right next to all of the other models that are already on Bedrock — and use them with all of the workflows that are also already on Bedrock, as well,” says Philomin in the conversation with the publication.

Amazon’s solution is also to its own advantage

At this point, it looks like Amazon (or AWS) is providing a solution to what is widely cited as the biggest challenge in building and refining custom models. A recent poll by cnvrg.io (owned by Intel) found that enterprises see infrastructure, especially cloud computing infrastructure, as the greatest barrier to deploying GenAI-based solutions.

For now, Custom Model Import seems to resolve this challenge, and, more importantly, it ensures that Amazon stays a step ahead of its archrivals Microsoft Azure and Google Cloud. Google’s Vertex AI, a Bedrock competitor, has already let customers upload GenAI models, customize them, and launch APIs.

The first step in Andy Jassy’s three-stage process

The latest effort by AWS appears to fulfil one of the several aspirations that Amazon CEO Andy Jassy shared in his annual letter to shareholders on April 11. Jassy clearly laid out the approach Amazon planned to take to the GenAI-led business, each layer of which he described as “gigantic” and as something the company was “deeply investing” in.

Here is what he said in the letter: “The ‘bottom layer’ of Amazon’s AI strategy is to help developers and companies train models and produce predictions. Amazon says having its own custom AI training and inference chips will bring down costs for customers. A ‘middle layer’ serves companies that want to use their own data to customize existing foundational models and gain security and other features to build and scale generative AI applications. The ‘top layer’ is where Amazon builds generative AI applications for its own consumer businesses. For example, there’s ‘Rufus,’ Amazon’s AI-powered shopping assistant, and Amazon Web Services’ ‘Amazon Q.’”

With the launch of Custom Model Import, AWS has fulfilled the first part of this ambitious plan from Jassy. According to Philomin, the feature lets Bedrock offer a wider breadth of model-customization options than its competitors. In addition, it is a value-add for the thousands of customers already using Bedrock.

There’s some smart stuff, but it requires more

In his words, the new capability comes with some specific customization options. For starters, there is Guardrails, which lets Bedrock users configure thresholds to filter their models’ output for content such as hate speech, violence, or personal data. This works alongside a Model Evaluation tool that customers can use to test how well their models perform.
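As a rough illustration of how such thresholds are expressed, the sketch below uses the boto3 create_guardrail call. The guardrail name, filter strengths, and blocked-message text are illustrative placeholders rather than a recommended policy, and the exact configuration options available may vary.

```python
import boto3

bedrock = boto3.client("bedrock", region_name="us-east-1")

# Illustrative guardrail: strong filtering of hate speech and violence,
# plus basic handling of personal data in prompts and responses.
guardrail = bedrock.create_guardrail(
    name="demo-guardrail",  # placeholder name
    blockedInputMessaging="Sorry, that request can't be processed.",
    blockedOutputsMessaging="The response was withheld by policy.",
    contentPolicyConfig={
        "filtersConfig": [
            {"type": "HATE", "inputStrength": "HIGH", "outputStrength": "HIGH"},
            {"type": "VIOLENCE", "inputStrength": "MEDIUM", "outputStrength": "HIGH"},
        ]
    },
    sensitiveInformationPolicyConfig={
        "piiEntitiesConfig": [
            {"type": "EMAIL", "action": "ANONYMIZE"},
            {"type": "PHONE", "action": "BLOCK"},
        ]
    },
)
print("Created guardrail:", guardrail["guardrailId"])
```

The resulting guardrail identifier can then be referenced when invoking a model, so that prompts and responses are screened against these thresholds.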

Amidst all this good news, there is also a catch: the feature currently supports only three model architectures, namely Flan-T5, Meta’s Llama, and Mistral. Let’s see how the experiment goes for now. Suffice it to say that we will be keeping a close watch on this front.