News & Analysis

Bedrock: Amazon’s Answer to AI 

If Microsoft is in and Google has followed suit, can Amazon be far behind? The answer is no, though the company is taking a different approach with its AI offering

The artificial intelligence (AI) space is suddenly getting crowded. Microsoft powered its new search engine with ChatGPT while giving its productivity suite an edge as well. Google followed suit almost immediately. Now Amazon is throwing its hat into the ring, but doing it differently: recruiting third parties to create AI models and host them on AWS.

To this end, AWS unveiled Amazon Bedrock, which allows developers to build generative AI-powered apps via pre-trained models from startups including AI21 Labs, Anthropic, and Stability AI. For now, it is available in limited preview. The move isn't surprising, given that AWS has been signing up generative AI startups in recent times.

At best, we can say that AWS has formalized existing relationships: since last year, several generative AI companies, Stability AI and Hugging Face among them, have chosen it as their preferred cloud provider. More recently, AWS launched a generative AI accelerator for startups and partnered with Nvidia to build next-generation infrastructure for training AI models.

Will it be the bedrock for Amazon’s AI market?

Market research indicates that Bedrock could well be the base Amazon was looking for in the generative AI market. Grand View Research estimates that the market could be worth close to $110 billion within the next seven years, and AWS customers can now tap into AI models from different providers via APIs.

Of course, Amazon has only given us a flavor of things to come: there was no formal pricing, though the company did suggest that Bedrock targets large customers building enterprise-scale AI apps. So, is Amazon taking a different approach for a different audience than Azure and Google Cloud are? It appears so, for now.

Information remains scarce on the commercial model AWS is putting in place, or has already put in place. Are the AI model vendors incentivized simply by the reach that AWS offers, or is there a revenue-sharing arrangement for getting on Bedrock? We know little about licensing terms or hosting agreements as yet.

What exactly is Bedrock?

In a blog post, Amazon describes the service in terms of foundation models (FMs): “Bedrock is the easiest way for customers to build and scale generative AI-based applications using FMs, democratizing access for all builders. Bedrock will offer the ability to access a range of powerful FMs for text and images—including Amazon’s Titan FMs, which consist of two new LLMs we’re also announcing today—through a scalable, reliable, and secure AWS managed service.

“With Bedrock’s serverless experience, customers can easily find the right model for what they’re trying to get done, get started quickly, privately customize FMs with their own data, and easily integrate and deploy them into their applications using the AWS tools and capabilities they are familiar with (including integrations with Amazon SageMaker ML features like Experiments to test different models and Pipelines to manage their FMs at scale) without having to manage any infrastructure.”

Amazon has been sketchy on the details

However, given that neither the blog post nor the press announcements answer several legal questions, we will have to wait and see what customers make of the offering and how many of them actually take the bait.

Microsoft, for its part, has seen some success with its generative AI suite, the Azure OpenAI Service, which bundles OpenAI models with additional features targeted at enterprise customers. At last count, the company claimed 1,000 customers were using the service.

What we also do not know is whether Bedrock would in any way expose itself to the kind of lawsuits already thrown at generative AI, in which plaintiffs claim that copyrighted art was used without the requisite permissions to train the models, and question whether models that do not attribute or credit the original creators can be commercialized.

This isn’t the first move in Amazon’s AI push

From Amazon’s point of view, the bespoke offerings come from the Titan FM family, which has two models now with more to come. There is a text-generating model and an embedding model. The former performs tasks similar to ChatGPT, while the latter translates text inputs such as words and phrases into numerical representations that capture their semantic meaning.
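To make the distinction concrete, here is a minimal sketch of what calling a Titan-style embedding model might look like through the AWS SDK for Python. The model ID, request shape, and response fields are assumptions for illustration; Amazon has published few technical details at this stage.

```python
# Illustrative sketch only: the model ID and the request/response shape
# are assumptions, not documented Bedrock behavior at launch.
import json
import boto3

# Bedrock exposes hosted models through a runtime client in the AWS SDK.
bedrock = boto3.client("bedrock-runtime", region_name="us-east-1")

response = bedrock.invoke_model(
    modelId="amazon.titan-embed-text-v1",  # assumed Titan embedding model ID
    contentType="application/json",
    accept="application/json",
    body=json.dumps({"inputText": "Bedrock lets developers build generative AI apps."}),
)

# An embedding model returns a fixed-length vector of floats that captures
# the semantic meaning of the input text.
result = json.loads(response["body"].read())
print(len(result["embedding"]))
```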

So, customers who land on Bedrock via AWS can customize any model by simply pointing the service at a few labeled examples stored in Amazon S3. One thing Amazon has reiterated is that customer data will not be used to train the underlying models.
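As a rough illustration of that “point the service at labeled examples in S3” idea, the sketch below shows what such a customization request could look like via the AWS SDK. The API call, parameter names, role, bucket paths, and model identifiers are assumptions made for illustration, since Amazon has not published the actual customization interface.

```python
# Hypothetical sketch: the call, parameter names, role ARN, bucket paths and
# model identifiers below are assumptions used for illustration only.
import boto3

# Management-plane client (as opposed to the runtime client used for inference).
bedrock = boto3.client("bedrock", region_name="us-east-1")

bedrock.create_model_customization_job(
    jobName="titan-support-tuning",                        # hypothetical job name
    customModelName="titan-text-support",                  # hypothetical output model
    roleArn="arn:aws:iam::123456789012:role/BedrockTuningRole",
    baseModelIdentifier="amazon.titan-text-express-v1",    # assumed Titan base model ID
    # A handful of labeled examples sitting in S3 is all the customer points at.
    trainingDataConfig={"s3Uri": "s3://example-bucket/labeled-examples.jsonl"},
    outputDataConfig={"s3Uri": "s3://example-bucket/custom-model-output/"},
)
```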

At a time when generative AI is itself under a cloud (no pun intended), Amazon says it is committed to the responsible use of these technologies and will continue to monitor the regulatory landscape. The company is also confident that its sizable legal team will help it identify what data can be used and what cannot.

On a side note, Amazon also made CodeWhisperer, its AI-powered code-generating service, free for developers without any usage restrictions. Does that mean the service didn’t get the response Amazon hoped for compared with GitHub’s Copilot? The service was launched last June as part of the AWS IDE Toolkit and was trained on billions of lines of open source code and Amazon’s own codebase, as well as documentation and code from public forums.
