Amazon brings its Bedrock generative AI service into general availability

Amazon today announced the general availability of Bedrock, its service that offers a selection of generative AI models from Amazon itself and third-party partners through an API.

Bedrock, which was unveiled in early April, allows AWS customers to build apps on top of generative AI models and customize them with their proprietary data. Leveraging these models, brands and developers can also create AI “agents” that automatically execute tasks like booking travel, managing inventory and processing insurance claims.
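For a sense of what building on Bedrock looks like in practice, here's a minimal sketch that calls a hosted model through boto3's Bedrock runtime client. The model ID and request payload follow Anthropic's Claude conventions and are assumptions for illustration; the exact body format varies by provider, so treat this as a sketch rather than a drop-in snippet.

```python
# Minimal sketch: invoking a hosted model through Bedrock's runtime API with boto3.
# The model ID and request body shape are assumptions (Anthropic-style payload);
# other providers on Bedrock expect different fields.
import json

import boto3

bedrock = boto3.client("bedrock-runtime", region_name="us-east-1")

response = bedrock.invoke_model(
    modelId="anthropic.claude-v2",  # example choice; any model ID from the Bedrock catalog works here
    contentType="application/json",
    accept="application/json",
    body=json.dumps({
        "prompt": "\n\nHuman: Summarize our Q3 inventory report in three bullets.\n\nAssistant:",
        "max_tokens_to_sample": 300,
    }),
)

# The response body is a stream; read and decode it to get the model's output.
print(json.loads(response["body"].read()))
```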

In the coming weeks, Llama 2, Meta's open source large language model, will come to Bedrock, Amazon says, joining models from AI21 Labs, Anthropic, Cohere and Stability AI.

Amazon claims Bedrock will be the first “fully managed generative AI service” to offer Llama 2, specifically the 13-billion- and 70-billion-parameter flavors. (Parameters are the parts of a model learned from historical training data and essentially define the skill of the model on a problem, such as generating text.) However, it’s worth noting that Llama 2 has been available on other cloud-hosted generative AI platforms for some time, including Google’s Vertex AI.
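Once Llama 2 lands, switching between the 13-billion- and 70-billion-parameter flavors should, in principle, come down to the model ID passed to the same invoke_model call. The identifiers and generation parameters below are assumptions sketched from Bedrock's naming conventions, not confirmed values.

```python
# Sketch: choosing a Llama 2 flavor on Bedrock by model ID.
# Both IDs and the Meta-style generation fields are assumptions; check the
# Bedrock model catalog for the real identifiers once Llama 2 is available.
import json

import boto3

bedrock = boto3.client("bedrock-runtime", region_name="us-east-1")

LLAMA2_13B = "meta.llama2-13b-chat-v1"  # assumed ID for the 13B model
LLAMA2_70B = "meta.llama2-70b-chat-v1"  # assumed ID for the 70B model

response = bedrock.invoke_model(
    modelId=LLAMA2_70B,
    body=json.dumps({
        "prompt": "Explain what a model parameter is in one sentence.",
        "max_gen_len": 256,   # assumed field names for Meta models
        "temperature": 0.5,
    }),
)
print(json.loads(response["body"].read()))
```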

Speaking of Vertex AI, Bedrock is in many ways comparable to Google's platform, which offers its own library of fine-tunable first- and third-party models on which customers can build generative AI apps. But Swami Sivasubramanian, VP of data and AI at AWS, argues that Bedrock has an advantage in that it plays nicely with existing AWS services, like AWS PrivateLink for establishing a secure connection between Bedrock and a company's virtual private cloud.
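For the curious, that PrivateLink integration boils down to standing up an interface VPC endpoint so Bedrock traffic never leaves the company's network. Here's a rough sketch using boto3; the resource IDs are placeholders, and the endpoint service name is an assumption to confirm against AWS's documentation.

```python
# Rough sketch of the PrivateLink setup: an interface VPC endpoint for Bedrock's runtime.
# VPC, subnet and security group IDs are placeholders; the service name is an assumption.
import boto3

ec2 = boto3.client("ec2", region_name="us-east-1")

ec2.create_vpc_endpoint(
    VpcEndpointType="Interface",
    VpcId="vpc-0123456789abcdef0",                           # placeholder VPC
    ServiceName="com.amazonaws.us-east-1.bedrock-runtime",   # assumed endpoint service name
    SubnetIds=["subnet-0123456789abcdef0"],                  # placeholder subnet
    SecurityGroupIds=["sg-0123456789abcdef0"],               # placeholder security group
    PrivateDnsEnabled=True,
)
```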

To be fair to Google, I’d argue that’s more of a perceived advantage than an objective one, seeing as it’s dependent on the customer in question and the cloud infrastructure they’re using. Of course, you won’t hear Sivasubramanian acknowledge that.

“Over the last year, the proliferation of information, access to scalable compute, as well as advancements in machine learning have led to a surge of interest in generative AI, sparking new ideas that could transform entire industries and reimagine how work gets done,” Sivasubramanian said in a press release. “Today’s announcement is a major milestone that puts generative AI at the fingertips of every business, from startups to enterprises, and every employee, from programmers to data analysts.”

In related news this morning, Amazon announced the rollout of its Titan Embeddings model, a first-party model that converts text to numerical representations called embeddings to power search and personalization applications. The Titan Embeddings model supports around 25 languages and chunks of text — or whole documents — up to 8,192 tokens (equivalent to ~6,000 words) in length, on par with the latest embeddings model from OpenAI.
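Generating an embedding with Titan goes through the same runtime API. The sketch below assumes the model ID and response field names based on Amazon's naming conventions; the sample text is made up.

```python
# Minimal sketch: producing an embedding with the Titan Embeddings model via Bedrock.
# Model ID ("amazon.titan-embed-text-v1") and the "embedding" response key are assumptions.
import json

import boto3

bedrock = boto3.client("bedrock-runtime", region_name="us-east-1")

response = bedrock.invoke_model(
    modelId="amazon.titan-embed-text-v1",
    contentType="application/json",
    accept="application/json",
    body=json.dumps({"inputText": "Running shoes for trail marathons"}),
)

payload = json.loads(response["body"].read())
embedding = payload["embedding"]   # assumed key: a list of floats
print(len(embedding))              # dimensionality of the returned vector
```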

Bedrock had a rocky start. Bloomberg reported in May that, six weeks after Amazon demoed the tech with an unusually vague presser and just one testimonial, most cloud customers still didn't have access. With today's announcements (and its recent, multi-billion-dollar investment in AI startup Anthropic), Amazon's clearly looking to make waves in the growing and lucrative market for generative AI.