Amazon today announced Amazon Bedrock, a service intended to help organizations build and scale generative AI applications. With this release, users gain access to foundation models (FMs) from AI startups such as AI21 Labs, Anthropic, and Stability AI.
Amazon Bedrock offers access to FMs from multiple providers, giving AWS customers the flexibility to choose the model that works best for their specific needs.
This release helps users speed up the development of generative AI applications by accessing FMs through an API, without the need to manage infrastructure. These FMs can also be privately customized using an organization’s own data.
Amazon Bedrock also allows customers to use AWS tools and features that they are already familiar with in order to deploy scalable and secure generative AI applications.
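To give a sense of what that API-based access could look like, here is a minimal sketch using the boto3 bedrock-runtime client and an Anthropic Claude model identifier. The client name, model ID, and request shape are assumptions for illustration and are not specified in the announcement itself.

```python
import json
import boto3

# Sketch: invoke a foundation model hosted on Amazon Bedrock.
# Model ID and request body shape are illustrative assumptions.
bedrock = boto3.client("bedrock-runtime", region_name="us-east-1")

response = bedrock.invoke_model(
    modelId="anthropic.claude-v2",  # illustrative model identifier
    contentType="application/json",
    accept="application/json",
    body=json.dumps({
        "prompt": "\n\nHuman: Summarize what Amazon Bedrock is.\n\nAssistant:",
        "max_tokens_to_sample": 200,
    }),
)

# The response body is a streaming payload; read and decode it as JSON.
result = json.loads(response["body"].read())
print(result.get("completion"))
```

Because the service exposes models through a managed API like this, there is no inference infrastructure for the caller to provision or scale.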
Additionally, AWS announced the general availability of Amazon EC2 Inf2 instances. These instances are powered by AWS Inferentia2 chips and are intended to lower the overall cost of running generative AI workloads.
According to the company, this release also increases energy efficiency, which helps to make generative AI technology more accessible to a wider range of customers.
Next, the new Trn1n instances run on AWS’s custom Trainium chips and provide enhanced networking capabilities, helping organizations train their generative AI models quickly and cost-effectively.
Lastly, AWS is offering individual developers free access to Amazon CodeWhisperer in order to provide them with real-time coding assistance.
Amazon CodeWhisperer uses generative AI under the hood to provide users with real-time code suggestions based on their comments and prior code.
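As a hypothetical illustration of that comment-driven workflow, a developer might write only the comment below, and the tool would then propose a completion along the lines of the function shown. The suggested body here was written by hand for illustration and is not actual CodeWhisperer output.

```python
import boto3

# Developer's prompt: a plain-language comment describing the intent.
# function to upload a file to an S3 bucket

def upload_file_to_s3(file_path: str, bucket: str, key: str) -> None:
    """Illustrative completion of the kind CodeWhisperer might suggest."""
    s3 = boto3.client("s3")
    s3.upload_file(file_path, bucket, key)
```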
To learn more, read the blog post.