Amazon.com Inc. today detailed plans to double its investment in Anthropic PBC to $8 billion.
The announcement of the most recent $4 billion infusion comes about a year after the cloud and retail giant disclosed its first $4 billion commitment to Anthropic. At the time, the OpenAI rival named Amazon Web Services as its primary cloud provider. The deal announced today will see AWS take on the additional role of Anthropic’s primary AI training provider.
Anthropic introduced its most advanced large language model, Claude 3.5 Sonnet, last month. It’s an improved version of an identically named LLM that debuted just a few months earlier. The new Claude 3.5 Sonnet is better than its namesake at several tasks, including code generation, and outperformed OpenAI’s GPT-4o across multiple benchmark tests.
Anthropic offers its LLMs through Amazon Bedrock, an AWS service that provides access to managed AI models. The companies’ expanded partnership will give Bedrock users early access to a feature that makes it possible to fine-tune Claude models using customer-provided datasets. Increasing the amount of information available to an LLM often improves the quality of its output.
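For a sense of what a customer-provided fine-tuning dataset looks like in practice, Bedrock model customization jobs generally consume a JSONL file of input/output pairs. The sketch below builds such a file in memory; the `prompt`/`completion` field names are an illustrative assumption, not a confirmed Claude fine-tuning specification.

```python
import json

# Illustrative fine-tuning records. The field names are an assumption
# for illustration, not a published Claude fine-tuning schema.
records = [
    {"prompt": "Summarize: AWS doubled its Anthropic investment.",
     "completion": "Amazon raised its total Anthropic commitment to $8 billion."},
    {"prompt": "What chips will Anthropic use for training?",
     "completion": "AWS Trainium, with Inferentia used for inference."},
]

def to_jsonl(rows):
    """Serialize records to the one-JSON-object-per-line format."""
    return "\n".join(json.dumps(r) for r in rows)

jsonl = to_jsonl(records)
print(jsonl.count("\n") + 1)  # prints 2: number of training examples
```

A file in this shape would then be uploaded to S3 and referenced by the customization job's training-data configuration.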
Alongside the go-to-market collaboration, AWS and Anthropic plan to support each other’s product development efforts. Anthropic will use the cloud giant’s AWS Trainium and Inferentia chips, which are optimized for artificial intelligence training and inference, respectively, to power its internal workloads. The OpenAI rival detailed that it will leverage the former processor line to build its “largest foundation models.”
The most recent Trainium chip, Trainium2, debuted last November. The processor boasts double the performance of its predecessor and twice the power efficiency. Customers can provision as many as 16 Trainium2 chips per instance, as well as combine instances into AI clusters with up to 100,000 chips and 65 exaflops of computing power.
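Taking the article’s figures at face value, the cluster-level numbers imply a per-chip and per-instance compute budget that can be checked with a few lines of arithmetic (a back-of-the-envelope sketch derived only from the figures above, not an AWS spec sheet):

```python
# Back-of-the-envelope check of the cluster figures cited above.
cluster_exaflops = 65          # total cluster compute, exaflops
cluster_chips = 100_000        # maximum chips per cluster
chips_per_instance = 16        # Trainium2 chips per instance

per_chip_teraflops = cluster_exaflops * 1e6 / cluster_chips   # 1 exaflop = 1e6 teraflops
per_instance_petaflops = per_chip_teraflops * chips_per_instance / 1e3

print(per_chip_teraflops)      # 650.0 teraflops per chip
print(per_instance_petaflops)  # 10.4 petaflops per 16-chip instance
```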
In parallel, Anthropic engineers will support AWS’ efforts to develop new Trainium processors. The LLM developer will contribute to the software stack that powers the chip lineup. Neuron, as the software stack is called, includes a compiler that optimizes customers’ AI models to run on Trainium instances, along with several other tools.
Anthropic is working on “low-level kernels that allow us to directly interface with the Trainium silicon,” the company detailed in a blog post today. In some AI processors, a kernel is a code snippet that distributes computations across its host chip’s cores as a way of boosting performance. Kernels are among the building blocks of advanced AI models.
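To make the kernel idea concrete, here is a toy illustration of the general pattern just described: partition one computation into per-core tiles, run the tiles in parallel, then combine the partial results. Threads stand in for hardware cores here; this is not Trainium code, and Anthropic’s actual kernels target the chip’s own low-level interface.

```python
from concurrent.futures import ThreadPoolExecutor

# Toy analogue of what an accelerator kernel does: split the work
# across "cores" (threads here), compute partial results in parallel,
# then reduce them into a single answer.
NUM_CORES = 4

def tile_sum(values):
    """The per-core 'kernel body': reduce one tile of the input."""
    return sum(values)

def parallel_sum(values, num_cores=NUM_CORES):
    """Split `values` into num_cores tiles and reduce them in parallel."""
    tile = -(-len(values) // num_cores)  # ceiling division: tile size
    tiles = [values[i:i + tile] for i in range(0, len(values), tile)]
    with ThreadPoolExecutor(max_workers=num_cores) as pool:
        partials = list(pool.map(tile_sum, tiles))
    return sum(partials)

print(parallel_sum(list(range(1_000))))  # prints 499500
```

The performance win on real hardware comes from how the tiles map onto the chip’s memory hierarchy and compute units, which is exactly the kind of detail a hand-written low-level kernel controls.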
“By continuing to deploy Anthropic models in Amazon Bedrock and collaborating with Anthropic on the development of our custom Trainium chips, we’ll keep pushing the boundaries of what customers can achieve with generative AI technologies,” said AWS Chief Executive Officer Matt Garman.
Anthropic’s latest funding round comes two months after OpenAI raised $6.6 billion in the largest startup investment on record. OpenAI also secured a $4 billion line of credit from a group of banks. The company, which is now worth $157 billion, will invest the funds in AI research and compute infrastructure.