Amazon on Monday said it will invest up to $4 billion in artificial intelligence (AI) startup Anthropic, taking a minority ownership position in the company, as competition intensifies in the generative AI market dominated by OpenAI’s ChatGPT.
Anthropic will use Amazon Web Services (AWS) Trainium and Inferentia chips to build, train, and deploy its future foundation models, benefitting from the price, performance, scale, and security of AWS.
The two companies will also collaborate on the development of future Trainium and Inferentia technology.
“Customers are quite excited about Amazon Bedrock, AWS’s new managed service that enables companies to use various foundation models to build generative AI applications on top of, as well as AWS Trainium, AWS’s AI training chip, and our collaboration with Anthropic should help customers get even more value from these two capabilities,” said Andy Jassy, Amazon CEO.
AWS will become Anthropic’s primary cloud provider for mission-critical workloads, including safety research and future foundation model development.
Anthropic plans to run the majority of its workloads on AWS, giving it access to the advanced technology of the world’s leading cloud provider.
Amazon developers and engineers will be able to build with Anthropic models via Amazon Bedrock so they can incorporate generative AI capabilities into their work, enhance existing applications, and create net-new customer experiences across Amazon’s businesses, the companies said in a statement.
“We are excited to use AWS’s Trainium chips to develop future foundation models,” said Dario Amodei, co-founder and CEO of Anthropic.
“By significantly expanding our partnership, we can unlock new possibilities for organizations of all sizes, as they deploy Anthropic’s safe, state-of-the-art AI systems together with AWS’s leading cloud technology,” he added.
Anthropic’s state-of-the-art model, Claude 2, scores above the 90th percentile on the GRE reading and writing exams, and similarly on quantitative reasoning.
At the bottom layer, AWS continues to offer compute instances powered by NVIDIA GPUs alongside its own custom silicon: AWS Trainium for AI training and AWS Inferentia for AI inference.
At the middle layer, AWS is focused on providing customers with the broadest selection of foundation models from multiple leading providers.
“With today’s announcement, customers will have early access to features for customizing Anthropic models, using their own proprietary data to create their own private models, and will be able to utilize fine-tuning capabilities via a self-service feature within Amazon Bedrock,” said Amazon.
Bijay Pokharel