September 20, 2024

Amazon bets $4 billion on Anthropic’s Claude, the chatbot that rivals ChatGPT and Google’s Bard



Amazon said Monday it will invest up to $4 billion in Anthropic, the company that has built a powerful chatbot called Claude. Claude has emerged as one of the leading competitors to OpenAI’s ChatGPT and Google’s Bard in the race to dominate generative AI.

Anthropic has not had such significant backing until now, but has arguably needed it because of the huge expense of remaining competitive in building the large language model (LLM) technology that drives such chatbots.

The investment was part of a significant partnership announced by the two companies, whereby Anthropic agreed to use Amazon’s cloud platform for “mission-critical workloads” in return for the backing. The deal is Amazon’s first major tie to a leading chatbot, at a time when cloud competitors Microsoft and Google have already bet big on their respective chatbot platforms.

Indeed, the investment goes against statements Amazon has made in recent months about wanting to remain agnostic among LLM companies, though it’s possible that Amazon had been hankering to make a big move all along and had merely been using the agnostic position to justify its relative slowness in making such a bet.


Also, it’s true that Amazon continues to have multiple horses in the LLM race, so this investment may be more about diversifying its efforts and ensuring access to technology, talent and insight. With its announcement last week of Alexa LLM, Amazon is entering the race for commercial, closed-source models in parallel with its business of providing a platform for serving generative models (called Bedrock).

Microsoft has invested more than $10 billion in OpenAI to secure exclusive rights to offer OpenAI’s chatbot technology within its own cloud services, including having OpenAI prioritize Microsoft’s Azure cloud platform. Meanwhile, Google has pushed its own Bard chatbot, and Meta has invested in its Llama platform, which it has open-sourced so that other companies can use Llama’s foundation LLM technology. (Google, for its part, invested $300 million in Anthropic in February, and Anthropic chose Google Cloud as its preferred cloud platform at the time.)

It’s a major injection of support for Claude at a time when cash is extremely important to fund the expensive work of training competitive large language models, which use hundreds of billions of parameters and require massive amounts of computing. Anthropic had raised only about $2.7 billion to date.

Amazon and Anthropic said the new strategic collaboration will pool their technologies and expertise “in safer AI,” and will accelerate Anthropic’s development of foundation LLMs to make them widely accessible to AWS customers. One of Anthropic’s major selling points is that AI can be dangerous if not given the proper safeguards. Anthropic has invested heavily to ensure that its Claude chatbot foundation model abides by specific ground rules in producing ethical outputs, rooted in the principles of what it calls Constitutional AI. Anthropic has tried to open a perception gap here in comparisons with OpenAI’s ChatGPT, which is not as strict.

Here are the key elements of the agreement:

  • Anthropic will use AWS Trainium and Inferentia chips to build, train, and deploy its future foundation models, benefiting from the price, performance, scale, and security of AWS. The two companies will also collaborate on the development of future Trainium and Inferentia technology.
  • AWS will become Anthropic’s primary cloud provider for mission-critical workloads, including safety research and future foundation model development. Anthropic plans to run the majority of its workloads on AWS, further providing Anthropic with the advanced technology of the world’s leading cloud provider.
  • Anthropic makes a long-term commitment to provide AWS customers around the world with access to future generations of its foundation models via Amazon Bedrock, AWS’s fully managed service that provides secure access to the industry’s top foundation models. In addition, Anthropic will provide AWS customers with early access to unique features for model customization and fine-tuning capabilities.
  • Amazon will invest up to $4 billion in Anthropic and have a minority ownership position in the company.
  • Amazon developers and engineers will be able to build with Anthropic models via Amazon Bedrock so they can incorporate generative AI capabilities into their work, enhance existing applications, and create net-new customer experiences across Amazon’s businesses. (A sketch of what such a Bedrock call might look like follows this list.)
  • VentureBeat recently published a story comparing Claude’s new professional version with ChatGPT’s pro version, and one clear distinction was Claude’s superior ability to summarize content, thanks to its industry-leading 100,000-token context window.
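For developers, that Bedrock access boils down to an ordinary AWS API call. The following is a minimal, hypothetical sketch of how a Claude model might be invoked for summarization through Bedrock using boto3; the model ID, region, and request body format are assumptions based on Bedrock’s Anthropic text-completion interface and may differ from what a given account has enabled.

```python
# Hypothetical sketch: calling a Claude model through Amazon Bedrock with boto3.
# The model ID and request body shape are assumptions based on Bedrock's
# Anthropic text-completion format; check the Bedrock docs for current values.
import json

import boto3

# Bedrock runtime client (region is an assumption; Bedrock must be enabled there)
bedrock = boto3.client("bedrock-runtime", region_name="us-east-1")

# Claude's text-completion interface expects a Human/Assistant-style prompt.
document = "...long document text here..."
prompt = (
    f"\n\nHuman: Summarize the following document in three bullet points:\n"
    f"{document}\n\nAssistant:"
)

response = bedrock.invoke_model(
    modelId="anthropic.claude-v2",    # assumed model ID
    body=json.dumps({
        "prompt": prompt,
        "max_tokens_to_sample": 500,  # cap on generated tokens
        "temperature": 0.5,
    }),
)

# The response body is a JSON stream; the generated text is in "completion".
print(json.loads(response["body"].read())["completion"])
```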

In its statement, Amazon said the investment expands its overall generative AI offering across “all three layers of the generative AI stack”:

“At the bottom layer, AWS continues to offer compute instances from NVIDIA as well as AWS’s own custom silicon chips, AWS Trainium for AI training and AWS Inferentia for AI inference,” the company said. “At the middle layer, AWS is focused on providing customers with the broadest selection of foundation models from multiple leading providers where customers can then customize those models, keep their own data private and secure, and seamlessly integrate with the rest of their AWS workloads—all of this is offered through AWS’s new service, Amazon Bedrock.” Today’s announcement falls into this middle layer, the company said, by giving customers access to Anthropic’s customizable models, allowing them to use their own proprietary data to create their own private models, and offering fine-tuning capabilities via a self-service feature within Amazon Bedrock.

Finally, at the top layer, AWS offers “generative AI applications and services for customers like Amazon CodeWhisperer, a powerful AI-powered coding companion, which recommends code snippets directly in the code editor, accelerating developer productivity as they code,” the company said.

