
A chip company you probably never heard of is suddenly worth $1 trillion. Here's why, and what it means for Nvidia.

18 December 2024 at 01:00
Broadcom CEO Hock Tan

Ying Tang/NurPhoto via Getty Images

  • Broadcom's stock surged in recent weeks, pushing the company's market value over $1 trillion.
  • Broadcom is crucial for companies seeking alternatives to Nvidia's AI chip dominance.
  • Custom AI chips are gaining traction, enhancing tech firms' bargaining power, analysts say.

The rise of AI, and the computing power it requires, is bringing all kinds of previously under-the-radar companies into the limelight. This week it's Broadcom.

Broadcom's stock has soared since late last week, catapulting the company into the $1 trillion market cap club. The boost came from a blockbuster earnings report in which custom AI chip revenue grew 220% compared to last year.

In addition to selling lots of parts and components for data centers, Broadcom designs and sells ASICs, or application-specific integrated circuits, the industry's term for custom chips.

Designers of custom AI chips, chief among them Broadcom and Marvell, are headed into a growth phase, according to Morgan Stanley.

Custom chips are picking up speed

The biggest players in AI buy a lot of chips from Nvidia, the $3 trillion giant with an estimated 90% share of the advanced AI chip market.

Heavily relying on one supplier isn't a comfortable position for any company, though, and many large Nvidia customers are also developing their own chips. Most tech companies don't have large teams of silicon and hardware experts in-house. Of the companies they might turn to for a custom chip design, Broadcom is the leader.

Though multipurpose chips like Nvidia's and AMD's graphics processing units are likely to maintain the largest share of the AI chip market in the long term, custom chips are growing fast.

Morgan Stanley analysts this week forecast the market for ASICs to nearly double to $22 billion next year.

Much of that growth is attributable to Amazon Web Services' Trainium AI chip, according to Morgan Stanley analysts. Then there are Google's in-house AI chips, known as TPUs, which Broadcom helps make.

In terms of actual value of chips in use, Amazon and Google dominate. But OpenAI, Apple, and TikTok parent company ByteDance are all reportedly developing chips with Broadcom, too.

ASICs bring bargaining power

Custom chips can offer more value, in terms of the performance you get for the cost, according to Morgan Stanley's research.

ASICs can also be designed to perfectly match a tech company's unique internal workloads, according to the bank's analysts. The better these custom chips get, the more bargaining power they may provide when tech companies negotiate with Nvidia over buying GPUs. But this will take time, the analysts wrote.

In addition to Broadcom, Silicon Valley neighbor Marvell is making gains in the ASICs market, along with Asia-based players Alchip Technologies and Mediatek, they added in a note to investors.

Analysts don't expect custom chips to ever fully replace Nvidia GPUs, but without them, cloud service providers like AWS, Microsoft, and Google would have much less bargaining power against Nvidia.

"Over the long term, if they execute well, cloud service providers may enjoy greater bargaining power in AI semi procurement with their own custom silicon," the Morgan Stanley analysts explained.

Nvidia's big R&D budget

This may not be all bad news for Nvidia. A $22 billion ASICs market is smaller than Nvidia's revenue for just one quarter.

Nvidia's R&D budget is massive, and many analysts are confident in its ability to stay at the bleeding edge of AI computing.

And as Nvidia rolls out new, more advanced GPUs, its older offerings get cheaper and potentially more competitive with ASICs.

"We believe the cadence of ASICs needs to accelerate to stay competitive to GPUs," the Morgan Stanley analysts wrote.

Still, Broadcom and chip manufacturers on the supply chain rung beneath, such as TSMC, are likely to get a boost every time a giant cloud company orders up another custom AI chip.

Read the original article on Business Insider

Amazon isn't seeing enough demand for AMD's AI chips to offer them via its cloud

6 December 2024 at 13:30
AWS logo at re:Invent 2024

Noah Berger/Getty Images for Amazon Web Services

  • AWS has not committed to offering cloud access to AMD's AI chips in part due to low customer demand.
  • AWS said it was considering offering AMD's new AI chips last year.
  • AMD recently increased the sales forecast for its AI chips.

Last year, Amazon Web Services said it was considering offering cloud access to AMD's latest AI chips.

18 months in, the cloud giant still hasn't made any public commitment to AMD's MI300 series.

One reason: low demand.

AWS is not seeing the type of huge customer demand that would lead to selling AMD's AI chips via its cloud service, according to Gadi Hutt, senior director for customer and product engineering at Amazon's chip unit, Annapurna Labs.

"We follow customer demand. If customers have strong indications that those are needed, then there's no reason not to deploy," Hutt told Business Insider at AWS's re:Invent conference this week.

AWS is "not yet" seeing that high demand for AMD's AI chips, he added.

AMD shares dropped roughly 2% after this story first ran.

AMD's line of AI chips has grown since its launch last year. The company recently increased its GPU sales forecast, citing robust demand. However, the chip company is still a long way behind market leader Nvidia.

AWS provides cloud access to other AI chips, such as Nvidia's GPUs. At re:Invent, AWS announced the launch of P6 servers, which come with Nvidia's latest Blackwell GPUs.

AWS and AMD are still close partners, according to Hutt. AWS offers cloud access to AMD's CPU server chips, and AMD's AI chip product line is "always under consideration," he added.

Hutt discussed other topics during the interview, including AWS's relationship with Nvidia, Anthropic, and Intel.

An AMD spokesperson declined to comment.

Do you work at Amazon? Got a tip?

Contact the reporter, Eugene Kim, via the encrypted-messaging apps Signal or Telegram (+1-650-942-3061) or email ([email protected]). Reach out using a nonwork device. Check out Business Insider's source guide for other tips on sharing information securely.

Editor's note: This story was first published on December 6, 2024, and was updated later that day to reflect developments in AMD's stock price.


Amazon cloud executives share their latest AI strategies, and why choice matters more than owning the top model

6 December 2024 at 02:00
AWS CEO Matt Garman

Noah Berger

  • Amazon is emphasizing customer choice over market dominance with its AI strategy.
  • Amazon unveiled a new series of AI models called Nova this week.
  • Amazon's Bedrock tool supports diverse models from multiple providers, unlike OpenAI.

Amazon believes AI models are not in a winner-take-all market.

The company drilled down on this message during this week's re:Invent, the annual extravaganza for its Amazon Web Services cloud unit. Even after unveiling a new series of homegrown AI models called Nova, which, by some measures, are as powerful as other market leaders, Amazon stressed the goal is to provide more choice to customers.

AI models have become the new battleground for tech supremacy since OpenAI released its popular ChatGPT service in late 2022. Companies have rushed to up the ante, trying to outperform each other in model performance.

Amazon has largely been absent from this race. Instead, it has tried to stay neutral, arguing that the generative AI market is so big and varied that customers will want more model choices that fit their different needs. Amazon still believes this is the right approach.

"There are some that would want you to believe there's just this one magic model that could do everything — we never believed in it," Vasi Philomin, AWS's VP of generative AI, told Business Insider. "There'll be many, many winners and there are really wonderful companies out there building some amazing models."

Different positioning

As part of this, Amazon has used Bedrock, an AI development tool that gives access to many models, as its main horse in the AI race. This approach differed from that of OpenAI and Meta, which mostly focused on building powerful models or chatbots. Google has a leading AI model in Gemini but also provides access to other models through its Vertex cloud service, and Microsoft has a similar offering.

This week, Amazon further leaned into its strategy, announcing an array of new updates for Bedrock, including a marketplace for more than 100 specialized models and a distillation feature that fine-tunes smaller, more cost-effective models. It also unveiled new reasoning and "multi-agent" collaboration features that help build better models.

Swami Sivasubramanian, AWS's VP of AI and data, told BI that AWS "pioneered" the model-choice approach and intends to continue to promote it as a "core construct" of the business.

"GenAI is a lot bigger than a single chatbot or a single model to reach its full potential," Sivasubramanian said.

More companies appear to be taking the multi-model approach. According to a recent report by Menlo Ventures, companies typically use three or more foundation models in their AI services, "routing to different models depending on the use case or results."

As a result, Anthropic, which Menlo Ventures has backed, doubled its share in the AI model market to 24% this year, while OpenAI's share dropped from 50% to 34% year-over-year, according to the report.

AWS VP of AI and Data Swami Sivasubramanian

Noah Berger

'Choice matters'

Amazon may have no choice but to stick to this narrative. When OpenAI captivated the world with ChatGPT a couple of years ago, Amazon was caught flat-footed, leading to an internal scramble to find answers, BI previously reported. Its first in-house model, called Titan, drew little attention.

Having its own advanced, powerful AI models could help Amazon. It might attract the largest AI developers and promote AWS as the leader in the AI space. It would potentially also encourage those developers to continue building within AWS's broader cloud ecosystem.

Amazon isn't giving up on building its own advanced models. Last year, it created a new artificial general intelligence team under the mandate to build the "most ambitious" large language models. On Tuesday, Amazon unveiled the early results of that effort with its Nova series, which includes a multimodal model capable of handling text, image, and video queries.

Still, Amazon's CEO Andy Jassy downplayed any notion of Nova going after competitors. He said he's been surprised by the diversity of models developers use and that Nova is just one of the many options they will have.

"There is never going to be one tool to rule the world," Jassy said during a keynote presentation this week.

It's hard to know how successful this approach is as Amazon doesn't break out its AI revenue. But Jassy was even more bullish on the AI opportunity during October's call with analysts. He said AWS was now on pace to generate "multi-billion dollars" in AI-related revenue this year, growing at "triple-digit percentages year over year." Amazon's AI business is "growing three times faster at its stage of evolution than AWS did itself," he added.

Rahul Parthak, AWS's VP of data and AI go-to-market, told BI that Nova's launch was partly driven by customer demand. Customers have been asking for Amazon's own model because some prefer to deal with one vendor that can handle every aspect of the development process, he said.

Amazon still wants other models to thrive because its goal isn't about beating competitors but offering customers "the best options," Parthak added. He said more companies, like Microsoft and Google, are following suit, offering more model choices via their own cloud services.

"We've been pretty thoughtful and clear about what we think customers need, and I think that's playing out," Parthak said.



Amazon plans to ramp up cloud work with Accenture and other consulting firms, according to internal document

3 December 2024 at 02:00
AWS CEO Matt Garman

FREDERIC J. BROWN/AFP via Getty Images

  • AWS recently laid out growth plans for 2025 in internal documents.
  • One of the initiatives is focused on working more with consulting firms.
  • Accenture was among several consulting firms mentioned by AWS.

Amazon Web Services wants to work more with consulting firms, including Accenture, part of a broader plan to spur growth in 2025, according to an internal planning document obtained by Business Insider.

AWS is looking to expand work with external partners that can sell its cloud services to hundreds of their existing customers. AWS sees an untapped market worth $250 billion and thousands of contracts up for renewal, the document explained.

Beyond Accenture, AWS mentioned Tata Consultancy, DXC Technology, and Atos as partners in the planning document.

AWS will prioritize these partners' existing customers and proactively reach out to them before contract-renewal time, and help the partners become "cloud-first," the document explained.

AWS pioneered cloud computing and still leads this huge and growing market. Over the years, the company has done a lot of work with customers through in-house cloud advisers. So the plan to expand its relationships with outside consulting firms is notable.

Ruba Borno is the VP leading the initiative, which will "review and prioritize partner's incumbent customers based on workloads and relationship," the document also stated.

Borno is a Cisco veteran who joined AWS a few years ago to run its global channels and alliances operation, which works with more than 100,000 partners, including consulting firms, systems integrators, and software vendors.

These plans are part of new AWS growth initiatives that include a focus on healthcare, business applications, generative AI, and the Middle East region, BI reported last week.

These are part of the AWS sales team's priorities for next year and Amazon refers to them internally as "AGIs," short for "AWS growth initiatives," one of the internal documents shows.

A spokesman for Tata Consultancy declined to comment. Spokespeople at Accenture did not respond to a request for comment.


How Amazon revamped its AI-sales machine to compete with OpenAI, Microsoft, and Google

29 November 2024 at 07:08
Amazon Web Services CEO Matt Garman surrounded by AWS Logo, and graph with upward trending line

Amazon; Getty Images; Alyssa Powell/BI

  • AWS faces competition from OpenAI, Microsoft, and Google in AI, risking its cloud dominance.
  • It offers new financial incentives for AI sales. That comes with higher pressure and new demands.
  • AWS CEO Matt Garman wants his teams to move even faster.

This summer, Amazon Web Services rolled out a new internal campaign.

The initiative, called "Find One, Launch One, Ramp One," introduced goals, prizes, and other incentives for Amazon's huge cloud-support teams across North America.

The ultimate aim was to sell more of the company's new AI offerings. Sales architects, customer-success managers, and people in other roles were recruited into the broad push.

"This is a great time to partner with our sales teams for this #OneTeam effort," AWS said in an internal memo obtained by Business Insider.

These AWS staffers were asked to find at least one sales opportunity each month for Q, Amazon's artificial-intelligence assistant, and Bedrock, the company's AI platform.

Then, the initiative asked employees to launch one Bedrock or Q customer workload.

The final requirement, the "Ramp One" part, pushed teams to generate real revenue from these workloads.

AWS created a leaderboard for everyone to see the top performers. With December 1 as the deadline, the company dangled prizes, including an evening of pizza and wine at an executive's home (with guitar playing as a possibility).

A race for AI supremacy

This initiative is just one example of AWS trying to squeeze more out of its massive sales and support teams to be more competitive in AI. There's more pressure and urgency to sell AI products, along with new incentives, according to several internal documents and more than a dozen current and former AWS employees.

Messaging from AWS CEO Matt Garman, previously the cloud unit's top sales executive, is to move even faster, these people said. They asked not to be identified because they're not authorized to speak with the press. Their identities are known to BI.

Much is at stake for Amazon. OpenAI, Microsoft, and Google, alongside a slew of smaller startups, are all vying for AI supremacy. Though Amazon is a cloud pioneer and has worked on AI for years, it is now at risk of ceding a chance to become the main platform where developers build AI products and tools.

More pressure

The revamped sales push is part of the company's response to these challenges. As the leading cloud provider, AWS has thousands of valuable customer relationships that it can leverage to get its new AI offerings out into the world.

Many AWS sales teams have new performance targets tied to AI products.

One team has to hit a specific number of customer engagements that focus on AWS's generative-AI products, for instance.

There are also new sales targets for revenue driven by gen-AI products, along with AI-customer win rates and a goal based on the number of gen-AI demos run, according to one of the internal Amazon documents.

Another AWS team tracks the number of AI-related certifications achieved by employees and how many other contributions staff have made to AI projects, one of the people said.

Hitting these goals is important for Amazon employees because that can result in higher performance ratings, a key factor in getting a raise or promotion.

More employees encouraged to sell AI

Even people in roles that traditionally don't involve actively selling products are feeling pressure to find opportunities for AI sales, according to Amazon employees who spoke with BI and internal documents.

AWS software consultants, who mostly work on implementing cloud services, are now encouraged to find sales opportunities, which blurs the line between consultants and traditional salespeople.

The Find One, Launch One, Ramp One initiative includes AWS sales architects. These staffers traditionally work with salespeople to craft the right cloud service for each customer. Now they're incentivized to get more involved in actual selling and are measured by the results of these efforts.

"Customers are interested in learning how to use GenAI capabilities to innovate, scale, and transform their businesses, and we are responding to this need by ensuring our teams are equipped to speak with customers about how to succeed with our entire set of GenAI solutions," an AWS spokesperson told BI.

"There is nothing new or abnormal about setting sales goals," the spokesperson added in a statement. They also said that AWS sales architects were not "sellers" and that their job was to "help customers design solutions to meet their business goals."

There are "no blurred lines," the spokesperson said, and roles and expectations are clear.

Selling versus reducing customer costs

One particular concern among some AWS salespeople revolves around the company's history of saving cloud customers money.

Some staffers told BI that they now feel the company is force-feeding customers AI products to buy, even if they don't need them. The people said this represented a shift in AWS's sales culture, which over the years has mostly looked for opportunities to reduce customers' IT costs.

In some cases, salespeople have also been asked to boost attendance at external AWS events. Several recent AWS-hosted AI events saw low attendance, and salespeople were told to find ways to increase registrations by reaching out to customers, some of the people said.

AWS's spokesperson said customer attendance had "exceeded our expectations for a lot of our AI events" and that the number of participants at the re:Invent annual conference "more than doubled."

The spokesperson also said the notion that Amazon had moved away from its goal of saving customers money was false. The company always starts with "the outcomes our customers are trying to achieve and works backwards from there."

A hammer and a nail

Garman, Amazon's cloud boss, hinted at some of these issues during an internal fireside chat in June, according to a recording obtained by BI. He said there were sales opportunities for AWS in "every single conversation" with a customer but that AWS must ensure those customers get real value out of their spending.

"Too often we go talk to customers to tell them what we've built, which is not the same thing as talking to customers," Garman said. "Just because you have a hammer doesn't mean the problem the customer has is the nail."

AWS's spokesperson said the company is "customer-obsessed" and always tries to consider decisions "from our customers' perspectives, like their ROI." The spokesperson added that some of AWS's competitors don't take that approach and that it's a "notable contrast," pointing to this BI story about a Microsoft customer complaining about AI features.

More pressure but also more rewards

Amazon is also doling out bonuses and other chances for higher pay for AI-sales success.

AWS recently announced that salespeople would receive a $1,000 performance bonus for the first 25 Amazon Q licenses they sell and retain for three consecutive months with a given customer, according to an internal memo seen by BI. The maximum payout is $20,000 per customer.

For Bedrock, Amazon pays salespeople a bonus of $5,000 for small customers and $10,000 for bigger customers when they "achieve 3 consecutive months of specified Bedrock incremental usage in 2024," the memo said.

Some AWS teams are discussing higher pay for AI specialists. Sales architects, for example, in AI-related roles across fields including security and networking could get a higher salary than generalists, one of the people said.

AWS's spokesperson told BI that every major tech company provides similar sales incentives. Amazon continually evaluates compensation against the market, the spokesperson added.

Fear of losing to Microsoft

Satya Nadella standing between the OpenAI and Microsoft logos.
Satya Nadella, Microsoft's CEO.

Justin Sullivan/Getty Images

Inside AWS, there's a general concern that Amazon was caught off guard by the sudden emergence of generative AI and is playing catch-up to its rivals, most notably Microsoft, the people who spoke with BI said.

Some Amazon employees are worried that Q is losing some customers to Microsoft's Copilot because of a lack of certain AI features, BI previously reported.

Microsoft has an advantage because of its wide variety of popular business applications, including its 365 productivity suite. That may make it easier for Microsoft to show customers how AI can improve their productivity, some of the Amazon employees told BI. AWS, meanwhile, has struggled to build a strong application business, despite years of trying.

AWS's spokesperson challenged that idea by noting that AWS has several successful software applications for companies, including Amazon Connect, Bedrock, and SageMaker. The spokesperson also said Amazon Q launched in April and was already seeing robust growth.

It's "no secret that generative AI is an extremely competitive space," the spokesperson added, saying: "However, AWS is the leader in cloud and customer adoption of our AI innovation is fueling much of our continued growth. AWS has more generative AI services than any other cloud provider, which is why our AI services alone have a multibillion-dollar run rate."

More speed

A major AWS reorganization earlier this year hasn't helped the AI-sales effort, some of the people who spoke with BI said.

The big change switched AWS to more of an industry focus rather than a regional one. That caused confusion inside the company, and some large customers lost their point of contact, the people said. AWS is still figuring out how to run as a more cohesive group, which has resulted in a slower sales cycle, they added.

AWS's spokesperson said it's inaccurate to say its sales process has slowed, adding that year-over-year revenue growth accelerated again in the most recent quarter and that the business was on pace to surpass $100 billion in sales this year.

In his June fireside chat, Garman stressed the importance of speed and told employees to "go faster."

"Speed really matters," Garman said. "And it doesn't necessarily mean work more hours. It means: How do we make decisions faster?"



Amazon makes massive down payment on dethroning Nvidia

22 November 2024 at 11:09
Anthropic CEO Dario Amodei at the 2023 TechCrunch Disrupt conference
Dario Amodei, an OpenAI employee turned Anthropic CEO, at TechCrunch Disrupt 2023.

Kimberly White/Getty

  • Amazon on Friday announced another $4 billion investment in the AI startup Anthropic.
  • The deal includes an agreement for Anthropic to use Amazon's AI chips more.
  • The cloud giant is trying to challenge Nvidia and get developers to switch away from those GPUs.

Amazon's Trainium chips are about to get a lot busier — at least that's what Amazon hopes will happen after it pumps another $4 billion into the AI startup Anthropic.

The companies announced a huge new deal on Friday that brings Amazon's total investment in Anthropic to $8 billion. The goal of all this money is mainly to get Amazon's AI chips to be used more often to train and run large language models.

Anthropic said that in return for this cash injection, it would use AWS as its "primary cloud and training partner." It said it would also help Amazon design future Trainium chips and contribute to building out an Amazon AI-model-development platform called AWS Neuron.

This is an all-out assault on Nvidia, which dominates the AI chip market with its GPUs, servers, and CUDA platform. Nvidia's stock dropped by more than 3% on Friday after the Amazon-Anthropic news broke.

The challenge will be getting Anthropic to actually use Trainium chips in big ways. Switching away from Nvidia GPUs is complicated, time-consuming, and risky for AI-model developers, and Amazon has struggled with this.

Earlier this week, Anthropic CEO Dario Amodei didn't sound like he was all in on Amazon's Trainium chips, despite another $4 billion coming his way.

"We use Nvidia, but we also use custom chips from both Google and Amazon," he said at the Cerebral Valley tech conference in San Francisco. "Different chips have different trade-offs. I think we're getting value from all of them."

In 2023, Amazon made its first investment in Anthropic, agreeing to put in $4 billion. That deal came with similar strings attached. At the time, Anthropic said that it would use Amazon's Trainium and Inferentia chips to build, train, and deploy future AI models and that the companies would collaborate on the development of chip technology.

It's unclear whether Anthropic followed through. The Information reported recently that Anthropic preferred to use Nvidia GPUs rather than Amazon AI chips. The publication said the talks about this latest investment focused on getting Anthropic more committed to using Amazon's offerings.

There are signs that Anthropic could be more committed now, after getting another $4 billion from Amazon.

In Friday's announcement, Anthropic said it was working with Amazon on its Neuron software, which offers the crucial connective tissue between the chip and the AI models. This competes with Nvidia's CUDA software stack, which is the real enabler of Nvidia's GPUs and makes these components very hard to swap out for other chips. Nvidia has a decadelong head start on CUDA, and competitors have found that difficult to overcome.

Anthropic's "deep technical collaboration" suggests a new level of commitment to using and improving Amazon's Trainium chips.

Though several companies make chips that compete with or even beat Nvidia's in certain elements of computing performance, no other chip has touched the company in terms of market or mind share.

Amazon's AI chip journey

Amazon is on a short list of cloud providers attempting to stock their data centers with their own AI chips and avoid spending heavily on Nvidia GPUs, which have profit margins that often exceed 70%.

Amazon debuted its Trainium and Inferentia chips, named after the training and inference tasks they're built for, in 2020.

The aim was to become less dependent on Nvidia and find a way to make cloud computing in the AI age cheaper.

"As customers approach higher scale in their implementations, they realize quickly that AI can get costly," Amazon CEO Andy Jassy said on the company's October earnings call. "It's why we've invested in our own custom silicon in Trainium for training and Inferentia for inference."

But like its many competitors, Amazon has found that breaking the industry's preference for Nvidia is difficult. Some say that's because of CUDA, which offers an abundant software stack with libraries, tools, and troubleshooting help galore. Others say it's simple habit or convention.

In May, the Bernstein analyst Stacy Rasgon told Business Insider he wasn't aware of any companies using Amazon AI chips at scale.

With Friday's announcement, that might change.

Jassy said in October that the next-generation Trainium 2 chip was ramping up. "We're seeing significant interest in these chips, and we've gone back to our manufacturing partners multiple times to produce much more than we'd originally planned," Jassy said.

Still, Anthropic's Amodei sounded this week like he was hedging his bets.

"We believe that our mission is best served by being an independent company," he said. "If you look at our position in the market and what we've been able to do, the independent partnerships we have with Google, with Amazon, with others, I think this is very viable."

