โŒ

Normal view

There are new articles available, click to refresh the page.
Before yesterdayMain stream

A chip company you've probably never heard of is suddenly worth $1 trillion. Here's why, and what it means for Nvidia.

18 December 2024 at 01:00
Broadcom CEO Hock Tan. Ying Tang/NurPhoto via Getty Images

  • Broadcom's stock surged in recent weeks, pushing the company's market value over $1 trillion.
  • Broadcom is crucial for companies seeking alternatives to Nvidia's AI chip dominance.
  • Custom AI chips are gaining traction, enhancing tech firms' bargaining power, analysts say.

The rise of AI, and the computing power it requires, is bringing all kinds of previously under-the-radar companies into the limelight. This week it's Broadcom.

Broadcom's stock has soared since late last week, catapulting the company into the $1 trillion market cap club. The boost came from a blockbuster earnings report in which custom AI chip revenue grew 220% compared to last year.

In addition to selling lots of parts and components for data centers, Broadcom designs and sells ASICs, or application-specific integrated circuits, an industry acronym meaning custom chips.

Designers of custom AI chips, chief among them Broadcom and Marvell, are headed into a growth phase, according to Morgan Stanley.

Custom chips are picking up speed

The biggest players in AI buy a lot of chips from Nvidia, the $3 trillion giant with an estimated 90% share of the advanced AI chip market.

Heavily relying on one supplier isn't a comfortable position for any company, though, and many large Nvidia customers are also developing their own chips. Most tech companies don't have large in-house teams of silicon and hardware experts, however, so they turn to outside designers. Of those firms, Broadcom is the leader.

Though multi-purpose chips like Nvidia's and AMD's graphics processing units are likely to maintain the largest share of the AI chip market in the long term, custom chips are growing fast.

Morgan Stanley analysts this week forecast the market for ASICs to nearly double to $22 billion next year.

Much of that growth is attributable to Amazon Web Services' Trainium AI chip, according to Morgan Stanley analysts. Then there are Google's in-house AI chips, known as TPUs, which Broadcom helps make.

In terms of actual value of chips in use, Amazon and Google dominate. But OpenAI, Apple, and TikTok parent company ByteDance are all reportedly developing chips with Broadcom, too.

ASICs bring bargaining power

Custom chips can offer more value in terms of the performance you get for the cost, according to Morgan Stanley's research.

ASICs can also be designed to precisely match a tech company's unique internal workloads, according to the bank's analysts. The better these custom chips get, the more bargaining power they may provide when tech companies negotiate with Nvidia over buying GPUs. But this will take time, the analysts wrote.

In addition to Broadcom, Silicon Valley neighbor Marvell is making gains in the ASIC market, along with Asia-based players Alchip Technologies and MediaTek, the analysts added in a note to investors.

Analysts don't expect custom chips to ever fully replace Nvidia GPUs, but without them, cloud service providers like AWS, Microsoft, and Google would have much less bargaining power against Nvidia.

"Over the long term, if they execute well, cloud service providers may enjoy greater bargaining power in AI semi procurement with their own custom silicon," the Morgan Stanley analysts explained.

Nvidia's big R&D budget

This may not be all bad news for Nvidia. A $22 billion ASIC market would still be smaller than Nvidia's revenue for a single quarter.

Nvidia's R&D budget is massive, and many analysts are confident in its ability to stay at the bleeding edge of AI computing.

And as Nvidia rolls out new, more advanced GPUs, its older offerings get cheaper and potentially more competitive with ASICs.

"We believe the cadence of ASICs needs to accelerate to stay competitive to GPUs," the Morgan Stanley analysts wrote.

Still, Broadcom and chip manufacturers on the supply chain rung beneath, such as TSMC, are likely to get a boost every time a giant cloud company orders up another custom AI chip.


OpenAI is targeting 1 billion users in 2025 — and is building its own data centers to get there

2 December 2024 at 04:29
OpenAI CEO Sam Altman wants his company to build its own data centers. Jason Redmond/AFP/Getty Images

  • OpenAI is seeking to reach 1 billion users by next year, a new report said.
  • Its growth plan involves building new data centers, company executives told the Financial Times.
  • The lofty user target signifies the company's growth ambitions following a historic funding round.

OpenAI is seeking to amass 1 billion users over the next year and enter a new era of accelerated growth by betting on several high-stakes strategies such as building its own data centers, according to a new report.

In 2025, the startup behind ChatGPT hopes to reach user numbers surpassed only by a handful of technology platforms, such as TikTok and Instagram, by investing heavily in infrastructure that can improve its AI models, its chief financial officer Sarah Friar told the Financial Times.

"We're in a massive growth phase, it behooves us to keep investing. We need to be on the frontier on the model front. That is expensive," she said.

ChatGPT, the generative AI chatbot introduced two years ago by OpenAI boss Sam Altman, serves 250 million weekly active users, the report said.

ChatGPT has enjoyed rapid growth before. It reached 100 million users roughly two months after its initial release thanks to generative AI features that grabbed the attention of businesses and consumers. At the time, UBS analysts said they "cannot recall a faster ramp in a consumer internet app."

Data center demand

OpenAI will require additional computing power to accommodate a fourfold increase in users and to train and run smarter AI models.

Chris Lehane, vice president of global affairs at OpenAI, told the Financial Times that the nine-year-old startup was planning to invest in "clusters of data centers in parts of the US Midwest and southwest" to meet its target.

Increasing data center capacity has become a critical global talking point for AI companies. In September, OpenAI was reported to have pitched the White House on the need for a massive data center build-out, while highlighting the huge power demands such facilities would bring.

Altman, who thinks his technology will one day herald an era of "superintelligence," has been reported to be in talks this year with several investors to raise trillions of dollars of capital to fund the build-out of critical infrastructure like data centers.

Friar also told the FT that OpenAI is open to exploring an advertising model, though she downplayed the idea in a statement to Business Insider.

"Our current business is experiencing rapid growth and we see significant opportunities within our existing business model," Friar told Business Insider. "While we're open to exploring other revenue streams in the future, we have no active plans to pursue advertising."

In October, the company announced that it had raised $6.6 billion at a $157 billion valuation in a funding round led by venture capital firm Thrive Capital.

OpenAI said the capital would allow it to "double down" on its leadership in frontier AI research, as well as "increase compute capacity, and continue building tools that help people solve hard problems."

In June, the company also unveiled a strategic partnership with Apple as part of its bid to put ChatGPT in the hands of more users.

OpenAI did not immediately respond to BI's request for comment.


โŒ
โŒ