by Ian King and Vlad Savov
Nvidia Corp. Chief Executive Officer Jensen Huang unveiled technologies from faster chip systems to software aimed at sustaining the boom in demand for AI computing — and ensuring his products stay ahead as competition stiffens.
Huang on Monday kicked off Computex in Taiwan, Asia’s biggest electronics forum, touting new products and cementing ties with a region vital to the tech supply chain. The CEO introduced updates to the ecosystem around Nvidia’s accelerator chips, which are key to developing and running AI services. The central goal is to broaden the reach of Nvidia products and get more industries and countries to adopt AI.
Nvidia is keen to shore up its place at the heart of the artificial intelligence boom at a time when investors and some executives remain uncertain whether spending on data centers is sustainable. The tech industry is also confronting profound questions about how the Trump administration’s tariff regime will shake up global demand and manufacturing.
Still, Nvidia’s shares are riding a rally following a dealmaking trip to the Middle East as part of a delegation led by President Donald Trump. On Monday, shares in the company’s two most important Asian partners, Taiwan Semiconductor Manufacturing Co. and Hon Hai Precision Industry Co., slid more than 1% in a reflection of broader market weakness.
Huang opened Computex with an update on timing for Nvidia’s next-generation GB300 systems, which he said are coming in the third quarter of this year. They’ll mark an upgrade on the current top-of-the-line Grace Blackwell systems, which are now being installed by cloud service providers.
On Monday, he made sure to thank the scores of suppliers from TSMC to Foxconn that help build and distribute Nvidia’s tech around the world.
“When new markets have to be created, they have to be created starting here, at the center of the computer ecosystem,” Huang, 62, said about his native island.
What Bloomberg Intelligence Says
The readiness of GB300 server ramp-ups in 2H will be a key focus. We think broader AI server demand outlooks will also face scrutiny amid ongoing economic and geopolitical uncertainties.
- Steven Tseng, analyst
While Nvidia remains the clear leader in the most advanced AI chips, competitors and partners alike are racing to develop their own comparable semiconductors, whether to gain market share or widen the range of prospective suppliers for these pricey, high-margin components.
At Computex, Huang introduced a new RTX Pro Server system, which he said delivers four times the performance of Nvidia’s former flagship H100 AI system on DeepSeek workloads. The RTX Pro Server also offers 1.7 times the performance on some of Meta Platforms Inc.’s Llama model jobs. The new product is in volume production now, Huang said.
The chipmaker is also offering a new version of the complete computer systems it provides to data center owners. NVLink Fusion products will give customers the option to pair their own central processing units with Nvidia’s AI chips, or to use Nvidia’s CPUs with another provider’s AI accelerators.
To date, Nvidia has only offered such systems built with its own components. This opening-up of its designs — which include crucial connectivity components that ensure a high-speed link between processors and accelerators — gives Nvidia’s data center customers more flexibility and allows a measure of competition while still keeping Nvidia technology at the center.
Major customers such as Microsoft Corp. and Amazon.com Inc. are trying to design their own processors and accelerators, and that risks making Nvidia less essential to data centers.
MediaTek Inc., Marvell Technology Inc. and Alchip Technologies Ltd. will create custom AI chips that work with Nvidia processor-based gear, Huang said. Qualcomm Inc. and Fujitsu Ltd. plan to make custom processors that will work with Nvidia accelerators in the computers.
The company’s smaller-scale computers — the Spark and the Station, which were announced earlier this year — are going to be offered by a broader range of suppliers. Local partners Acer Inc., Gigabyte Technology Co. and others are joining the list of companies offering the portable and desktop devices starting this summer, Nvidia said. That group already includes Dell Technologies Inc. and HP Inc.
Local companies, including TSMC, have used software and services offered under Nvidia’s Omniverse platform to create so-called digital twins of their factories.
The process has helped speed up the upgrading and fine-tuning of facilities that are crucial to the computer supply chain, Nvidia said. The US chip firm also announced new “dream” software that will allow for more rapid training of robots through fine-tuned simulation scenarios.
Nvidia said it’s offering detailed blueprints that will help corporations accelerate the process of building “AI factories.” It will also provide a service so that companies lacking in-house expertise in the multistep process of building their own AI data centers can do so.
Nvidia also announced a new piece of software called DGX Cloud Lepton. This will act as a service to help cloud computing providers, such as CoreWeave Inc. and SoftBank Group Corp., automate the process of hooking up AI developers with the computers they need to create and run their services. Huang’s company is trying to create what’s essentially a virtual global marketplace for AI computing.
Copyright Bloomberg News