The competition between Nvidia Corp. and Broadcom Inc. has intensified with Broadcom’s recent launch of its Thor Ultra chip, a new high-performance networking part that enables companies to build their own AI infrastructure.
The Thor Ultra chip lets operators of data centers and AI infrastructure link significantly more processors than earlier generations could, enabling the development of larger and more sophisticated AI models such as those that power OpenAI’s ChatGPT.
Broadcom launches Thor Ultra
Broadcom Inc. has launched a new networking chip called Thor Ultra, designed to help companies build large-scale artificial intelligence (AI) computing systems by linking together hundreds of thousands of processors that handle data-intensive tasks.
Broadcom’s engineers, who are based at its San Jose network chip-testing labs, reportedly put the chip through extensive trials to evaluate power efficiency, thermal performance, and data throughput before release. The engineers also work closely with hardware teams to determine the packaging, power requirements, and cooling solutions for the chips.
The company’s Senior Vice President, Ram Velaga, spoke about the critical role of networking in AI workloads.
“Network plays an extremely important role in building these large clusters,” he said. Velaga added that it was unsurprising to see companies in the GPU business, such as Nvidia, also expanding their focus to include networking technologies.
Unlike Nvidia, Broadcom focuses on designing and producing high-performance chips and reference systems that customers can then use to build their own infrastructure. The company offers designs for both components and test systems, providing data center operators with blueprints for constructing robust AI networking frameworks.
The new chip strengthens Broadcom’s already significant presence in the AI data center industry. On Monday, the company announced it had signed a deal to produce 10 gigawatts of custom chips for OpenAI, starting in the second half of 2026.
Nvidia faces competition in the AI market
AI has become a central focus of Broadcom’s long-term strategy. The company’s Chief Executive Hock Tan estimated late last year that the total potential value for Broadcom’s AI-related products, including networking chips and custom data center processors, could reach between $60B and $90B by 2027.
Broadcom’s AI-related revenue reached $12.2B in fiscal 2024. In September, the company revealed it had secured a new, unnamed $10B customer for its custom AI chips, underscoring continued demand from cloud providers looking to diversify beyond Nvidia.
Broadcom has collaborated with Google to co-develop multiple generations of Tensor Processing Units (TPUs), specialized chips designed to handle AI workloads. These collaborations have generated billions in revenue and positioned Broadcom as a vital supplier for large data centers.
With the industry racing to build larger and faster AI models, networking chips are becoming increasingly important in AI infrastructure.
For Nvidia, the rise of Thor Ultra represents fresh competition and raises questions about whether the company can maintain its lead in the AI computing space.