Broadcom’s AI Revenue Surges 106%, Challenging Nvidia
Broadcom's AI revenue surged 106% year-over-year to $8.4 billion, now representing 44% of its total business. The company is securing major deals for custom AI accelerators and networking, positioning itself as a key competitor to Nvidia by serving large hyperscalers like Google, Meta, and OpenAI.
Broadcom’s AI Surge: A New Contender in the AI Hardware Wars
Broadcom is emerging as a significant force in the artificial intelligence hardware landscape, with its AI-related revenues more than doubling year-over-year. The semiconductor giant is quietly securing substantial deals with major tech players like Google, Meta, OpenAI, and Anthropic, positioning itself as a formidable competitor to the current leader, Nvidia. This strategic move into custom AI accelerators and high-speed networking infrastructure suggests Broadcom is not just a component supplier but a key architect of the next generation of AI data centers.
Understanding Broadcom’s Dual Engine: Chips and Software
Broadcom operates on two primary business segments: semiconductors and software. In its semiconductor division, Broadcom designs advanced chips, including custom AI accelerators and critical networking components, which are then manufactured by partners like TSMC. These chips are integral to AI data centers, networking equipment, storage systems, and smartphones.
The company’s AI and data center business is particularly crucial for investors. Broadcom designs bespoke AI accelerators tailored to the specific models and infrastructure needs of large clients. Unlike Nvidia’s general-purpose Graphics Processing Units (GPUs), Broadcom’s custom silicon is optimized for individual customer requirements. Furthermore, Broadcom provides the high-speed networking fabric—specifically its Tomahawk and Jericho chips—that connects thousands of AI accelerators, enabling the massive data flow required for large-scale AI computations. These chips are the backbone of ultra-fast Ethernet switches and routers essential for “million-GPU” or “gigawatt-scale” AI clusters.
The second major pillar of Broadcom’s business is its infrastructure software segment, largely bolstered by its acquisition of VMware. VMware’s platform allows businesses to efficiently run multiple virtual machines on single physical servers and manage complex workloads across on-premises data centers and hybrid cloud environments. This segment generates significant recurring revenue with high profit margins, complementing the hardware division.
Broadcom vs. Nvidia: Customization vs. General Purpose
The competitive dynamic between Broadcom and Nvidia in the AI space centers on their distinct approaches. Nvidia offers widely available, general-purpose GPUs and complete server systems, making them the default choice for off-the-shelf AI compute power. In contrast, Broadcom focuses on building custom AI accelerators and networking solutions for a select group of large enterprise customers who require more control over cost, performance, and specific application tuning.
In the most recent quarter, Broadcom reported AI revenues of $8.4 billion, marking a substantial 106% increase year-over-year. Approximately one-third of this revenue came from AI networking, with the remaining two-thirds originating from custom compute solutions. While Nvidia’s GPUs are considered the industry standard for AI training, Broadcom is rapidly scaling its parallel offering of custom-tuned chips.
Key partnerships highlight Broadcom’s strategy: Google utilizes Broadcom for its Tensor Processing Units (TPUs), including those powering Gemini 3. Meta has collaborated with Broadcom for its in-house AI chips, aiming to reduce its reliance on Nvidia. Anthropic has signaled a significant multi-year deal worth approximately $21 billion for nearly a million TPUs and full rack-scale AI systems from Broadcom, establishing Broadcom as a primary custom compute partner. OpenAI has also announced plans with Broadcom to deploy 10 gigawatts of custom accelerators.
This strategy directly challenges Nvidia’s dominance. Every chip deployed by Broadcom for custom AI workloads represents a potential reduction in demand for Nvidia’s GPUs. While Nvidia commands roughly 90% of the data center GPU market, Broadcom holds approximately 70% of the custom AI accelerator market and around 80% of the market for data center Ethernet switch chips. In networking, Broadcom’s Tomahawk and Jericho chips are critical for high-bandwidth connectivity within and between data centers, powering switches and routers from major vendors.
Earnings Beat and Accelerating AI Growth
Broadcom’s latest earnings report revealed total revenue of $19.3 billion, a 29% increase year-over-year, slightly exceeding Wall Street estimates. The standout figure, however, is its AI business performance. Of the total revenue, $8.4 billion was attributed to AI, representing a 106% year-over-year surge and accounting for approximately 44% of Broadcom’s total revenue. Broadcom’s AI revenue alone now exceeds AMD’s entire AI business.
The Semiconductor Solutions segment generated $12.5 billion in revenue, up 52% year-over-year and making up about 65% of the company’s total business. The Infrastructure Software segment, driven by VMware, brought in $6.8 billion, with a modest 1% year-over-year increase but impressive gross margins of 93% and operating margins of 78%. This combination of high-growth AI chips and high-margin software contributes to Broadcom’s strong overall profitability.
Broadcom reported GAAP gross margins of 68% (or 77% adjusted), significantly higher than AMD and approaching Nvidia’s levels. Operating margins stood at 66.4%. Free cash flow for the quarter was $8 billion, representing 41% of total revenues, demonstrating robust financial health.
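The quarter’s headline ratios can be checked directly from the figures quoted above. A minimal sketch (dollar amounts in billions, using only numbers stated in the article):

```python
# Sanity-check the reported quarter's ratios (amounts in billions USD).
total_revenue = 19.3   # total quarterly revenue
ai_revenue = 8.4       # AI-attributed revenue
free_cash_flow = 8.0   # quarterly free cash flow

ai_share = ai_revenue / total_revenue            # ≈ 0.435 → "approximately 44%"
fcf_share = free_cash_flow / total_revenue       # ≈ 0.414 → "41% of total revenues"
prior_year_ai = ai_revenue / 2.06                # 106% YoY growth implies ≈ $4.1B a year ago

print(f"AI share of revenue:         {ai_share:.1%}")
print(f"FCF share of revenue:        {fcf_share:.1%}")
print(f"Implied year-ago AI revenue: ${prior_year_ai:.1f}B")
```

The small gaps between these computed values and the rounded percentages in the text are consistent with the article’s use of approximate figures.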
Forward Guidance and Long-Term Outlook
Broadcom’s forward guidance paints an even more bullish picture for its AI segment. The company projects approximately $22 billion in revenue for the next quarter, indicating 47% year-over-year growth. Crucially, AI semiconductor revenue is expected to reach about $10.7 billion, implying a staggering 140% year-over-year growth. This suggests AI revenues will increase by roughly 27% quarter-over-quarter and constitute nearly half of Broadcom’s total revenue.
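The implied growth rates in that guidance follow from simple arithmetic on the quoted figures; a quick sketch (amounts in billions):

```python
# Implied growth rates from the forward guidance (amounts in billions USD).
guided_total = 22.0    # next-quarter total revenue guidance
guided_ai = 10.7       # next-quarter AI semiconductor revenue guidance
current_ai = 8.4       # this quarter's AI revenue

qoq_growth = guided_ai / current_ai - 1     # ≈ 27% quarter-over-quarter
ai_share_next = guided_ai / guided_total    # ≈ 49%, i.e. "roughly half" of revenue
year_ago_ai = guided_ai / 2.40              # 140% YoY growth → year-ago base ≈ $4.5B

print(f"QoQ AI growth:            {qoq_growth:.0%}")
print(f"AI share of guided total: {ai_share_next:.0%}")
print(f"Implied year-ago AI base: ${year_ago_ai:.1f}B")
```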
The implications for investors are significant. AI is now Broadcom’s primary growth engine, and the company is scaling this business without compromising its high margins. The AI segment is not just growing but accelerating. Broadcom CEO Hock Tan has projected more than $100 billion in AI chip revenue by 2027, explicitly stating this figure excludes software and services. This implies a doubling of AI chip revenue from current levels by 2027.
Furthermore, Broadcom has secured its supply chain, including wafers, advanced packaging, and high-bandwidth memory, to support this ambitious target. The company’s total backlog exceeds $160 billion, with a substantial $73 billion directly tied to large orders from hyperscalers and AI labs for custom accelerators and networking products. This backlog indicates that a significant portion of projected AI revenue growth is already contracted, reducing near-term execution risk.
Market Impact and What Investors Should Know
Broadcom’s ascent presents a compelling narrative for investors seeking exposure to the AI boom. The company’s strategy of providing custom silicon and networking infrastructure addresses a critical need for hyperscalers aiming to optimize performance, control costs, and reduce reliance on a single vendor.
Key Risks to Consider:
- Customer Concentration: Broadcom’s AI revenue is heavily dependent on a small number of large clients (Google, Meta, OpenAI, Anthropic). A slowdown in spending, deployment delays, or a shift in strategy by any of these major customers could significantly impact Broadcom’s growth and stock performance.
- Margin Pressure: Custom chip development involves high upfront R&D costs amortized over fewer customers, potentially leading to lower margins compared to general-purpose products. The significant bargaining power of hyperscalers could also cap profit margins.
- System-Level Margins: When Broadcom sells full rack-scale AI systems, it often bundles third-party components at near-cost prices, which can dilute overall margins despite higher margins on its proprietary chips.
Nvidia’s advantage lies in its broad customer base for GPUs and its powerful CUDA software ecosystem, which creates significant switching costs for customers. Broadcom, by offering custom solutions and networking, competes by providing specialized performance and cost efficiencies for specific AI workloads.
Despite the risks, Broadcom’s management appears confident, projecting strong adjusted gross margins of 77%. However, a continued shift towards custom AI solutions and full-stack systems could exert pressure on these margins over time.
Conclusion: A Diversified AI Play
Broadcom is not merely a competitor to Nvidia; it represents a different facet of the AI infrastructure market. While Nvidia dominates the general-purpose GPU space, Broadcom excels in custom AI accelerators and the networking fabric connecting them. With AI comprising 44% of its revenue and projected to grow substantially, Broadcom offers investors a diversified exposure to the ongoing AI buildout.
The company’s $160 billion backlog, particularly the $73 billion in custom AI orders, signals that the AI spending cycle is still in its early stages. For long-term investors, Broadcom presents an opportunity to gain exposure to the AI revolution through custom silicon and networking, complementing an investment in Nvidia rather than replacing it. This dual approach allows investors to benefit from the AI market’s expansion regardless of specific technology preferences among hyperscalers.
Source: I Can't Stay Quiet on Broadcom (AVGO) vs NVIDIA Stock (NVDA) Any Longer (YouTube)