From: allin

Nvidia has delivered astounding financial results for the third consecutive quarter, significantly impacting the broader market. Shares of Nvidia were up roughly 15% on Thursday, adding nearly $247 billion in market cap, the largest single-day increase ever recorded, surpassing the previous record of $196 billion that Meta set earlier in the year [02:51].

Unprecedented Financial Performance

Nvidia’s Q4 net income reached $12.3 billion, a nearly nine-fold increase year-over-year [03:29]. Gross margin expanded to 76%, up 2 points quarter-over-quarter and roughly 13 points year-over-year [03:33]. The company’s revenue ramp, from about $7 billion in Q1 of fiscal 2024 to $22 billion in Q4, is described as “extraordinary” [03:40].

Nvidia is also returning significant capital to shareholders through a $25 billion share buyback plan [05:37]. The company continues to project strong growth, with Q1 guidance of around $24 billion in revenue, which would represent a roughly 3x increase year-over-year [06:00]. This performance has fueled a market surge, with the S&P 500 and NASDAQ reaching record highs [06:07].

The AI Boom and Data Centers

The primary driver behind Nvidia’s explosive growth is booming demand for data centers, particularly for generative AI applications [04:00]. Historically known for gaming, professional visualization, and automotive applications like self-driving, Nvidia’s GPUs (Graphics Processing Units) have found a perfect fit in the current AI landscape [04:08].

GPUs excel at graphics processing, using vector math to render 3D worlds [04:35]. The same vector math is at the core of how AI models compute their outputs [05:01]. With the explosion of Large Language Models (LLMs), GPUs became the ideal chips for cloud service providers building the massive data centers required to serve new AI applications [05:06]. Nvidia was “in the perfect place, the perfect time” [05:22].
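The overlap between the two workloads can be made concrete with a minimal NumPy sketch (illustrative only, not Nvidia’s actual kernels): the matrix–vector products that transform points in a 3D scene are the same primitive that drives a neural-network layer.

```python
import numpy as np

# The same primitive -- matrix-vector multiplication -- underlies both workloads.

# Graphics: rotate a 3D point 90 degrees around the z-axis.
theta = np.pi / 2
rotation = np.array([
    [np.cos(theta), -np.sin(theta), 0],
    [np.sin(theta),  np.cos(theta), 0],
    [0,              0,             1],
])
point = np.array([1.0, 0.0, 0.0])
rotated = rotation @ point  # the point (1, 0, 0) rotates to (0, 1, 0)

# AI: one layer of a neural network is the same operation --
# a weight matrix applied to an input vector, plus a nonlinearity.
weights = np.random.randn(4, 3)        # hypothetical 3-in, 4-out layer
activation = np.maximum(0, weights @ rotated)  # ReLU(W @ x)
```

A chip built to run millions of these multiply-accumulate operations in parallel for rendering turns out to be exactly what large neural networks need.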

Sustainability of Growth and Competition

Despite the impressive results, questions arise about the sustainability of Nvidia’s growth and whether these astounding rates can be maintained [11:01]. Comparisons are drawn to Cisco during the dot-com era, when Cisco’s stock price traced a trajectory much like Nvidia’s, only to crash and never fully recover its peak valuation [11:10].

However, several distinctions are highlighted:

  • Nvidia’s multiples are currently much lower than Cisco’s were, indicating a valuation more grounded in real revenue, margins, and profit [11:47].
  • Nvidia possesses a stronger competitive moat. Cisco’s servers and networking equipment were easier to copy and commoditize, whereas Nvidia’s GPUs are highly complex [12:06]. For instance, the Hopper H100 is not just a chip: it contains 35,000 components, weighs 70 lbs, and functions more like a dedicated mainframe computer than a commodity part [12:28].
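The multiples point can be made concrete with a small sketch (hypothetical numbers, not the companies’ actual figures): a price-to-earnings multiple measures how many dollars of market cap ride on each dollar of annual profit, so a lower multiple means the price is carried more by real earnings and less by expectations.

```python
# Hypothetical sketch: why "lower multiples" means a valuation grounded in
# real earnings. All numbers below are illustrative, not actuals.

def pe_multiple(market_cap, annual_net_income):
    """Price-to-earnings: dollars of market cap per dollar of profit."""
    return market_cap / annual_net_income

# At ~31x earnings, roughly 31 years of current profits "earn back" the
# price; at ~150x, expectations -- not income -- carry the valuation.
grounded = pe_multiple(2_000e9, 65e9)   # hypothetical "Nvidia-like" case
bubbly   = pe_multiple(500e9, 3.3e9)    # hypothetical "dot-com-era" case
```

By this lens, the dot-com-era comparison breaks down: a stock can fall hard from a high multiple even if the underlying business keeps growing.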

The market share for GPUs is currently around 91% for Nvidia, though this is expected to decrease [21:03]. Wall Street analysts predict Nvidia could still hold over 60% market share in five years [21:13].

A significant portion of Nvidia’s revenue comes from major cloud service providers like Amazon, Google, Microsoft, and Meta [13:02]. These large tech companies are investing heavily in AI infrastructure due to their substantial cash reserves and challenges in growing through acquisitions (partly due to antitrust concerns) [18:19]. Buying these chips is booked as a capital expenditure on the balance sheet, depreciated over time, rather than an immediate expense on the income statement, making it an attractive way for these companies to deploy cash for future growth [18:08]. This suggests that some of the current demand is a “one-time buildout” rather than sustainable ongoing revenue [20:40].
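The accounting mechanics described above can be sketched with hypothetical numbers (illustrative only; the purchase price and useful life below are assumptions, not any company’s actual figures): capitalizing a chip purchase spreads its cost over several years of depreciation instead of hitting the income statement all at once.

```python
# Illustrative sketch of capex vs. immediate expense (hypothetical numbers).

purchase_price = 10_000_000_000  # $10B of chips, a hypothetical buildout
useful_life_years = 5            # assumed straight-line depreciation schedule

annual_depreciation = purchase_price / useful_life_years

# Expensed immediately: year-1 income absorbs the full $10B.
# Capitalized: each of the next 5 years absorbs only $2B.
print(f"Immediate expense, year 1: ${purchase_price:,.0f}")
print(f"Depreciation per year:     ${annual_depreciation:,.0f}")
```

Under this treatment, a cash-rich buyer can deploy billions into AI infrastructure while reported earnings take only a fraction of the hit each year.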

However, the history of the internet suggests that infrastructure buildouts eventually lead to the development of new applications [24:07]. Similar to how fiber optic overinvestment in the late 90s eventually enabled streaming and social networking, the current AI infrastructure investment is expected to lead to a decade-long wave of new B2C and B2B applications [25:01].

Nvidia’s forecasting indicates continued growth, with projected revenue of $60 billion [21:55]. This visibility into demand suggests the buildout phase is far from over.

The Rise of LPUs and Groq

Amidst Nvidia’s dominance, new players are emerging to compete. Groq (with a Q), a company focused on LPUs (Language Processing Units), has gained significant attention for its speed and cost-effectiveness in AI inference [30:11]. While GPUs are excellent for “training” AI models (brute force computation over months), LPUs are optimized for “inference” (answering user questions quickly and cheaply) [29:12]. Groq’s chips are meaningfully faster and cheaper than Nvidia’s solutions for inference, indicating a potential for disruption in this segment of the market [30:46]. This highlights the ongoing evolution in chip design and specialization within the AI ecosystem.
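The training-versus-inference split can be illustrated with a toy sketch (a linear model in NumPy, purely illustrative; this is not Groq’s or Nvidia’s actual stack): training is many throughput-bound passes over large batches, while inference is a single latency-bound forward pass per query.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy linear "model": 8 input features -> 1 output (hypothetical setup).
true_weights = rng.normal(size=(1, 8))   # target the training loop recovers
weights = np.zeros((1, 8))               # parameters being learned

def inference(x):
    # Inference: one forward pass per query. Latency per request is what
    # matters -- the workload LPUs are optimized for.
    return weights @ x

def train_step(batch, targets, lr=0.1):
    # Training: repeated forward + backward passes over batches, for many
    # iterations -- the brute-force throughput workload GPUs dominate.
    global weights
    preds = weights @ batch                               # forward pass
    grad = (preds - targets) @ batch.T / batch.shape[1]   # gradient of MSE/2
    weights = weights - lr * grad                         # parameter update

# Many throughput-bound steps to fit the model...
for _ in range(500):
    batch = rng.normal(size=(8, 32))
    train_step(batch, true_weights @ batch)

# ...after which each served query is a single cheap forward pass.
query = rng.normal(size=(8,))
answer = inference(query)
```

The two phases stress hardware differently, which is why a chip specialized purely for the inference phase can undercut a general-purpose GPU on speed and cost per query.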