In this blog, we discuss Nvidia’s latest earnings release and the three eras of A.I.

Nvidia’s terrible quarter

This past week, Nvidia reported its latest quarterly earnings, and the results were bad. The company is facing a mixture of headwinds: 

  • A slowdown in gaming and in crypto (Nvidia’s GPUs are used in Ethereum mining). In gaming, the slowdown is due to consumer spending reverting to the mean after the pandemic boom. In crypto, the combination of the price collapse (forcing miners to liquidate their equipment) and Ethereum’s transition from proof-of-work to proof-of-stake has hurt GPU demand. 
  • New US export restrictions on advanced chips sold to China, designed to prevent Beijing from acquiring the latest chip technology, including A.I. chips. Nvidia estimates that the restrictions could cost it about $400 million in sales. To keep selling into China within these rules, the company recently released a weaker version of its state-of-the-art A.I. chip for that market. 

As a result, revenue fell 17% year-over-year, to $5.93 billion. In particular, the gaming segment was hit hard, decreasing 51% year-over-year and falling to revenue levels not seen since early 2021 (see Figure 1 below). The sales slowdown also means that the company is holding a record level of inventory ($4.45 billion worth, or 67% of revenue this past quarter!).

Figure 1: Nvidia’s quarterly revenue trend by segment. Source

Another way to look at Nvidia’s business is by looking at the year-over-year top-line trends (Figure 2).

Figure 2: Nvidia’s year-over-year revenue growth by segment

On closer examination, you can see that the story of Nvidia from the past two years can be divided into two arcs:

  • The pandemic boom era: After the initial global Covid lockdowns, a pandemic boom ensued. Work from home, demand for gaming, and the crypto rally drove significant growth in the Professional Visualization and Gaming segments. Demand outstripped supply: Nvidia could not make enough GPUs. 
  • The Fed rate-hiking era: Then came a period of sustained high inflation, which prompted the Fed to raise interest rates, giving rise to the headwinds mentioned at the beginning of this post. Growth slowed in most segments, except Data Center (the gray line in Figure 2). Note that the Automotive segment (the yellow line) is showing some growth, but it remains a very small part of the business (currently only about 4% of revenue).

For the upcoming quarter, Nvidia forecasts revenue of $6 billion. It is likely that the slowdown in the Chinese market and the gaming segment will continue. That said, there is a silver lining. 

Nvidia’s bright spot: the Data Center

The Data Center segment now contributes about two-thirds of total sales ($3.8 billion, growing 31% year-over-year). This segment has enabled Nvidia to weather the current gaming slowdown more effectively than in 2018, when it also struggled with slowing GPU sales and growing inventory during that era’s crypto winter. 

The Data Center business is riding secular growth in A.I. and Accelerated Computing, and in this space, Nvidia dominates. You can broadly look at the segment through two lenses:

  • Supercomputers: The latest list of the top 500 supercomputers was just released, and 72% of the systems are powered by Nvidia GPUs (with AMD as the more popular CPU of choice – another negative sign for Intel). 
  • Cloud providers: Nvidia has about 85% market share within the six largest cloud infrastructure providers.

The secular trend: A.I. and Accelerated Computing

In the Q3 earnings call, Jensen Huang, Nvidia’s CEO, explained that the company is riding two massive secular shifts.

The first is that general-purpose computing is no longer scaling. Software used to rely mostly on the CPU, which, thanks to Moore’s law, (roughly) doubled in computing power every two years. But that progress slowed sometime around the turn of the last decade (see Figure 3 below). Fortunately, overall compute scaling continued through accelerated computing: researchers discovered that GPUs are very good at solving specific types of problems (more on this later), and this opened up a whole new market for Nvidia. 

Figure 3: Accelerated computing is able to continue to drive compute scaling, post-CPU era. Source
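
The kind of workload GPUs excel at is massively parallel arithmetic, such as the large matrix multiplications at the heart of deep learning, where thousands of GPU cores attack the problem at once instead of a handful of CPU cores. Here is a minimal, illustrative sketch of that difference (it assumes PyTorch is installed and an Nvidia GPU is available; the timings are illustrative, not a benchmark):

```python
import time
import torch

n = 4096
a = torch.rand(n, n)
b = torch.rand(n, n)

# Matrix multiplication on the CPU: a handful of cores chew through the work.
start = time.perf_counter()
c_cpu = a @ b
cpu_seconds = time.perf_counter() - start

if torch.cuda.is_available():
    # The same multiplication on the GPU: thousands of cores work in parallel.
    a_gpu, b_gpu = a.to("cuda"), b.to("cuda")
    torch.cuda.synchronize()  # ensure the copy finishes before we start timing
    start = time.perf_counter()
    c_gpu = a_gpu @ b_gpu
    torch.cuda.synchronize()  # wait for the GPU to finish before we stop timing
    gpu_seconds = time.perf_counter() - start
    print(f"CPU: {cpu_seconds:.3f}s  GPU: {gpu_seconds:.3f}s")
else:
    print(f"CPU: {cpu_seconds:.3f}s (no Nvidia GPU detected)")
```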

The second is the rapid advancement of A.I. applications, which require vast numbers of GPUs. 

The three eras of A.I.

The development of machine learning and A.I. dates back to the 1950s. You can roughly split this history into three distinct eras (Figure 4 below). 

Figure 4: Training compute (FLOPs) of milestone Machine Learning (ML) systems over time. Source

The first era, the Pre-Deep-Learning era, from 1952 to 2010 (orange): This is the slowest of the three in terms of progress, with training compute doubling roughly every 21.3 months on average. 

The second era, the Deep-Learning era, from 2010 to 2022 (blue): Notice in Figure 4 how the slope of the line becomes much steeper. Researchers discovered that GPUs are far better suited to these workloads, and thanks to GPUs (Nvidia’s specifically) the pace of progress accelerated sharply: training compute began doubling roughly every six months instead of every 21. 
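
To put those doubling times in perspective, here is a quick back-of-the-envelope sketch (the 21.3-month figure is cited above; the roughly six-month figure is an approximation of the deep-learning-era trend in Figure 4):

```python
# Exponential growth: compute(t) = compute_0 * 2 ** (months / doubling_time)

def growth_factor(years: float, doubling_time_months: float) -> float:
    """How many times training compute multiplies over a period, given a doubling time."""
    return 2 ** (years * 12 / doubling_time_months)

decade = 10
# Pre-Deep-Learning era: compute doubled roughly every 21.3 months.
print(f"21.3-month doubling: ~{growth_factor(decade, 21.3):,.0f}x over a decade")
# Deep-Learning era: roughly a six-month doubling time (approximate trend in Figure 4).
print(f"6-month doubling:    ~{growth_factor(decade, 6):,.0f}x over a decade")
```

That difference in slope (roughly 50x per decade versus roughly a million-fold per decade) is what the much steeper blue line in Figure 4 represents.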

Researchers call the models of this era regular-scale models. They were mostly perception models, used to interpret the world (analytical tasks, classification, translation, and recommendation). They were not good enough to generate new output (text, images, music, etc.). 

Then came the third era, the Large-Scale era (red), starting around 2015 with AlphaGo. These large-scale models are typically 100x-1,000x larger than those of the previous era. These are the models that are all over the news, often called generative A.I. models. They are now capable enough to generate art, text, software code, music, etc. 

Who will profit in the third era?

In a recent article, we discussed the gold rush in generative A.I. applications. The appeal is that these tools now produce something that end-consumers can experience directly. But, despite venture capital’s enthusiasm for funding A.I. companies, there’s an open question as to whether these applications are features or standalone products, and where the value will accrue. If you look back at the A.I. applications that have been widely adopted, most provide utility in the background.

In the first era, the most widely adopted application was probably optical character recognition (OCR). The US Postal Service implemented OCR to read addresses off envelopes as early as 1965 to increase mail processing throughput (see an old-school informational video here). 

In the second era, most of these models are embedded in tools used by incumbent tech companies to increase efficiency. Google evolved search from giving you relevant links to answering your queries. Netflix uses a recommendation engine to surface relevant content. Facebook uses machine learning to keep you engaged and serve you advertisements, and email systems use it for better spam filtering. But no dominant A.I.-first company emerged. 

What about the third era? In business, value (and therefore profit) accrues at the point where power is concentrated (in other words, with whoever creates the thing that everyone else in the value chain relies on). Here’s a simplified A.I. value chain in the 3rd era (Figure 5). 

Figure 5: A simplified A.I. value chain in the 3rd era.

While we are still in the early stages of the third era, there are some interesting developments:

  • The data: In the early years, acquiring, cleaning, and properly labeling the vast amounts of data needed to train A.I. models was costly, so only large companies could afford to do it. But, surprisingly, recent progress in the third era suggests that this is becoming less important. The better approach seems to be to train models on even more data, even if it is unfiltered. This makes training large models easier.
  • The accelerator hardware of choice for training still seems to be Nvidia’s.
  • Training costs are falling rapidly. OpenAI’s GPT-3, one of the earliest and most prominent large language models, initially cost about $12 million to train. Meanwhile, Stable Diffusion, a later entrant to the space, cost only roughly $600,000 to train.
  • The models are becoming more accessible. While some companies do not share access to their state-of-the-art A.I. models (Google and Facebook), some provide access via APIs (OpenAI, Midjourney) so that other companies can build on top of them (a generic sketch of this pattern follows Figure 6 below), while others provide open-source access (Stable Diffusion).
  • More applications are incorporating A.I. tools. Notion recently announced its A.I. features, and Descript (an audio and video editing tool) is building OpenAI’s APIs into its feature set, to name a few. See Figure 6 for a landscape map. 

Figure 6: Generative A.I. map. Source
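
As a concrete illustration of the “build on top of an API” pattern mentioned above, here is a deliberately generic sketch of how an application might call a hosted large-scale model over HTTP. The endpoint, parameters, and response shape below are hypothetical placeholders, not any specific provider’s actual API:

```python
import os
import requests

# Hypothetical endpoint, parameters, and response shape, for illustration only.
# Real providers (OpenAI, Midjourney, etc.) each define their own APIs.
API_URL = "https://api.example-model-provider.com/v1/generate"
API_KEY = os.environ["MODEL_API_KEY"]  # assumed to be set in the environment

def generate_text(prompt: str, max_tokens: int = 200) -> str:
    """Send a prompt to a hosted large-scale model and return its completion."""
    response = requests.post(
        API_URL,
        headers={"Authorization": f"Bearer {API_KEY}"},
        json={"prompt": prompt, "max_tokens": max_tokens},
        timeout=30,
    )
    response.raise_for_status()
    return response.json()["text"]

if __name__ == "__main__":
    # For example, an editing tool asking the model to summarize a transcript.
    print(generate_text("Summarize this meeting transcript in three bullet points: ..."))
```

In essence, many of the application-layer companies in Figure 6 wrap calls like this in a polished user experience.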

If you map the players in the value chain, it looks like the following (Figure 7).

Figure 7: Generative A.I. value chain with the players

So where will the value accrue? Likely where the most differentiation occurs. A rapid increase in the number of consumer A.I. applications will increase demand for the APIs provided by large-scale model providers. These providers will compete with one another to keep improving their models and lowering costs, and they will turn to the cloud infrastructure providers for more GPUs. Only two public companies sell these GPUs at scale: Nvidia and AMD. So far, Nvidia has the larger market share because it has the superior software (CUDA) and hardware stack. 

As in past gold rushes, it is often best to sell pickaxes (or, in the case of the California gold rush of 1849, jeans).


Our team members at Vested may own investments in some of the aforementioned companies/assets. Different types of investments involve varying degrees of risk, and there can be no assurance that any specific investment or strategy will be suitable or profitable for an investor’s portfolio. Note that past performance is not indicative of future returns. Investing in the stock market carries risk; the value of your investment can go up, or down, returning less than your original investment. Tax laws are subject to change and may vary depending on your circumstances.

This article is meant to be informative and not to be taken as investment advice, and may contain certain “forward-looking statements,” which may be identified by the use of such words as “believe,” “expect,” “anticipate,” “should,” “planned,” “estimated,” “potential” and other similar terms. Examples of forward-looking statements include, without limitation, estimates with respect to financial condition, market developments, and the success or lack of success of particular investments (and may include such words as “crash” or “collapse”). All are subject to various factors, including, without limitation, general and local economic conditions, changing levels of competition within certain industries and markets, changes in interest rates, changes in legislation or regulation, and other economic, competitive, governmental, regulatory and technological factors that could cause actual results to differ materially from projected results.
