Nvidia’s data center real estate is under pressure, and it’s only a matter of time before the stock price reflects that

When it comes to artificial intelligence (AI)-powered data centers, a computational advantage isn’t everything.

About 30 years ago, the evolution of the Internet changed everything for corporate America. While it took companies a while to fully realize the potential that e-commerce brought, it has proven to be a game-changing innovation.

Since the mid-1990s, many technologies and trends have come and gone with the potential to change the long-term growth trajectory of businesses. After three decades of would-be game changers failing to escape the Internet's shadow, artificial intelligence (AI) has emerged as the next big leap forward in innovation.

The outline of a human face emerging from a sea of pixels, a symbol of artificial intelligence.

Image source: Getty Images.

With AI, software and systems handle tasks that humans would normally undertake or supervise. What gives the AI revolution such stunning potential is the ability of AI-driven software and systems to learn without human intervention and evolve over time. That evolution could mean becoming more proficient at assigned tasks or perhaps learning entirely new skills.

Although estimates vary wildly, which is completely normal for an early-stage innovation, PwC analysts released a report last year calling for AI to add $15.7 trillion to the global economy by 2030. With such a huge addressable market, many companies could see major success.

It’s safe to say that no company has taken the AI bull by the horns quite like semiconductor giant Nvidia (NVDA 0.69%).

Nvidia’s operational expansion has been virtually flawless

Even taking into account Nvidia’s nearly 20% pullback since hitting a record intraday high of over $140 per share on June 20, it has gained $2.45 trillion in market value in less than 19 months. No market leader has added so much value so quickly, which is likely why the company’s board approved a historic 10-for-1 stock split in late May.

Nvidia’s dominance in the data center market explains how it got to this point.

According to analysts at TechInsights, Nvidia accounted for 3.76 million of the 3.85 million graphics processing units (GPUs) shipped to data centers last year. For those of you keeping score at home, that’s a 98% market share, an effective monopoly on the “brains” that power AI-accelerated data center decision-making.

Beyond its first-mover advantage, Nvidia has clearly positioned itself for long-term success by maintaining its compute dominance. Currently, demand for its H100 GPU is off the charts. While competitors are busy trying to outdo the H100, Nvidia is preparing to launch its next-generation GPU platform, known as Blackwell, in the second half of 2024. Blackwell offers accelerated compute capabilities in half a dozen areas, including generative AI solutions and quantum computing.

In June, CEO Jensen Huang also teased the potential of the Rubin architecture, which is set to debut in 2026 and will be paired with a new processor known as Vera.

The company’s CUDA platform also plays a key role in its success. CUDA is the toolkit developers use to build large language models. While much of the emphasis (rightfully) falls on Nvidia’s industry-leading hardware in AI data centers, its software helps keep customers locked into its ecosystem of products and services.

The final piece of the puzzle for Nvidia was its extraordinary pricing power, which was fueled by demand for AI GPUs that clearly outstripped supply. When demand for a good or service outstrips supply, it’s only natural that its price will go up. Over the past five quarters (through April 28, 2024), Nvidia’s adjusted gross margin has increased by nearly 14 percentage points to 78.4%.

An engineer checks cables and switches in a data center server room.

Image source: Getty Images.

Nvidia is on the verge of facing a data center real estate problem

There’s absolutely no doubt that Wall Street’s leading tech firms and most influential enterprises are aggressively investing in AI-accelerated data centers. The question is whether Nvidia can continue to command a near-monopoly share of the GPU “real estate” in high-performance data centers.

This year, for the first time, Nvidia’s hardware will face real competition. Advanced Micro Devices is ramping up production of its MI300X GPU, which sells for a significantly lower price per chip than Nvidia’s H100. Meanwhile, Intel plans a broader launch of its AI-accelerating Gaudi 3 GPU in the second half of 2024.

While AMD’s MI300X and Intel’s Gaudi 3 may offer subtle advantages over Nvidia’s H100 in certain respects, Nvidia’s chips are expected to retain their overall compute advantage.

But there’s something investors are missing: When it comes to AI-powered data centers, a computational advantage isn’t everything.

Nvidia currently can’t keep up with the huge demand for its chips. With AMD and Intel entering the picture, eager buyers are likely to snap up their much cheaper hardware. That means less real estate in high-performance data centers for Nvidia to claim.

And Nvidia doesn’t just have to worry about external competitors. In fiscal 2024, Microsoft, Meta Platforms, Amazon, and Alphabet combined accounted for about 40% of Nvidia’s net sales. While it’s great news that Wall Street’s most influential firms are using Nvidia’s chips to power generative AI solutions and train large language models, all four companies are also developing AI chips of their own for use in their data centers.

Microsoft’s Azure Maia 100, Alphabet’s Trillium, Amazon’s Trainium2, and Meta’s Training and Inference Accelerator (MTIA) are all expected to complement Nvidia’s GPUs in each company’s respective AI data centers. Even if Nvidia maintains its compute lead, these in-house chips will take valuable data center space out of the equation. Simply put, there will be less demand for Nvidia’s AI GPUs in the future.

The AI-GPU shortage has been the catalyst responsible for driving Nvidia’s pricing power skyward for over a year. As new chips flood the market and begin to secure valuable data center real estate, it’s almost certain that Nvidia’s pricing power, along with its adjusted gross margin, will weaken.

I’ll also add that most companies currently don’t have a clear plan for, or understanding of, how to monetize their AI investments and generate a positive return on them.

As enterprise AI data centers grow and mature, Nvidia will likely lose its outsize share of that real estate. For investors, that’s a recipe for sizable losses.

Suzanne Frey, an Alphabet executive, is a member of The Motley Fool’s board. Randi Zuckerberg, former chief market development officer and spokeswoman for Facebook and sister of Meta Platforms CEO Mark Zuckerberg, is a member of The Motley Fool’s board. John Mackey, former CEO of Whole Foods Market, an Amazon subsidiary, is a member of The Motley Fool’s board. Sean Williams holds positions in Alphabet, Amazon, Intel, and Meta Platforms. The Motley Fool holds positions in and recommends Advanced Micro Devices, Alphabet, Amazon, Meta Platforms, Microsoft, and Nvidia. The Motley Fool recommends Intel and recommends the following options: long January 2025 $45 call options on Intel, long January 2026 $395 call options on Microsoft, short August 2024 $35 call options on Intel, and short January 2026 $405 call options on Microsoft. The Motley Fool has a disclosure policy.