
Nvidia Earnings Show Strong But Shifting Data Center Demand


Strong quarterly numbers from artificial intelligence chip giant Nvidia show AI data center demand isn’t slowing down anytime soon. But the shape of the data center landscape could be shifting in the months ahead.

Nvidia CEO Jensen Huang at a conference in 2016

Nvidia reported quarterly earnings Wednesday that surpassed its own projections, with revenue growing 69% year-over-year to $44B. Data center sales, which account for most of Nvidia’s revenue, climbed 73% from last year as tech giants and other customers clamor for the firm’s newest Blackwell line of AI-focused graphics processing units.

The strong numbers come despite the firm missing out on what it said was $2.5B in sales due to U.S. restrictions on exporting the line of chips it manufactures specifically for China. The impact of those export restrictions is expected to climb to $8B next quarter, but Nvidia still anticipates quarterly revenue holding steady at $45B, with reduced China sales offset by the growth of its data center segment elsewhere. 

Wall Street reacted favorably to Nvidia’s earnings, with shares of the chipmaker jumping more than 5% after the market opened Thursday. 

“NVIDIA is putting digestion fears fully to rest, showing acceleration of the business other than the China headwinds around growth drivers that seem durable,” Morgan Stanley analyst Joseph Moore wrote. “Everything should get better from here.” 

The surging demand for Nvidia’s AI data center processors presents a concrete rebuttal to the narrative of uncertainty around hyperscale data center demand that emerged over the past two quarters.

A chorus of skeptics sounded alarm bells that tech firms and other major data center users are poised to pump the brakes on their previously insatiable appetites for data center capacity to house AI processors like those from Nvidia — fears amplified by reports this spring that Microsoft and Amazon had canceled leases and paused new data center projects.

But according to Nvidia, the opposite is happening.

Far from a slowdown, cloud providers and other tech giants are scaling up their purchases of Nvidia’s processors and deploying them faster than ever before. Major hyperscalers are deploying almost 72,000 Blackwell GPUs each week and are on track to ramp up the scale of those installations in the quarter ahead, Nvidia Chief Financial Officer Colette Kress said on the company's earnings call Wednesday.

She said Microsoft's fleet of Nvidia GPUs is on track to go from tens of thousands to hundreds of thousands in the coming months. 

This demand is translating directly into new data center development, Nvidia leadership said. Kress said there is a pipeline of more than 100 “AI factories” — the company’s preferred term for large-scale AI training data centers — in development, double the number a year ago. The size of those projects is also growing, with the average number of GPUs deployed at each project doubling year-over-year. 

More AI megacampuses will be announced in the months ahead, representing gigawatts of capacity, according to Nvidia. 

“We have a line of sight to projects requiring tens of gigawatts of Nvidia AI infrastructure in the not-too-distant future,” Kress said. 

Beyond the implications for overall data center demand, Nvidia’s quarterly numbers may also serve as a leading indicator for a pair of major shifts in where and how new data centers are built.

The firm’s GPU sales suggest that a growing share of data center workloads will be for AI inference, with a larger percentage of new development occurring outside the U.S. 

Demand for AI processors and data center capacity has largely been dominated by the need to train AI models, but there has been steady growth in demand for AI inference, the computing through which end users interact with those models.

Now, Nvidia’s leadership says the shift toward inference is hitting an inflection point, with a “sharp jump in inference demand” in the last quarter that could alter the data center map. 

Data center capacity for inference computing often has different siting considerations compared to AI training deployments.

While AI training requires massive GPU clusters like those being spearheaded by OpenAI and xAI in Texas and Tennessee, respectively, it doesn't require close proximity to the major metro areas where most end users of those AI models are located. Conversely, inference deployments are often far smaller and prioritize proximity to end users to achieve fast processing speed and allow adequate connectivity. 

One hyperscaler, reported to be Meta, launched plans earlier this year to deploy twenty 10-megawatt data centers for AI inference in major markets around the U.S. 

The sudden spike in inference demand has been driven by growing corporate AI adoption, with cloud providers, AI startups and other enterprises deploying their own inference capacity, Nvidia CEO Jensen Huang said.

Huang said open-source models and newly efficient computing methods that have lowered adoption costs, as well as the rise of “reasoning” AI models from firms like OpenAI and DeepSeek that utilize more inference computing power, “are driving a step-function surge in inference demand.”

“We've entered an era where inference is going to be a significant part of the compute workload.”

Huang also said a growing number of AI data center projects are being developed outside the U.S. This accelerating growth is driven in part by so-called sovereign AI — AI computing located in a specific market to comply with jurisdictional regulations or serve national governments.

Additionally, enterprise adoption by non-U.S. firms is growing, while major tech firms are increasingly seeing value in offering AI products to international markets built with their own data in their own languages. 

The vast majority of AI-driven data center development has been in the U.S., and while that isn't expected to change, Nvidia’s leadership has said its recently announced collaborations on massive data center megacampuses in Saudi Arabia and Abu Dhabi are the kind of projects that will become increasingly common. 

“We're clearly in the beginning of the build-out of this infrastructure, and every country will have it — I’m certain of that,” Huang said. “There should be many, many more announcements in the future.”