Chip designer Nvidia has emerged as the clear winner not only in the early stages of the artificial intelligence boom but, at least so far, in the history of the stock market. The $1.9 trillion AI giant soared to a record share price on Thursday, putting it on track to add more than $230 billion to its market capitalization and shatter a single-day record that was only a few weeks old: Meta's $197 billion gain in early February.
It is dominating the market, selling more than 70% of all AI chips, and startups are desperate to spend hundreds of thousands of dollars on Nvidia's hardware systems. Wall Street can't get enough of it either: Nvidia shares soared 15% after the company smashed its ambitious profit targets last quarter, pushing its market capitalization past $1.9 trillion; the value of its shares has tripled in the last year alone.
So how? How is it possible that a company founded back in 1993 has leapfrogged tech titans Alphabet and Amazon to become the third most valuable company in the world? It all comes down to Nvidia's leading semiconductor chips for artificial intelligence.
The company that “understood”
Nvidia built its lead by playing the long game and investing in artificial intelligence years before ChatGPT hit the market, and its chip design is so far ahead of the competition that analysts wonder whether anyone else can catch up. Designers like Arm Holdings and Intel, for example, have not yet integrated hardware with AI-focused software the way Nvidia has.
“That’s one of the big observations we made: We realized that deep learning and artificial intelligence were not [just] a chip problem…Every aspect of computing has changed dramatically,” said Nvidia co-founder and CEO Jensen Huang at the New York Times DealBook summit last November. “We observed and realized this about a decade and a half ago. I think a lot of people are still trying to figure this out.” Huang said Nvidia “got it” before anyone else: “The reason people say we’re pretty much the only company doing this is because we’re probably the only company that got it. And people are still trying to get it.”
Software was a key part of this equation. While competitors have focused their efforts on chip design, Nvidia has aggressively pushed its CUDA programming interface that runs on top of its chips. This dual emphasis on software and hardware has made Nvidia chips the indispensable tool for any developer looking to enter the world of artificial intelligence.
“Nvidia has done a masterful job of making it easier to run on CUDA than anything else,” said Edward Wilford, an analyst at technology consultancy Omdia. “CUDA is undoubtedly Nvidia’s flagship. It’s the thing that got them this far. And I think it’s going to carry them forward for a little while longer.”
Artificial intelligence needs computing power. Lots of computing power. AI-powered chatbots like ChatGPT are trained by inhaling vast amounts of data from the internet, up to a trillion distinct pieces of information. This data is fed into a neural network that catalogs associations between various words and phrases which, after human training, can be used to produce natural language answers to users’ questions. All those trillions of data points require enormous amounts of hardware capacity, and demand for hardware is expected to increase as the field of artificial intelligence continues to grow. This put Nvidia, the industry’s largest seller, in a great position to take advantage.
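The idea of “cataloging associations between words and phrases” can be illustrated with a toy sketch. Real LLM training learns such associations with neural networks over trillions of tokens; this back-of-envelope Python example (a hypothetical illustration, not anything Nvidia or OpenAI ships) simply counts how often word pairs appear together in a tiny corpus:

```python
# Toy sketch: count word-pair co-occurrences in a tiny corpus.
# Pairs that appear together in more sentences get a higher score,
# a crude stand-in for the "associations" a neural network learns.
from collections import Counter
from itertools import combinations

corpus = [
    "nvidia designs ai chips",
    "ai chips train models",
    "models answer questions",
]

pair_counts = Counter()
for sentence in corpus:
    words = sorted(set(sentence.split()))
    for w1, w2 in combinations(words, 2):
        pair_counts[(w1, w2)] += 1

print(pair_counts[("ai", "chips")])  # prints 2: co-occurs in two sentences
```

Scaling this idea from three sentences to a trillion tokens, and from counting to learned weights, is precisely what makes the hardware demands so enormous.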
Huang played a similar tune during his triumphant earnings call on Wednesday. Highlighting the shift from general-purpose computing to what he called “accelerated computing” in data centers, he argued that it is “a whole new way of doing computing” and even crowned it “a whole new industry.”
At the beginning of the artificial intelligence boom
Nvidia has been at the forefront of AI hardware since the beginning. When large-scale AI research by startups like OpenAI began to develop in the mid-2010s, Nvidia, through a mix of luck and smart bets, found itself in the right place at the right time.
Nvidia has long been known for its innovative GPUs, a type of chip popular for gaming applications. Most standard computer chips, called CPUs, excel at performing complicated calculations sequentially, one at a time. But GPUs can perform many simple calculations at once, making them excellent at supporting the complex graphics processing required by video games. As it turned out, Nvidia’s GPUs were perfect for the kind of computing systems that AI developers needed to build and train large language models (LLMs).
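The distinction drawn above, one calculation after another versus many simple calculations at once, can be sketched in plain Python. This is a conceptual illustration only (a thread pool standing in for a GPU’s thousands of cores), not actual GPU code:

```python
# Toy illustration: the same elementwise multiply, done two ways.
# A CPU-style approach walks the data one element at a time; a
# GPU-style approach hands identical simple work to many workers
# at once (simulated here with a thread pool).
from concurrent.futures import ThreadPoolExecutor

a = list(range(1_000))
b = list(range(1_000))

# Sequential: one multiplication after another, CPU-style.
sequential = [x * y for x, y in zip(a, b)]

# "Parallel": split the data into chunks and give each chunk
# to a worker, mimicking how a GPU spreads identical work
# across many cores.
def multiply_chunk(chunk):
    return [x * y for x, y in chunk]

pairs = list(zip(a, b))
chunks = [pairs[i:i + 250] for i in range(0, len(pairs), 250)]
with ThreadPoolExecutor(max_workers=4) as pool:
    parallel = [v for part in pool.map(multiply_chunk, chunks) for v in part]

assert sequential == parallel  # same answer, different execution strategy
```

Training a neural network is dominated by exactly this kind of work, huge batches of identical arithmetic, which is why the GPU’s many-at-once design mapped so naturally onto AI.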
“In a way, you could say they were extremely lucky. But I think that understates it: They perfectly capitalized on every bit of luck in every opportunity they were given,” Wilford said. “If you go back five or 10 years, you see this increase in console gaming. They rode it, and then when they felt the wave growing, they got into cryptocurrency mining, and they rode it. And then, just as the wave crested, AI started to take off.”
In fact, Nvidia has been quietly developing AI-targeted hardware for years. Back in 2012, Nvidia chips formed the technical foundation of AlexNet, the groundbreaking neural network developed in part by OpenAI co-founder and former chief scientist Ilya Sutskever, who recently left the company after attempting to oust CEO Sam Altman. This first-mover advantage has given Nvidia a big edge over its competitors.
“They were visionaries…for Jensen, this goes back to his days at Stanford,” Wilford said. “He’s been waiting for this opportunity the whole time. And he kept Nvidia in a position to take advantage whenever the opportunity arose. What we have seen in recent years is that strategy has been executed to perfection. I can’t imagine anyone doing it better than Nvidia did.”
Since making its first investments in artificial intelligence more than a decade ago, Nvidia has built a hugely profitable AI hardware business. The company sells its flagship Hopper GPU for a quarter of a million dollars per unit. It’s a 70-pound supercomputer, built from 35,000 individual parts, and the waiting list for customers to get their hands on one is months long. Desperate AI developers are turning to organizations like the San Francisco Compute Group, which rents out the computing power of its collection of Nvidia chips by the hour. (At the time of publication, it had been fully booked for almost a month.)
AI chip giant Nvidia is set to grow even more if AI adoption meets analysts’ expectations.
“Nvidia once again delivered at a seemingly very high level,” Goldman Sachs wrote in its analysis of Nvidia’s earnings. “We expect not only sustained growth in Gen AI infrastructure spending by large CSPs and consumer internet companies, but also increased development and adoption of AI by enterprise customers across various verticals and, increasingly, sovereign states.”
There are some potential threats to Nvidia’s market dominance. For one thing, investors noted in the company’s latest earnings that export restrictions to China have weakened business there, and a potential increase in competition from Chinese chip designers could put pressure on Nvidia’s global market share. Nvidia also depends on Taiwanese chip foundry TSMC to actually manufacture many of the chips it designs. The Biden administration has pushed for more investment in domestic manufacturing through the CHIPS Act, but Huang himself has said it will be at least a decade before American foundries can be fully operational.
“[Nvidia is] heavily dependent on TSMC in Taiwan and there are regional complications [associated with that], there are political complications,” Wilford said. “[And] the Chinese government is investing heavily in developing its own AI capabilities because of some of those same tensions.”