Nvidia is betting on robotics as the next big driver of growth, as the world's most valuable semiconductor company faces increasing competition in its core business of making artificial intelligence chips.
The US tech giant, known for the infrastructure that fueled the artificial intelligence boom, is set to launch its latest generation of embedded computers for humanoid robots – dubbed Jetson Thor – in the first half of 2025.
Nvidia is positioning itself to be the leading platform for what the technology group believes is an imminent revolution in robotics. The company sells an end-to-end solution, from the software layers for training AI-powered robots to the chips that go into them.
“The ChatGPT moment for physical AI and robotics is just around the corner,” Deepu Talla, vice president of robotics at Nvidia, told the Financial Times, adding that he believes the market has reached a “tipping point.”
The move toward robotics comes as Nvidia sees more competition for its powerful AI chips from rival chipmakers like AMD, as well as cloud computing giants like Amazon, Microsoft and Google, which are seeking to reduce their dependence on the American semiconductor giant.
Nvidia, whose market value has risen to more than $3 trillion on the back of strong demand for its AI chips, has positioned itself as an investor in “physical AI” in an effort to help grow the next generation of robotics companies.
In February it was one of several companies, including Microsoft and OpenAI, to invest in humanoid robotics company Figure AI at a valuation of $2.6 billion.
Robotics has so far remained an emerging field that has yet to deliver significant returns, and many startups are struggling to scale, reduce costs and improve the accuracy of their robots.
Nvidia does not break out sales of its robotics products, but they currently represent a relatively small share of total revenue. Data center revenues, which include its coveted AI GPUs, accounted for about 88 percent of the group's total sales of $35.1 billion in its third quarter.
But Talla said the shift in the robotics market is being driven by two technological breakthroughs: the explosion of generative AI models and the ability to train robots on those foundation models using simulated environments.
The latter represents a particularly important development because it helps solve what roboticists call the “Sim-to-Real gap,” ensuring that robots trained in virtual environments can operate effectively in the real world, he said.
“In the past 12 months… [this gap] has matured enough that we can now conduct experiments in simulation, combined with generative AI, which we could not have done two years ago,” Talla said. “We provide a platform to enable all of these companies to do any of these tasks.”
Talla joined Nvidia in 2013 to work on its “Tegra” chip, initially aimed at the smartphone market. The company quickly pivoted, however, with Talla overseeing the redeployment of about 3,000 engineers to work on artificial intelligence and autonomous machines such as vehicles. This was the genesis of Jetson, Nvidia's line of robot “brain” units that debuted in 2014.
Nvidia offers tools at three stages of robotics development: software for training foundation models, which runs on its “DGX” systems; simulation of real-world environments on its “Omniverse” platform; and the hardware that goes inside the robots as their “brain.”
Apptronik, which uses Nvidia technology to develop humanoid robots, announced a strategic partnership with Google DeepMind in December to improve its products.
The global robotics market is currently valued at $78 billion, according to US market researchers BCC, and is expected to reach $165 billion by the end of 2029.
Amazon has already deployed Nvidia's robotics simulation technology in three of its US warehouses, and Toyota and Boston Dynamics are among the other customers using Nvidia's training software.
David Rosen, who leads the Robust Autonomy Lab at Northeastern University, said the robotics market still faces significant challenges, including training models and verifying that they will be safe when deployed.
“So far, we don't have very effective tools to verify the safety and reliability properties of machine learning systems, especially in robotics. This is a big open scientific question in this field,” Rosen said.