How Nvidia Is Building Computers Designed for AI Data Centers in Orbit
Space may well represent the next major frontier for AI infrastructure expansion — but getting there will require solving some formidable engineering challenges, Nvidia CEO Jensen Huang acknowledged during his keynote address Monday at the company's GTC conference in San Jose, California.
Although Nvidia already has silicon deployed aboard satellites, constructing a fully operational data center in orbit is a challenge of an entirely different order of magnitude. "Obviously, very complicated to do so," Huang said, with characteristic understatement.
Nvidia is far from alone in setting its sights on orbital AI infrastructure. Elon Musk has repeatedly floated the idea of space-based data centers — a vision that gained structural coherence when he recently merged his AI venture with his rocket company, creating a vertically integrated pathway from compute to launch vehicle.
Read more: Nvidia GTC: All the AI and Robotics News From Jensen Huang's Keynote
The orbital environment does offer some genuinely compelling advantages for data center deployment. Regulatory friction largely disappears — there are no zoning boards or neighbors to contend with, no permitting delays, and no community opposition. Solar power is abundant and uninterrupted above the atmosphere. Physical space, at least in theory, is essentially limitless — though the rapid proliferation of satellites is making orbit increasingly congested, introducing new coordination and collision-avoidance complexities.
The most pressing technical obstacle confronting Nvidia as it develops its Space-1 Vera Rubin module computer is thermal management. On Earth, data centers rely on a combination of conductive and convective cooling — airflow, liquid cooling loops, and heat sinks all working in concert. In the vacuum of space, none of those mechanisms apply.
"In space, there's no conduction, there's no convection, it's just radiation," Huang said. "So we have to figure out how to cool these systems out in space."
Orbital data centers remain a longer-horizon ambition, but Nvidia's GTC announcements this week included several developments with much nearer-term trajectories. NemoClaw is a new technology stack designed to streamline deployment of the widely circulated OpenClaw AI agent software — though the power and complexity of that platform warrant careful consideration before adoption.

On the more whimsical end of the spectrum, a partnership with Disney yielded a fully autonomous robotic Olaf — the beloved Frozen character — capable of navigating Disney's theme parks independently.

And DLSS 5, Nvidia's latest AI-driven upscaling technology for games, arrived to a mixed reception: while it promises significant performance gains, a vocal segment of the gaming community raised concerns that the AI interpolation could compromise artistic fidelity and introduce visual artifacts that undercut developers' original creative intent.