Elon Musk Predicts Space Will Become the Cheapest Home for AI Within 36 Months
In a bold and futuristic prediction, billionaire entrepreneur Elon Musk has claimed that outer space could soon become the most economical location for artificial intelligence (AI) infrastructure. According to Musk, rapid advancements in energy generation and launch technology may make orbital data centers cheaper than those built on Earth within the next 30 to 36 months.
Musk shared this vision during an appearance on the Dwarkesh Podcast, where he discussed the growing challenges of scaling AI systems on Earth and why space may offer a more practical long-term solution. His argument centers on one crucial factor: energy.
Energy Bottlenecks on Earth
As AI models grow larger and more powerful, they require enormous amounts of electricity to operate. Musk emphasized that energy availability—not hardware reliability—is the primary constraint in expanding AI infrastructure.
He pointed out that global electricity production outside China has remained nearly flat, even as chip manufacturing and AI demand continue to surge. This imbalance, Musk argued, creates “insurmountable power bottlenecks” that could limit the future growth of AI if solutions are not found.
Running massive data centers on Earth also involves cooling systems, land acquisition, regulatory approvals, and the construction of new power plants—all of which add cost and complexity.
Why Space Could Be More Efficient
Musk believes space solves many of these problems at once. Solar panels in orbit can generate significantly more energy because they are not affected by weather, atmospheric interference, or the day-night cycle.
“The atmosphere alone results in about a 30% loss of energy,” Musk explained, noting that a solar panel can produce roughly five times more power in space than on the ground.
Additionally, orbiting systems would not require large battery installations to store energy overnight, further reducing costs. Continuous sunlight means nearly uninterrupted power generation—an ideal condition for energy-hungry AI workloads.
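The "roughly five times" figure can be checked with back-of-envelope arithmetic. In this sketch, the 30% atmospheric loss comes from Musk's quote above; the ground capacity factor and orbital sunlit fraction are illustrative assumptions, not figures from his remarks:

```python
# Back-of-envelope comparison of average solar power per square meter
# of panel in orbit vs. on the ground. Factors marked "assumed" are
# illustrative, not sourced from Musk's comments.

SOLAR_CONSTANT = 1361.0   # W/m^2 above the atmosphere (standard value)
ATM_TRANSMISSION = 0.70   # ~30% atmospheric loss, per Musk's quote
GROUND_CAPACITY = 0.25    # assumed ground capacity factor (night, weather, angle)
ORBIT_SUNLIT = 0.99       # assumed sunlit fraction for a suitable orbit

ground_avg = SOLAR_CONSTANT * ATM_TRANSMISSION * GROUND_CAPACITY  # ~238 W/m^2
orbit_avg = SOLAR_CONSTANT * ORBIT_SUNLIT                         # ~1347 W/m^2

ratio = orbit_avg / ground_avg
print(f"Orbit/ground average power ratio: {ratio:.1f}x")  # ~5.7x
```

With these assumptions the ratio lands near 5.7, broadly consistent with the "roughly five times" claim; the exact multiple depends heavily on the ground site chosen.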
Because of these advantages, Musk predicts that “by far the cheapest place to put AI” will soon be space.
The Role of SpaceX and Starship
For Musk’s prediction to materialize, several technological milestones must align. One of the most important is achieving extremely high launch frequencies with SpaceX’s Starship rocket.
Musk suggested that if thousands of launches per year become possible, companies could deploy hundreds of gigawatts of computing power into orbit. In such a scenario, space-based AI capacity could eventually surpass all terrestrial infrastructure combined.
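The scale of that scenario can be sketched with simple arithmetic. The payload mass and the watts-per-kilogram of deployed hardware below are assumptions chosen for illustration, not numbers from the podcast:

```python
# Illustrative launch-cadence arithmetic: how much orbital capacity
# could a given Starship flight rate deploy per year?
# Payload mass and specific power are assumed values for the sketch.

PAYLOAD_KG = 100_000        # assumed Starship payload to orbit, per launch
LAUNCHES_PER_YEAR = 1_000   # one "thousands of launches" scenario
WATTS_PER_KG = 1_000        # assumed specific power of solar + compute hardware

total_watts = PAYLOAD_KG * LAUNCHES_PER_YEAR * WATTS_PER_KG
total_gw = total_watts / 1e9
print(f"Deployed capacity: {total_gw:.0f} GW/year")  # 100 GW/year
```

Under these assumptions, a thousand launches a year adds on the order of 100 GW of capacity annually, which shows why "hundreds of gigawatts" hinges on both launch cadence and how much power each kilogram of payload can deliver.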
This strategy also fits into Musk’s broader ecosystem. SpaceX recently merged with his AI startup xAI to form a vertically integrated company focused on rockets, AI, communications, and real-time information platforms.
The merger signals a future where space technology and artificial intelligence increasingly intersect.
Hardware Reliability and Technical Concerns
When asked about the reliability of GPUs—the specialized chips used for AI training—Musk downplayed concerns. He stated that once hardware passes the initial debugging phase, it generally becomes reliable and does not require constant servicing.
Still, building data centers in orbit is far from simple. The technical difficulty of deploying computing infrastructure in space is significantly higher than constructing facilities on Earth, and supply chains for turbines and power equipment are already stretched.
However, Musk remains confident that falling launch costs and improvements in solar technology will offset these challenges.
A Growing Industry Trend
Interestingly, Musk is not alone in exploring space-based computing. Researchers and major tech firms are studying orbital AI systems powered entirely by solar energy.
For example, Google has proposed satellite clusters linked by ultra-high-bandwidth optical connections to function as distributed data centers, demonstrating the potential feasibility of such architectures.
These developments suggest that the race to build AI infrastructure may extend beyond Earth sooner than many expect.
Cooling Advantages and Environmental Factors
Space also offers natural thermal benefits. Musk noted that the extreme cold of space allows for efficient radiative cooling—an essential requirement for high-performance computing systems that generate intense heat.
With solar panels facing the sun and radiators pointed into deep space, orbital facilities could achieve highly efficient temperature control, further reducing operational costs.
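The physics of that radiative cooling can be sketched with the Stefan-Boltzmann law. The radiator temperature and emissivity below are assumed values, and absorbed sunlight and Earth's infrared glow are ignored for simplicity:

```python
# Sketch of radiative heat rejection in space via the Stefan-Boltzmann law.
# Radiator temperature and emissivity are assumed; absorbed sunlight and
# Earth infrared loading are neglected in this simplified estimate.

SIGMA = 5.670e-8       # Stefan-Boltzmann constant, W/(m^2 K^4)
EMISSIVITY = 0.9       # assumed radiator emissivity
RADIATOR_TEMP_K = 300  # assumed radiator temperature (~27 C)
HEAT_LOAD_W = 1e6      # 1 MW of waste heat from computing hardware

flux = EMISSIVITY * SIGMA * RADIATOR_TEMP_K ** 4  # W/m^2 rejected per unit area
area = HEAT_LOAD_W / flux
print(f"Radiator area for 1 MW: {area:.0f} m^2")  # ~2400 m^2
```

The takeaway: rejecting each megawatt of waste heat at room-temperature radiator panels requires a few thousand square meters of radiator area, so radiator mass and deployment are a real engineering cost alongside the solar arrays.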
The Bigger Picture
Musk’s prediction reflects a broader reality: AI is pushing the limits of today’s energy infrastructure. Data centers already consume vast amounts of electricity, and that demand is projected to grow dramatically in the coming decade.
If Earth cannot scale power generation fast enough, companies may be forced to look toward unconventional solutions—including extraterrestrial ones.
While the timeline of “36 months or less” may sound ambitious, Musk has built a reputation on pursuing ideas that once seemed improbable—from reusable rockets to global satellite internet.
Whether orbital AI becomes mainstream within three years or takes longer, one thing is clear: the future of computing may not be confined to our planet.
As Musk succinctly put it, the only place where power generation can truly scale without limits might ultimately be space.

