Google Suncatcher Explores “Space-Based Data Centers”

By Diego Valverde | Journalist & Industry Analyst - Wed, 11/05/2025 - 11:45

Google Research announced Project Suncatcher, a long-term research initiative to develop a solar-powered satellite constellation capable of scaling machine learning (ML) computation in space. The program seeks to address one of AI's most critical limitations: increasing energy consumption.

“Our initial analysis shows that the basic concepts of space-based ML computation are not excluded by fundamental physics or insurmountable economic barriers,” says Google Research in its preprint paper. “However, significant engineering challenges remain, including thermal management, high-bandwidth ground communications, and in-orbit reliability.”

AI computation has become increasingly energy-intensive as models expand in complexity and size. According to Google Research, the operational and environmental costs of large terrestrial data centers could become unsustainable within the next decade. Industry projections estimate that global AI workloads could require up to 327 GW of power by 2030, putting pressure on infrastructure and sustainability targets.
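
For rough context, the short calculation below compares that 327 GW projection against a hypothetical 100 MW hyperscale facility; the per-facility figure is an assumption made purely for scale and does not come from the article's sources.

```python
# Rough scale check for the 327 GW projection cited above.
# The 100 MW per-facility figure is an assumption for illustration only.

PROJECTED_AI_POWER_GW = 327      # projected global AI power demand by 2030 (cited above)
ASSUMED_FACILITY_MW = 100        # hypothetical large data center campus draw

equivalent_facilities = PROJECTED_AI_POWER_GW * 1_000 / ASSUMED_FACILITY_MW
print(f"{PROJECTED_AI_POWER_GW} GW is roughly {equivalent_facilities:,.0f} "
      f"data centers drawing {ASSUMED_FACILITY_MW} MW each")
```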

To mitigate these challenges, Google Research is exploring the feasibility of harvesting solar energy directly in low Earth orbit (LEO). The proposal aligns with industry-wide initiatives to diversify energy sources and increase computational efficiency for AI workloads. The company estimates that if launch costs continue to fall and orbital technologies continue to advance, operating ML systems in orbit could become economically viable by the mid-2030s.

Suncatcher Details

Project Suncatcher envisions a swarm of interconnected satellites functioning as an orbital data processing network. Each unit would host AI-specific Tensor Processing Unit (TPU) chips and communicate over high-speed optical inter-satellite links to enable synchronized computation across the constellation. The satellites would fly in sun-synchronous orbits, maintaining near-continuous exposure to sunlight to maximize power generation.

Google Research conducted preliminary simulations modeling a cluster of 81 satellites distributed within a 1 km radius, with neighboring satellites separated by 100 m to 200 m. The formation demonstrated stable orbital behavior and sustained optical communication throughput of up to one terabit per second (Tbps) using commercially available technology. Laboratory testing also subjected Google’s TPU chips to radiation equivalent to five years in orbit, with no permanent failures recorded.
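
To illustrate why that link bandwidth matters for synchronized computation, the sketch below estimates how long a single gradient synchronization might take across an 81-satellite cluster over 1 Tbps links, using the standard ring all-reduce cost model. The model size, gradient precision, and link efficiency are assumptions chosen for illustration and are not figures from Google's paper.

```python
# Back-of-the-envelope sketch (not from Google's paper): time for one gradient
# all-reduce across an 81-satellite cluster connected by ~1 Tbps optical links.

NUM_SATELLITES = 81       # cluster size modeled in Google's simulation
LINK_GBPS = 1_000         # ~1 Tbps inter-satellite link, per the article
LINK_EFFICIENCY = 0.7     # assumed usable fraction after protocol overhead
MODEL_PARAMS = 70e9       # assumed model size in parameters (illustrative)
BYTES_PER_PARAM = 2       # assumed 16-bit gradients

def ring_allreduce_seconds(params, bytes_per_param, gbps, efficiency, n):
    """Standard ring all-reduce cost: each node transfers 2*(n-1)/n of the payload."""
    payload_bytes = params * bytes_per_param
    transferred_bytes = 2 * (n - 1) / n * payload_bytes
    usable_bytes_per_s = gbps * 1e9 / 8 * efficiency
    return transferred_bytes / usable_bytes_per_s

t = ring_allreduce_seconds(MODEL_PARAMS, BYTES_PER_PARAM, LINK_GBPS,
                           LINK_EFFICIENCY, NUM_SATELLITES)
print(f"Estimated all-reduce time per step: {t:.2f} s")
```

Under these assumptions, a single synchronization step takes on the order of seconds, which illustrates why tight formations and high-bandwidth optical links are central to the concept.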

The next phase of the project includes a partnership with Planet Labs to launch two prototype satellites by early 2027. These prototypes will test TPU performance in actual orbital conditions, including exposure to radiation, microgravity, and thermal fluctuations. Findings from the mission will serve as the foundation for developing scalable orbital ML infrastructure capable of supporting distributed AI workloads.

However, several technical and economic challenges must be resolved before Project Suncatcher can move beyond its experimental stage. One primary issue is thermal dissipation: in a vacuum, waste heat cannot be carried away by convection and must instead be radiated, which complicates cooling for high-power chips. Economic feasibility is another critical factor. Wired reports that Google Research estimates space-based ML systems would only be sustainable once the cost of placing payloads into LEO falls below US$200 per kilogram. Currently, launch costs for SpaceX’s Falcon 9 average between US$2,500 and US$3,000 per kilogram.
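
The gap between those figures can be made concrete with a little arithmetic. In the sketch below, the per-kilogram prices are the ones quoted above, while the satellite mass is a hypothetical value used only for illustration.

```python
# Illustrative launch-cost arithmetic. Per-kilogram prices are the figures quoted
# in the article; the satellite mass is a hypothetical assumption.

CURRENT_COST_PER_KG = (2500, 3000)    # US$/kg, Falcon 9 range cited above
TARGET_COST_PER_KG = 200              # US$/kg threshold cited by Google Research
ASSUMED_SATELLITE_MASS_KG = 500       # hypothetical mass of one TPU-carrying satellite

for cost in CURRENT_COST_PER_KG:
    per_satellite = cost * ASSUMED_SATELLITE_MASS_KG
    reduction = cost / TARGET_COST_PER_KG
    print(f"At US${cost}/kg: about US${per_satellite:,} per satellite, "
          f"a {reduction:.1f}x price reduction still needed")

print(f"At the US${TARGET_COST_PER_KG}/kg target: about "
      f"US${TARGET_COST_PER_KG * ASSUMED_SATELLITE_MASS_KG:,} per satellite")
```

In other words, launch prices would need to fall by roughly an order of magnitude, which is why the company ties economic viability to the mid-2030s.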

The preprint paper, published by Google Research, outlines the conceptual framework for satellite design, orbital control, and inter-satellite communication. By decentralizing computational infrastructure beyond Earth’s surface, Google’s approach could alleviate geographic and environmental constraints affecting terrestrial data centers, which face increasing limitations related to land use, power availability, and cooling capacity.

Industry analysts have noted that Project Suncatcher aligns with Google’s broader pattern of pursuing “moonshot” initiatives — high-risk, long-horizon projects such as quantum computing and autonomous systems. Through this initiative, Google Research aims to anticipate future computational bottlenecks and explore sustainable alternatives for large-scale AI processing.

The company intends to continue working with academic and industry partners to refine satellite hardware, increase optical transmission speeds to 10 Tbps, and strengthen chip resilience to solar wind exposure. If successful, Project Suncatcher could mark a significant milestone in sustainable AI infrastructure.
