Wildfires, troop movements and other fast-changing events on Earth are monitored by a small army of satellites watching the planet moment by moment. Analysing the output of this floating forest of cameras and sensors takes time, however, largely because of the difficulty of transmitting the resulting terabytes of data back to Earth – a delay that, conceivably, could cost lives.

To solve this problem, some larger satellites are launched with onboard computing power and data storage, allowing the output of their cameras to be analysed in space and the results sent to Earth. However, this adds to the cost of launch and isn’t viable for smaller satellites.

But what if satellites never needed to send their data to ground stations in the first place? That’s the question being pondered by a host of start-ups and multinationals as they plan to launch data centres of their own into orbit.

Processing data in space will reduce the need to beam it back to Earth. (Photo by 3DSculptor/iStock)

By processing and storing near-Earth observations in space-based clouds, governments and private companies on the ground could not only receive clean observation data more quickly, but also count on a data storage solution immune to the floods, fires and earthquakes that threaten its planet-side equivalents.

Indeed, there are already some powerful computers in space equipped to do this. The International Space Station, for example, uses its HPE supercomputer to run machine learning models capable of swiftly processing images of astronaut gloves or crunching experimental data, rather than sending gigabytes back to Earth – a process that can take weeks or months due to limited bandwidth.

While there are some clear advantages in terms of speed, there are also risks involved with putting data and processing power in orbit. A review by Capitol Technology University in 2018 outlined several exotic dangers to satellite missions, including geomagnetic storms crippling equipment, space dust turning to hot plasma as it hits the spacecraft, and even collisions with other objects in a similar orbit.

Despite these risks, demand from satellite operators for more efficient data processing solutions has seen several companies forge ahead with plans for orbital data centres. This is, in part, because so many recent launches have been for ‘smallsats’, machines weighing under 1,200kg that have no room aboard for data processing and storage.

As such, a new sector is slowly evolving to service these devices and their larger cousins. This emerging industry anticipates a fleet of orbital data centres zipping silently around the Earth within the next two decades.

The exact form they will take will vary, says Rick Ward, an expert in data processing in space and CTO of OrbitsEdge, a company looking to launch a fleet of space computers. “Some will be large devices sitting in geostationary orbit able to hold petabytes of data, whereas others will be in low-Earth orbit (LEO) with more powerful computers processing data from nearby satellites,” he predicts.

How would a space data centre work?

Humanity could never have broken orbit without data storage – although not much was needed, by present computing standards. The guidance computer on board Apollo 11, for example, needed only 4KB of RAM and 32KB of read-only memory to land Neil Armstrong and Buzz Aldrin on the lunar surface. An Apple Watch Series 7, by comparison, has 1GB of RAM and 32GB of storage.
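In round numbers, the gap looks like this – a quick illustrative calculation using only the figures quoted above:

```python
# Comparing the Apollo Guidance Computer's memory with an Apple Watch Series 7,
# using the figures quoted in the paragraph above.
agc_ram_bytes = 4 * 1024          # ~4KB of RAM aboard Apollo 11
watch_ram_bytes = 1 * 1024**3     # 1GB of RAM in an Apple Watch Series 7

ratio = watch_ram_bytes // agc_ram_bytes
print(f"The watch carries {ratio:,}x the RAM of the Apollo Guidance Computer")
# → The watch carries 262,144x the RAM of the Apollo Guidance Computer
```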

That’s not to say that space-bound computers haven’t caught up with their terrestrial cousins: the ISS’s supercomputer, for example, can operate in harsh environments and perform edge computing at teraflop speeds. 

Companies such as OrbitsEdge, however, predict that the future of space computing is less likely to focus on raw computing power than on distributed storage. The reasons for this, explains Ward, are becoming increasingly obvious to those managing data centres on Earth. 

“Ask Amazon to show you their power bill and you will see the cost of storing data on Earth,” he says. “It isn’t just storage either, but data handling and processing as well. The biggest expense, beyond the cost of land in city-centre locations, is electricity. For orbital data centres that can come straight from the Sun through direct solar power.” 

Ongoing costs, explains Ward, gradually eat into a company’s profit margins. The main cost concern for space-based data centres, meanwhile, comes in the upfront investment – in other words, paying for the rocket that launches the satellite in the first place. After that, companies should expect smooth sailing from their orbital assets, at least in cost terms.

One proposal from the Florida-based firm would see a small number of data centres launched into geostationary orbit. High above the Earth, these larger satellites would receive and store petabytes of data from larger constellations at lower orbits; many of those lower-orbit satellites would act as processing hubs capable of relaying data at low latency back to ground stations. The result would be a distributed data centre, with processing and storage spread across multiple devices, although Ward prefers to call it a “single megastructure.”

Data flows of this complexity and scale are already in place between satellites and ground stations, explains Andrii Muzychenko, EOS SAT Project Manager at EOS Data Analytics. “Middle satellites with higher transmission rates can send data to several ground stations and reach 10-50 TB per day with 2-3x compression,” he says. Heavier satellites, meanwhile, “can take images and directly transfer hundreds of terabytes with 2-3x compression through telecommunications satellites”.
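The bandwidth arithmetic behind Muzychenko’s figures is easy to sketch. In the illustrative calculation below, the daily volume and compression ratio come from his quote, while the downlink rate and the function name are assumptions for the sake of the example:

```python
# Back-of-envelope downlink arithmetic. The 1.2 Gbit/s link rate is an
# assumed figure for illustration; the 30 TB/day volume and 2.5x
# compression sit within the ranges Muzychenko cites.

def downlink_hours(raw_tb_per_day: float, compression: float, link_gbps: float) -> float:
    """Hours of link time needed to send one day's imagery to the ground."""
    compressed_bits = raw_tb_per_day * 1e12 * 8 / compression
    seconds = compressed_bits / (link_gbps * 1e9)
    return seconds / 3600

hours = downlink_hours(30, 2.5, 1.2)
print(f"{hours:.1f} hours of link time per day")
# → 22.2 hours of link time per day
```

Even with compression, a single day’s imagery can consume nearly a full day of link time – which is why moving the processing into orbit, and sending down only results, is so attractive.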

It’s therefore easy to imagine a similar framework being applied between data centres in geostationary orbit, LEO observation satellites and ground stations. “I see it as an iterative process where you first build one, then build 100 or 1,000 and so on, until you have an ever-growing amount of capacity to service a growing sector,” he says.

One key function Ward anticipates outsourcing to this array is change analysis. Having AI systems aboard satellites process subtle, hourly changes in Earth observation data would lead to vast efficiencies in how we use such information to monitor the destructive effects of climate change, among other events.

Japanese telecom giant NTT is also working on designing orbital data centres, the first of which is due to launch by early 2025. Its plans are more scaled-back than OrbitsEdge’s, in that single satellites will be tasked with both storing and processing data – significantly cutting the time in which they could communicate with ground stations in an emergency. NTT has also said that its data centres will be powered by photonic chips that allow for lower power consumption and a greater ability to resist solar radiation.

Who will use a space data centre?

The logic behind orbital data centres, explains Ward, is irresistible. Over the course of a generation, “we will see data centres moving normal operations to space,” he says, motivated in large part by a calculus that maintaining these hubs in a vacuum is much more affordable than paying for power and rental costs here on Earth. Some of the first tempted beyond the stratosphere, he adds, are likely to be those companies managing such facilities in notoriously expensive locations like New York and London.

Still, concerns remain about the practicality of such an operation. Upfront costs, for example, remain a significant issue. Right now, the most affordable options entail paying $2,000 per kilogram launched, a pricey proposition for orbital facilities likely to weigh several tonnes. Those costs are expected to fall significantly, however, once SpaceX, Elon Musk's space enterprise, expands its launch capacity by debuting its Starship launch vehicle in 2023. 
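As a rough illustration of those upfront numbers: at $2,000 per kilogram, even a modest facility is a multi-million-dollar launch. The five-tonne mass and the lower projected price in the sketch below are assumptions for the sake of the example, not figures from the article:

```python
# Illustrative launch-cost arithmetic based on the $2,000/kg figure above.
# The 5-tonne facility mass and the projected $500/kg price are assumptions.

def launch_cost_usd(mass_kg: float, price_per_kg: float) -> float:
    """Upfront cost of putting a payload of the given mass into orbit."""
    return mass_kg * price_per_kg

today = launch_cost_usd(5_000, 2_000)     # a 5-tonne data centre at current prices
projected = launch_cost_usd(5_000, 500)   # the same facility if prices fall to $500/kg
print(f"today: ${today:,.0f}, projected: ${projected:,.0f}")
# → today: $10,000,000, projected: $2,500,000
```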

Physical risks to orbital data centres must also be considered. While space is devoid of earthquakes and atmospheric phenomena, satellites are always in danger of being struck by micrometeorites, engulfed by geomagnetic storms, or destroyed in collisions with other orbital assets. Nation-states are also waking up to this reality.

The UK government, for example, recently announced new regulations designed to mitigate against space debris, including new requirements on de-orbiting satellites as they reach the end of their lives and ensuring they carry enough on-board fuel to conduct emergency manoeuvres to avoid collisions. 

More of these initiatives should be expected from nation-states as LEO gets more crowded. Over the next few years, SpaceX alone plans to launch 13,000 internet satellites, while Amazon hopes to send more than 3,000 into orbit as part of its Kuiper internet service.

Governments from the EU to China, meanwhile, are also considering mega-constellations of satellites. As such, there is a heightened danger of frequency clashing and signal degradation as all of these satellites fight to be heard by ground stations – potentially ruling out orbital data centres before they’ve even been launched.

Ward himself concedes that these risks will need to be tackled before any data centre megastructures get launched into orbit. “We have to send test devices to space first,” he says. NTT, meanwhile, doesn’t expect to have an operational data centre in space before 2026.

As such, while space represents a tempting prospect for data centre operators afflicted with rising rental and energy costs, it may be several years yet before their dreams of orbital arrays get off the ground.
