
#92 - Underwater Data Centres



The Rising Cost of Storing the World’s Data


I found myself curious this week about underwater data centres, mostly because I kept seeing references to cooling challenges in modern computing. A quick dive into the numbers made the topic feel important:


  • Data centres used roughly 1.5% of global electricity in 2024 (IEA).

  • Demand is expected to more than double by 2030, driven by AI and cloud services.

  • Cooling remains one of the biggest operational challenges, especially as AI hardware generates far higher heat loads.

  • Some facilities still struggle to dissipate waste heat efficiently, particularly in dense urban areas (Reuters).


What surprised me is how tight the margins are becoming. We often talk about chips and models, but rarely about the physical limits (energy, heat, and space) that underpin the whole system.


That curiosity pulled me into the world of subsea data centres this week: a practical idea that sidesteps many of these constraints by using the ocean’s natural thermal stability.


The Problem: Heat, Energy, and Physical Limits


Once you look at the numbers, it’s clear the current model for data centres is running into hard physical limits.


  • Electricity demand: Data centres used around 415 TWh of electricity in 2024, about 1.5% of global consumption, and their demand has been growing at roughly 12% per year since 2017. 


  • Projected growth: The IEA projects this could more than double to ~945 TWh by 2030, roughly 3% of global electricity use, with AI the main driver. 


  • Power density spike: Traditional CPU racks sat around 5–15 kW per rack, but AI GPU racks are now hitting 20–40 kW, and average densities are expected to rise towards 30 kW per rack by 2027. 


  • Cooling burden: Managing that heat is expensive; in some facilities, up to 40% of total energy use goes into cooling systems rather than compute.


  • Grid and siting constraints: The IEA notes that around 20% of planned data-centre projects are already facing delays due to grid strain, while reports from Europe and North America point to slowed rollout and rising outage risks as power networks struggle to keep up.


So we’re not just dealing with “big buildings that use a lot of power”; we’re dealing with rapidly rising electricity demand, sharply higher heat density per rack, and grids that are starting to push back (a quick sanity check below shows the scale). That’s the backdrop against which subsea data centres start to look less like sci-fi and more like a serious engineering option.
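
To make the scale concrete, here’s a quick back-of-the-envelope sketch in Python. The 415 TWh and ~945 TWh figures are the IEA numbers quoted above; the rack power values are taken from the rough ranges mentioned earlier and are purely illustrative.

```python
# Quick sanity check on the IEA figures quoted above.
demand_2024_twh = 415
demand_2030_twh = 945
years = 2030 - 2024

# Implied compound annual growth rate from 2024 to 2030.
cagr = (demand_2030_twh / demand_2024_twh) ** (1 / years) - 1
print(f"Implied annual growth: {cagr:.1%}")  # ~14.7% per year

# Heat density: a 30 kW AI rack versus a 10 kW CPU rack (mid-range of 5-15 kW).
# Every kilowatt drawn by the hardware becomes a kilowatt of heat to remove.
cpu_rack_kw = 10
ai_rack_kw = 30
print(f"Heat per rack, AI vs CPU: {ai_rack_kw / cpu_rack_kw:.0f}x")  # 3x
```

Roughly 15% compound growth per year, and about three times the heat per rack: neither is something existing cooling plants were sized for.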


Solution: Underwater Data Centres


Subsea data centres are essentially sealed, cylindrical server pods placed on the seafloor or anchored just below the surface. They’re filled with nitrogen to prevent corrosion, linked to shore by cable for power and fibre for data, and cooled passively by the surrounding seawater. Once deployed, they can operate autonomously for years with minimal maintenance, which is what makes them so interesting as an alternative to land-based builds.
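
To get a feel for why passive seawater cooling works, here’s a minimal sketch using Newton’s law of cooling (Q = h · A · ΔT). The pod dimensions, hull temperature, and heat-transfer coefficient are illustrative assumptions rather than Project Natick specifications, and real designs move heat out through internal heat exchangers rather than relying on the bare hull.

```python
import math

# Minimal sketch of passive heat rejection from a sealed pod to seawater,
# using Newton's law of cooling: Q = h * A * dT.
# Every number below is an illustrative assumption, not a Natick spec,
# and the internal coolant loop is ignored (the hull is treated as one
# uniform warm surface).

pod_length_m = 12.0        # assumed cylinder length
pod_diameter_m = 3.0       # assumed cylinder diameter
h_w_per_m2_k = 500.0       # assumed convective coefficient for moving seawater
hull_temp_c = 35.0         # assumed hull surface temperature
seawater_temp_c = 10.0     # assumed seawater temperature at modest depth

area_m2 = math.pi * pod_diameter_m * pod_length_m  # lateral surface only
delta_t_k = hull_temp_c - seawater_temp_c

heat_rejected_kw = h_w_per_m2_k * area_m2 * delta_t_k / 1000
print(f"Hull area: {area_m2:.0f} m^2")
print(f"Heat shed to seawater: ~{heat_rejected_kw:.0f} kW")
# With these assumptions the hull alone sheds heat on the order of a megawatt,
# comfortably more than a pod's worth of racks would produce.
```

The point of the sketch is less the exact number than the fact that deep, moving seawater keeps ΔT large and stable all year, which is what removes the need for chillers in the first place.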


By using the ocean’s natural conditions, they address many of the core constraints facing conventional data infrastructure. The evidence is surprisingly strong:


  • Natural cooling drastically reduces energy overhead — The seawater environment acts as a “massive, free heat sink,” allowing servers to dissipate heat without energy-intensive chillers or evaporative cooling.


  • Better energy efficiency / near-optimal PUE — Microsoft’s Project Natick modules achieved a Power Usage Effectiveness close to 1.07, far better than the industry average for land-based centres (a rough comparison is sketched below).


  • Improved hardware reliability — Natick’s two-year deployment showed underwater servers were up to 8× more reliable than equivalent land-based servers thanks to stable temperatures, no oxygen, and fewer human interventions.


  • Reduced land and grid constraints — Subsea centres avoid issues around land acquisition, zoning, and local grid saturation. And with around 40% of the global population living within 100 km of a coast, offshore sites can offer lower-latency connections.


  • Minimal water consumption — Unlike many land data centres that rely heavily on fresh-water-intensive cooling, subsea systems use the ocean’s ambient temperatures directly, eliminating onshore water demand.


  • Synergy with offshore renewables — Emerging designs link subsea modules to offshore wind, wave, or tidal power, enabling genuinely low-carbon, distributed compute clusters.


In short: subsea data centres convert the ocean’s physical advantages (stable temperatures, abundant space, and coastal proximity) into a tangible engineering solution for the heat, energy, and grid pressures now limiting on-land data-centre growth.
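
As a rough illustration of what a PUE of 1.07 means in practice, here’s a short comparison against a land-based facility. The 1.07 figure is the Project Natick number quoted above; the 1.56 “industry average” and the 10,000 MWh IT load are assumptions chosen only for illustration.

```python
# PUE = total facility energy / IT equipment energy.
# A PUE of 1.07 means only ~7% of extra energy on top of the compute itself.

def overhead_energy(it_load_mwh: float, pue: float) -> float:
    """Energy spent on cooling, power conversion, etc. for a given IT load."""
    return it_load_mwh * (pue - 1)

it_load_mwh = 10_000      # illustrative annual IT load
subsea_pue = 1.07         # Project Natick figure quoted above
land_pue = 1.56           # assumed typical land-based average

subsea_overhead = overhead_energy(it_load_mwh, subsea_pue)   # 700 MWh
land_overhead = overhead_energy(it_load_mwh, land_pue)       # 5,600 MWh

print(f"Subsea overhead: {subsea_overhead:,.0f} MWh")
print(f"Land overhead:   {land_overhead:,.0f} MWh")
print(f"Reduction in overhead: {1 - subsea_overhead / land_overhead:.0%}")
```

Under these assumptions, most of the non-compute overhead simply disappears: the “cooling burden” line item from earlier is, in effect, handed off to the ocean.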


Conclusion: A Pragmatic Look at What Comes Next


Subsea data centres aren’t a silver bullet, and it’s important to acknowledge the limitations. Retrieval and maintenance are difficult and expensive; any hardware refresh requires lifting an entire module back to the surface.


Deployments also depend on reliable subsea cabling, stable seabed conditions and thorough environmental assessments. And although early trials reported minimal ecological impact, scaling will require continued, transparent monitoring to keep that confidence.


A few of the key drawbacks include:


  • Maintenance complexity: entire pods must be resurfaced for repairs or upgrades.

  • Deployment cost: offshore operations and vessels raise capex.

  • Environmental scrutiny: even low-impact systems will face regulatory hurdles.

  • Operational rigidity: a sealed pod leaves little room for rapid reconfiguration.


Yet the broader context matters. As compute demand accelerates and on-land facilities hit limits around heat density, grid capacity and land availability, using naturally cold and thermally stable seawater starts to feel less like a moonshot and more like a practical alternative.


In reality, subsea modules won’t replace hyperscale campuses, but they could become one of several viable pathways, especially for coastal edge compute, AI inference workloads, or regions where cooling is the dominant constraint.


If cooling and electricity availability continue to shape the future of AI infrastructure, then the ocean, quiet, stable and already threaded with fibre, may play a far larger role than most people expect.


Stay tuned!

H

