As AI capabilities expand, an uncomfortable truth comes into sharper focus: the infrastructure powering artificial intelligence is approaching fundamental limits. Training frontier models already consumes gigawatt-hours of electricity, with next-generation systems projected to require 10-100× more energy. Data centers now account for 1-2% of global electricity consumption, and cooling alone represents 30-40% of operational costs.
At Bankline, an AI research company dedicated to advancing the foundations of artificial intelligence, we've spent considerable time thinking about what responsible scaling looks like. Not just in terms of model architectures or training efficiency, but in terms of the physical substrate that makes AI possible at all.
This led us to an unconventional question: what if we've been building AI infrastructure in fundamentally the wrong places?
Space vs. Ocean: A Thermodynamic Reality Check
Recent proposals for space-based data centers capture the imagination: unlimited solar power, no land-use conflicts, dramatic symbolism. But when our team of experts in thermodynamics, ocean engineering, and AI systems architecture examined the physics, we encountered an insurmountable constraint.
The Stefan-Boltzmann law governing heat radiation in vacuum reveals a stark limitation: radiative cooling in space is approximately 3,000 times less efficient than convective water cooling. For a 100MW hyperscale AI training facility, you'd need radiator arrays the size of multiple football fields. The mass and deployment complexity make this approach economically and physically prohibitive.
Water, by contrast, provides convective heat transfer coefficients of 1,000-15,000 W/m²K, compared with an effective radiative transfer coefficient of roughly 5 W/m²K in space. This isn't a marginal difference; it's a different category of solution entirely.
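The gap is easy to see with a back-of-envelope calculation. The sketch below compares the surface area needed to reject 100MW of waste heat radiatively versus convectively; the 350K radiator temperature, 0.9 emissivity, 20K water-side temperature difference, and mid-range convective coefficient are illustrative assumptions, not design values:

```python
# Back-of-envelope: radiator area in space vs. heat-exchanger area in water
# for rejecting 100 MW of waste heat. All parameters are illustrative
# assumptions, not design values.

SIGMA = 5.670374419e-8  # Stefan-Boltzmann constant, W/m^2 K^4

def radiative_flux(t_radiator_k, emissivity=0.9, t_sink_k=3.0):
    """One-sided radiative heat flux (W/m^2) to deep space."""
    return emissivity * SIGMA * (t_radiator_k**4 - t_sink_k**4)

def convective_flux(h_w_m2k, delta_t_k):
    """Convective heat flux (W/m^2): Newton's law of cooling, q = h * dT."""
    return h_w_m2k * delta_t_k

heat_load_w = 100e6  # assumed 100 MW facility
a_space = heat_load_w / radiative_flux(350.0)           # ~350 K radiator
a_ocean = heat_load_w / convective_flux(5_000.0, 20.0)  # mid-range h, 20 K dT

print(f"Radiator area in space:       {a_space:,.0f} m^2")
print(f"Heat-exchanger area in water: {a_ocean:,.0f} m^2")
print(f"Ratio: {a_space / a_ocean:.0f}x")
```

With these mid-range assumptions the water-side area is over two orders of magnitude smaller; pushing the convective coefficient toward the 15,000 W/m²K end of the range widens the gap further.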
As our technical analysis concluded: "Space is for connectivity. The ocean is for compute."
Project CELSIUS: A First Step Toward Sustainable AI Infrastructure
Recognizing that AI companies have a responsibility to help solve—not exacerbate—global energy challenges, Bankline has taken the initiative to develop Project CELSIUS (Compute Engine Leveraging Sustainable Immersive Underwater Systems) as a comprehensive research framework for deep-ocean AI infrastructure.
This wasn't a solitary effort. We assembled a multidisciplinary team spanning:
- Ocean thermal engineering experts who understood OTEC (Ocean Thermal Energy Conversion) systems
- Marine structural engineers experienced with cyclone-resistant offshore platforms
- AI systems architects who knew the real-world demands of training infrastructure
- Environmental scientists focused on protecting marine ecosystems
- Economic modelers who could assess long-term viability
Critically, we used AI itself throughout this research—employing advanced models to explore design spaces, run thousands of thermodynamic simulations, identify failure modes we might have missed, and challenge our own assumptions through adversarial analysis.
The Core Insight: Turn Cooling from Cost to Resource
Traditional data centers treat cooling as a necessary evil—a massive operational expense that consumes water, energy, and capital. Project CELSIUS inverts this relationship entirely.
By locating AI compute infrastructure in the Gulf of Mannar, India, we can exploit a natural 24°C temperature difference between warm surface waters (29°C) and cold deep water (5°C) drawn from 1,000m depth. This enables:
Ocean Thermal Energy Conversion (OTEC): The thermal gradient drives a power cycle that generates electricity with zero fuel cost. The 3.2% thermal efficiency looks modest, but the relevant comparison isn't with fossil-fuel plants; it's with data centers that pay continuously for grid power.
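The 3.2% figure sits where thermodynamics says it should. A quick check against the Carnot limit for the 29°C/5°C temperature pair (the temperatures are from the text; reading 3.2% as roughly 40% of Carnot is our interpolation, in line with closed-cycle OTEC literature):

```python
# Carnot efficiency limit for the Gulf of Mannar thermal gradient:
# warm surface water at 29 C, cold deep water at 5 C (from the text).

T_HOT_K = 29.0 + 273.15   # warm surface water
T_COLD_K = 5.0 + 273.15   # deep water at ~1,000 m

eta_carnot = 1.0 - T_COLD_K / T_HOT_K
print(f"Carnot limit: {eta_carnot:.1%}")   # ~7.9%

# A real OTEC plant recovers only part of this after heat-exchanger
# losses and pumping parasitics; 3.2% is roughly 40% of Carnot.
eta_plant = 0.032
print(f"Fraction of Carnot achieved: {eta_plant / eta_carnot:.0%}")
```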
Direct seawater cooling: The 5°C deep water becomes an effectively unlimited heat sink, achieving a Power Usage Effectiveness (PUE) of 1.05, compared to 1.4-1.6 for terrestrial facilities and 2.1+ for theoretical orbital systems.
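Since PUE is simply total facility power divided by power delivered to IT equipment, the quoted figures translate directly into overhead power. A minimal sketch, assuming a hypothetical 100MW IT load:

```python
def overhead_mw(it_load_mw, pue):
    """Non-IT draw (cooling, power conversion, etc.) implied by a PUE."""
    return it_load_mw * (pue - 1.0)

it_mw = 100.0  # assumed IT load for illustration
for label, pue in [("CELSIUS (seawater)", 1.05),
                   ("terrestrial", 1.50),
                   ("orbital (theoretical)", 2.10)]:
    print(f"{label}: PUE {pue} -> {overhead_mw(it_mw, pue):.0f} MW overhead")
```

At this scale, the difference between a PUE of 1.05 and 1.5 is tens of megawatts of continuous cooling load that never performs a single computation.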
Water production instead of consumption: Conventional data centers consume roughly 1.8 liters of water per kWh. Our hybrid-cycle OTEC system instead produces 5.5 million liters of freshwater per day as a byproduct, sufficient to supply coastal communities while training AI models.
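The swing from water consumer to water producer can be quantified. A sketch assuming a 100MW facility running around the clock (the facility size is our assumption; the 1.8 L/kWh and 5.5 million L/day figures are from the text):

```python
# Daily water balance: conventional evaporative cooling vs. CELSIUS.

IT_LOAD_KW = 100_000           # assumed 100 MW facility
HOURS_PER_DAY = 24
WATER_USE_L_PER_KWH = 1.8      # conventional consumption (from the text)
OTEC_OUTPUT_L_PER_DAY = 5.5e6  # desalinated byproduct (from the text)

consumed = IT_LOAD_KW * HOURS_PER_DAY * WATER_USE_L_PER_KWH
swing = consumed + OTEC_OUTPUT_L_PER_DAY  # water saved plus water produced

print(f"Conventional consumption: {consumed / 1e6:.2f} million L/day")
print(f"CELSIUS production:       {OTEC_OUTPUT_L_PER_DAY / 1e6:.2f} million L/day")
print(f"Net daily swing:          {swing / 1e6:.2f} million L/day")
```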
India's Bathymetric Advantage
Geography matters. India possesses an advantage rare among major economies: an exceptionally steep continental slope, enabling access to 1,000m depth within just 30km of shore.
This proximity to deep cold water, combined with year-round warm surface temperatures, positions India to offer AI training services at dramatically lower environmental and economic cost than facilities in the US, Europe, or elsewhere. The strategic implications extend beyond any single company—this could establish India as a global hub for AI infrastructure.
Confronting the Hard Problems
What distinguishes serious research from speculation is honest engagement with failure modes. Our red team analysis identified four critical vulnerabilities:
The razor-thin energy margin: Operating at 3.2% efficiency leaves minimal room for error. Our solution: ammonia backup turbines providing true black-start capability, using the existing OTEC working fluid rather than adding cryogenic hydrogen storage.
Maintenance in a moving ocean: Repairing servers underwater during storms isn't feasible. Our solution: N+20% redundant capacity and a moonpool cartridge system that only performs physical swaps during calm weather windows.
Vortex-induced vibration in the 1,000m cold water pipe: A rigid pipe acts like a guitar string vulnerable to resonant failure. Our solution: segmented "vertebrae" design with flexible couplings that prevent system-wide resonance.
Operating in a UNESCO Biosphere Reserve: Technical solutions alone can't overcome regulatory opposition. Our solution: position the facility as a desalination plant that trains AI as a byproduct, with rigorous adherence to marine life protection standards.
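The screening calculation behind the cold-water-pipe concern uses the Strouhal relation: vortices shed at a frequency f = St·U/D, and trouble starts when that frequency approaches a structural mode. A sketch with illustrative values (the current speeds, 4m pipe diameter, and modal frequency are assumptions, not design values):

```python
# Vortex-shedding frequency screening for the cold-water pipe,
# using the Strouhal relation f_shed = St * U / D.

STROUHAL = 0.2   # typical for circular cylinders at high Reynolds numbers

def shedding_hz(current_m_s, pipe_diameter_m):
    return STROUHAL * current_m_s / pipe_diameter_m

PIPE_D_M = 4.0    # assumed pipe diameter
F_MODE_HZ = 0.03  # assumed first bending mode of a rigid 1,000 m pipe

for u in (0.25, 0.5, 1.0):  # plausible ocean current speeds, m/s
    f = shedding_hz(u, PIPE_D_M)
    # Within ~30% of a structural mode -> lock-in (resonance) risk
    at_risk = abs(f - F_MODE_HZ) / F_MODE_HZ < 0.3
    print(f"U={u} m/s -> f_shed={f:.3f} Hz, lock-in risk: {at_risk}")
```

Because current speed varies continuously, a rigid pipe with a single dominant mode will inevitably pass through the danger band; the segmented vertebrae design shifts and damps the modes so no one frequency is excited along the whole pipe at once.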
Environmental Responsibility as Design Constraint
We took seriously the fact that the Gulf of Mannar is protected dugong habitat. This meant:
- Velocity cap intakes that trigger fish escape reflexes (complying with US Clean Water Act Section 316(b) standards)
- Deep discharge at 70m to prevent thermal impacts on coral reefs
- Multi-layer biofouling prevention using proven industrial technologies
- Third-party environmental monitoring with public transparency
- 7.56 million tonnes of CO2 avoided over the 30-year design life
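The CO2 figure can be cross-checked against the power it implies. A sketch assuming a grid carbon intensity of 0.71 kg CO2/kWh for the displaced Indian grid electricity (the intensity is our assumption; the avoided-emissions total and design life are from the text):

```python
# Sanity check: what average displaced load does 7.56 Mt of avoided CO2
# over 30 years imply, at an assumed grid carbon intensity?

CO2_AVOIDED_KG = 7.56e9   # 7.56 million tonnes (from the text)
DESIGN_LIFE_YR = 30
GRID_KG_PER_KWH = 0.71    # assumed Indian grid carbon intensity
HOURS_PER_YEAR = 8760

kwh_per_year = CO2_AVOIDED_KG / DESIGN_LIFE_YR / GRID_KG_PER_KWH
avg_mw = kwh_per_year / HOURS_PER_YEAR / 1000.0
print(f"Implied average grid load displaced: {avg_mw:.0f} MW")
```

Under this assumption the figure corresponds to roughly 40MW of continuously displaced grid load, consistent in scale with the 50MW commercial facility in the project roadmap.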
The water we produce isn't primarily a profit center—it's what we call the "freshwater bribe," offered to local communities at subsidized rates to build genuine social license for the project.
What This Means for AI Development
Project CELSIUS targets an 8,000 GPU facility with 31.7 exaFLOPS of aggregate compute—capable of training a GPT-4 scale model in 114 days. The levelized cost of energy would be $0.043/kWh, roughly 50% below grid-connected alternatives.
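The training-time claim follows from aggregate throughput and realized utilization. This sketch reproduces the shape of the calculation rather than the exact 114-day figure: the 2×10^25 total-FLOP budget is a commonly cited public estimate for GPT-4-scale training, and the 30% realized utilization is our assumption.

```python
# Back-of-envelope training time from aggregate peak throughput.

AGG_PEAK_FLOPS = 31.7e18  # 31.7 exaFLOPS across 8,000 GPUs (from the text)

def training_days(total_flops, peak_flops, utilization):
    """Days needed to deliver total_flops at a given fraction of peak."""
    seconds = total_flops / (peak_flops * utilization)
    return seconds / 86_400

# Assumed inputs: ~2e25 FLOPs for a GPT-4-scale run, 30% of peak realized.
days = training_days(2e25, AGG_PEAK_FLOPS, 0.30)
print(f"Estimated training time: {days:.0f} days")
```

Reproducing the 114-day figure would require a larger FLOP budget or lower realized utilization than these assumptions; the point is that the formula, not any single input, carries the estimate.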
But the significance extends beyond any single training run. If ocean-based infrastructure proves viable, it opens a scaling pathway that isn't constrained by:
- Grid capacity bottlenecks
- Freshwater scarcity
- Land use conflicts
- Carbon intensity of regional electricity
- Geopolitical concentration of AI infrastructure
The Path Forward
We're releasing this technical framework publicly because solving AI's infrastructure challenges requires collaboration across government, industry, academia, and civil society. Bankline is an AI research company, not an ocean engineering firm—we developed CELSIUS to demonstrate feasibility and catalyze action, not to monopolize the approach.
The roadmap we've outlined moves from a 1MW pilot to a 50MW commercial facility to eventually a 500MW+ network. Each phase provides validation points and off-ramps if fundamental assumptions prove wrong.
We expect—and welcome—vigorous technical criticism. The peer review process for this work should be as rigorous as for any AI safety research, because the environmental and social stakes are comparable.
A Different Kind of Moonshot
There's a tendency in technology to conflate ambition with altitude—to assume the hardest problems require leaving Earth. Project CELSIUS suggests otherwise.
The ocean covers 71% of our planet and contains thermal resources orders of magnitude larger than human energy consumption. The engineering challenges are formidable: thousand-meter pipes, cyclone-resistant platforms, marine life protection, international waters governance. But unlike the thermodynamics of space-based cooling, none of these challenges violate physical law.
At Bankline, we believe AI research companies have a responsibility to think beyond the next model release toward the infrastructure that makes continued progress possible at all. Project CELSIUS represents our first step in that direction—developed through collaboration with experts across disciplines and with AI as a research partner throughout the process.
The future of intelligence may indeed be underwater. But it will only get built if we're willing to do the hard, unglamorous work of rigorous engineering in service of genuine sustainability.
