When Nvidia CEO Jensen Huang took the stage at CES 2026, he made an announcement that seemed counterintuitive: the company's next-generation Rubin supercomputers would be cooled with hot water. Not chilled water. Not liquid nitrogen. Hot water—up to 60°C (140°F).
It sounds absurd. But in the high-stakes race to power artificial intelligence at scale, hot water cooling might be the breakthrough that saves the industry billions of dollars and prevents an impending energy crisis.
The Data Center Energy Crisis Is Here
Hyperscale AI data centers have become engineering marvels—and power-hungry monsters. The largest facilities being built today consume more than a gigawatt of electricity, enough to power entire cities. Inside, hundreds of thousands of specialized chips like Nvidia's H100 GPUs run around the clock, training and serving large language models at mind-bending scale.
The problem? These densely packed chips generate so much heat that traditional air conditioning can't keep up. The result: a desperate search for cooling solutions that won't bankrupt operators or melt the grid.
Tech giants are pouring hundreds of billions of dollars into this infrastructure. Amazon, Google, Microsoft, Meta, and OpenAI are in an arms race to build the most powerful AI compute clusters. But the impressive computing power comes at a staggering cost: more than half of the electricity feeding these facilities still comes from fossil fuels, while renewables meet just over a quarter of demand.
Liquid Cooling: The New Standard
Air cooling is dead. When chips are packed so densely that a single rack can draw 100 kilowatts or more, forced-air systems can't dissipate heat fast enough. The industry has turned to liquid cooling: mounting chips to cold plates or immersing entire servers in baths of dielectric fluid.
Liquid cooling isn't new. High-performance computing has used it for years. What's new is the scale—and the realization that colder isn't always better.
The Cold Water Trap
Traditional liquid cooling uses chilled water, often around 10-15°C (50-59°F). But holding water at that temperature requires energy-intensive mechanical chillers, which can account for as much as 40% of a data center's total power consumption.
Here's where the physics gets interesting: what matters is not the absolute temperature of the coolant but the delta, the difference between the liquid temperature and the chip temperature. A chip running at 80°C can be effectively cooled by water at 60°C, as long as the flow rate and heat-exchange surfaces are sized for it, because the heat the loop carries away scales with flow rate times the temperature rise across it.
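To make that concrete, here's a back-of-envelope sketch. The rack load, supply temperature, and temperature rise below are illustrative assumptions, not vendor specifications:

```python
# Back-of-envelope: how much 60 °C water does a 100 kW rack need?
# Heat carried by the loop: Q = m_dot * c_p * (T_out - T_in).
# All inputs are illustrative assumptions, not vendor specs.

CP_WATER = 4186            # specific heat of water, J/(kg·K)
RACK_LOAD_W = 100_000      # a 100 kW rack, per the density figures above
T_IN_C = 60.0              # supply temperature in a hot-water loop
T_OUT_C = 70.0             # assumed return temperature, a 10 K rise

flow_kg_s = RACK_LOAD_W / (CP_WATER * (T_OUT_C - T_IN_C))
flow_l_min = flow_kg_s * 60      # roughly 1 kg of water per litre

print(f"Required flow: {flow_kg_s:.1f} kg/s (~{flow_l_min:.0f} L/min)")
# -> Required flow: 2.4 kg/s (~143 L/min): modest plumbing, no chiller needed.
```

In other words, a 60°C supply is perfectly workable; it simply dictates the flow rate and the heat-exchanger sizing.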
Nvidia's Rubin Platform: Hot Water, Cool Economics
Nvidia's Rubin platform, announced for second-half 2026 availability, is designed for 100% liquid cooling using water at temperatures up to 60°C. Because 60°C water is hotter than the outdoor air almost anywhere on Earth, the heat can be rejected straight to the environment through dry coolers, with no refrigeration cycle required. This seemingly small shift has massive implications:
- Energy savings: Eliminating chillers can reduce cooling energy consumption by 40% or more
- Waste heat recovery: Hot water leaving the system can be used for district heating, office buildings, or industrial processes
- Simplified infrastructure: No chillers, no complex refrigeration loops, just pumps and heat exchangers
- Higher density: Better heat transfer enables even tighter chip packing
The result: data centers that are cheaper to build, cheaper to operate, and more environmentally sustainable, assuming they can source clean electricity in the first place. The back-of-envelope sketch below shows how quickly the savings compound.
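The campus size, PUE figures, and power price below are assumptions for illustration, not numbers from Nvidia or any operator:

```python
# Rough economics of dropping chillers. All inputs are assumptions.

IT_LOAD_MW = 100          # hypothetical AI campus IT load
PUE_CHILLED = 1.40        # assumed overhead with mechanical chillers
PUE_HOT_WATER = 1.10      # assumed overhead with pumps and dry coolers only
HOURS_PER_YEAR = 8760
USD_PER_MWH = 60          # assumed average wholesale power price

saved_mwh = IT_LOAD_MW * (PUE_CHILLED - PUE_HOT_WATER) * HOURS_PER_YEAR
saved_usd = saved_mwh * USD_PER_MWH

print(f"Saved: {saved_mwh:,.0f} MWh/yr (~${saved_usd / 1e6:.0f}M/yr)")
# -> Saved: 262,800 MWh/yr (~$16M/yr) for a single 100 MW campus.
```

Multiply that across dozens of gigawatt-class campuses, and the industry-wide stakes run into the billions.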
Beyond Water: The Cooling Frontier
Hot water is just the beginning. Researchers and operators are experimenting with increasingly exotic cooling solutions:
Immersion Cooling
Some data centers are dunking entire servers in non-conductive fluids like 3M Novec or mineral oil. The fluid absorbs heat directly from components, then passes through heat exchangers to dissipate the thermal load. Benefits: silent operation, no fans, and even higher density than cold plates.
Seawater Cooling
A Portuguese startup is building a data center cooled by Atlantic Ocean water, piped directly from offshore. The concept eliminates the need for freshwater—a critical consideration as AI data centers compete with agriculture and residential use for scarce water resources.
Space-Based Data Centers
Google has published research on solar-powered AI compute clusters in orbit, where near-constant sunlight powers the hardware and waste heat is radiated to deep space. The proposal faces enormous technical and economic hurdles, including radiation hardening, data latency, launch costs, and the sheer size of the radiators required, but it signals just how desperate the industry is for sustainable solutions.
The Community Cost
While tech companies tout efficiency gains, the communities hosting these facilities are facing a darker reality:
- Soaring energy bills: Residential rates are climbing in regions with heavy data center concentration
- Water shortages: Evaporative cooling systems can consume millions of gallons annually (a rough bound is sketched after this list)
- Noise pollution: Industrial fans and cooling towers generate constant droning
- Air pollution: Diesel backup generators and fossil-fueled grid electricity degrade local air quality
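The water figure is easy to bound from first principles. The sketch below assumes a hypothetical 100-megawatt heat load rejected entirely by evaporation, which overstates what real cooling towers do, so treat it as an upper bound:

```python
# Upper bound on evaporative water use for a hypothetical 100 MW heat load.
# Real towers reject only part of their heat by evaporation; this overstates.

LATENT_HEAT_J_PER_KG = 2.26e6        # energy to evaporate 1 kg of water
HEAT_LOAD_MW = 100                   # hypothetical facility heat rejection
SECONDS_PER_YEAR = 365 * 24 * 3600

kg_per_year = HEAT_LOAD_MW * 1e6 * SECONDS_PER_YEAR / LATENT_HEAT_J_PER_KG
gal_per_year = kg_per_year / 3.785   # ~1 kg per litre, 3.785 L per gallon

print(f"Upper bound: {gal_per_year / 1e6:.0f} million gallons/yr")
# -> Upper bound: 369 million gallons/yr, so "millions" is conservative.
```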
A recent study found that electricity prices in Northern Virginia—home to the world's largest concentration of data centers—have increased 23% faster than the national average since 2020.
Nuclear: The Elephant in the Server Room
Faced with insatiable power demand and climate commitments, AI giants are turning to nuclear energy. Microsoft has signed a power-purchase agreement to support restarting a reactor at Three Mile Island. Amazon is funding small modular reactor (SMR) development. Google has announced partnerships with nuclear startups.
The logic is compelling: nuclear provides 24/7 baseload power with near-zero carbon emissions. But SMRs are unproven at scale, and regulatory approval timelines stretch into the 2030s. In the meantime, data centers will continue to lean heavily on natural gas peaker plants.
The AI Scaling Dilemma
Hot water cooling and nuclear power might buy the industry a few years. But the fundamental tension remains: AI scaling laws suggest that compute requirements will continue to grow exponentially. GPT-5, rumored to require 10x the compute of GPT-4, could need multi-gigawatt training runs.
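A sanity check shows how quickly the numbers reach gigawatt territory. The cluster size and per-accelerator power below are hypothetical assumptions, not leaked specifications:

```python
# Why "multi-gigawatt training runs" is plausible arithmetic, not hyperbole.
# Cluster size and per-GPU power draw are hypothetical assumptions.

NUM_GPUS = 1_000_000      # hypothetical frontier-scale training cluster
WATTS_PER_GPU = 1_200     # accelerator plus its share of node power, assumed
PUE = 1.10                # hot-water-cooled facility overhead, assumed

facility_gw = NUM_GPUS * WATTS_PER_GPU * PUE / 1e9
print(f"Facility draw: {facility_gw:.2f} GW")
# -> Facility draw: 1.32 GW, roughly one large nuclear reactor's output.
```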
At some point, the industry will hit physical limits—not of cooling technology or chip design, but of energy availability. Unless breakthroughs in algorithmic efficiency or model architecture reduce compute requirements, the AI race may be constrained not by innovation, but by kilowatts.
What Comes Next
Nvidia's Rubin platform is entering production. Competitors like AMD and Intel are racing to match hot water cooling capabilities. Hyperscale operators are redesigning facilities from the ground up for liquid cooling infrastructure.
The hot water era is here. But so is the reckoning: can the AI industry scale sustainably, or will the energy crisis force a slowdown?
For now, the bet is on engineering innovation. Hotter water. Smarter heat recovery. Nuclear baseload. And the hope that algorithmic efficiency can keep pace with exponentially growing ambitions.
The data center of the future won't just be a building full of servers. It will be a heat engine, a power plant, and a climate challenge—all in one.