§ Insight 03 · Data Centers · National

Why data center construction is a thermal problem first

Goldman Sachs Research projects that U.S. data center power demand will grow roughly 160% by 2030 compared with 2023, one of the largest short-horizon demand increases the American electric grid has ever faced. Journalists describe this as a real-estate boom. That framing is incomplete. The decisive engineering problem in a modern data center is not the square footage. It is the heat.

A well-provisioned AI training hall consumes 30 to 100 kilowatts per rack, depending on generation. A single row of high-density racks can produce more waste heat than an entire office building. And every watt of electrical consumption becomes, almost without loss, a watt of heat that must be removed from the building: 24 hours a day, 365 days a year, without interruption.
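The arithmetic is worth making concrete. A minimal sketch (illustrative values, not a design calculation) converting a rack's electrical draw into the cooling units a mechanical engineer sizes against:

```python
# Convert rack electrical load to cooling load.
# Assumption: essentially all electrical input becomes heat (no useful
# work leaves the building), so cooling load ~= IT load.

def cooling_load(rack_kw: float) -> dict:
    watts = rack_kw * 1_000
    btu_per_hr = watts * 3.412          # 1 W = 3.412 BTU/hr
    tons = btu_per_hr / 12_000          # 1 ton of refrigeration = 12,000 BTU/hr
    return {"kW": rack_kw, "BTU/hr": round(btu_per_hr), "tons": round(tons, 1)}

for kw in (30, 50, 100):
    print(cooling_load(kw))
# A single 100 kW rack needs ~28.4 tons of cooling -- on the order of
# what serves an entire 10,000 sq ft office floor.
```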

That constraint reshapes the structure itself. Not decoratively — fundamentally.

Why air cooling is running out of room

Traditional data center cooling relies on computer room air handlers that push cold air up through a raised floor, across server inlets, and back into a return plenum. This works reliably up to roughly 20 to 30 kilowatts per rack; beyond that, the air side becomes the bottleneck. Air has a low volumetric heat capacity; you need a lot of it, moving fast, to carry even modest amounts of heat.
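The sensible-heat relation Q = ρ · V̇ · cp · ΔT makes the saturation point visible. A minimal sketch, assuming standard air properties and a typical 12 K inlet-to-outlet rise, of the airflow one rack demands as density climbs:

```python
# Airflow required to carry away rack heat: Q = rho * V_dot * cp * dT
# Assumptions: air density 1.2 kg/m^3, cp 1005 J/(kg*K), and a 12 K
# temperature rise across the servers -- typical hot/cold aisle values.

RHO, CP, DT = 1.2, 1005.0, 12.0
M3S_TO_CFM = 2118.88  # 1 m^3/s = 2118.88 cubic feet per minute

def required_airflow_cfm(rack_kw: float) -> float:
    v_dot = rack_kw * 1_000 / (RHO * CP * DT)  # volumetric flow, m^3/s
    return v_dot * M3S_TO_CFM

for kw in (10, 30, 50, 100):
    print(f"{kw:>4} kW rack -> {required_airflow_cfm(kw):>8,.0f} CFM")
# 10 kW is manageable (~1,500 CFM); 100 kW demands ~14,600 CFM through a
# single rack footprint -- far beyond what a standard rack face can pass
# without unacceptable pressure drop and noise.
```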

GPU-heavy AI workloads have already crossed that line. NVIDIA H100, H200 and B200 systems in dense configurations routinely push past 50 kilowatts per rack and are trending toward 100+. At those densities, no practical volume of moving air can keep up.

Liquid cooling changes the building

Water carries roughly 3,500 times more heat per unit volume than air, and even the dielectric fluids used in immersion systems manage well over 1,000 times (the sketch after the list below puts numbers on it). That is why the industry is converging on two liquid-based approaches:

  • Direct-to-chip cooling: cold plates mounted on GPUs and CPUs, with manifolds feeding facility-level cooling distribution units (CDUs) that exchange heat with a building chilled-water loop.
  • Immersion cooling: entire servers submerged in dielectric fluid, with the fluid flowing through heat exchangers to reject heat to the building side.
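A minimal comparison, using textbook properties for air and water (a dielectric fluid lands between them), of the volumetric flow each needs to move 50 kW at the same temperature rise:

```python
# Volumetric flow needed to carry 50 kW at a 12 K rise, air vs water.
# Q = (rho * cp) * V_dot * dT  ->  V_dot = Q / (rho * cp * dT)

Q, DT = 50_000.0, 12.0                 # watts, kelvin
media = {
    "air":   1.2 * 1005.0,             # volumetric heat capacity, J/(m^3*K)
    "water": 998.0 * 4182.0,
}
for name, rho_cp in media.items():
    v_dot = Q / (rho_cp * DT)
    print(f"{name:>5}: {v_dot*1000:>10,.2f} L/s  ({rho_cp/media['air']:,.0f}x air capacity)")
# Air needs ~3,455 L/s of moving volume; water needs ~1 L/s. A garden-hose
# trickle does the work of a wind tunnel.
```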

Both approaches relocate thousands of gallons of cooling fluid into the server hall. That has four immediate construction consequences:

1. Structural loading

Liquid-cooled racks are heavier than air-cooled racks, and immersion tanks can approach one thousand pounds per linear foot of floor. Floor-slab structural design, often delegated to a late-stage owner’s engineer in air-cooled facilities, becomes a primary design driver. Post-tensioned slabs, thicker topping slabs or structural steel platform assemblies all become live decisions at concept stage.
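For scale, a rough floor-load estimate for one filled immersion tank; the dimensions and masses below are illustrative assumptions, and vendor data governs in practice:

```python
# Rough floor loading for a filled immersion tank.
# Assumptions (illustrative, not vendor data): 700 L of dielectric fluid
# at ~850 kg/m^3, 500 kg of IT hardware, 300 kg of tank structure,
# over a 2.4 m x 0.9 m footprint.

KG_TO_LB, M2_TO_FT2 = 2.2046, 10.764

fluid_kg = 0.700 * 850          # ~595 kg of coolant alone
total_kg = fluid_kg + 500 + 300
footprint_m2 = 2.4 * 0.9

psf = total_kg * KG_TO_LB / (footprint_m2 * M2_TO_FT2)
print(f"total weight: {total_kg*KG_TO_LB:,.0f} lb  ->  {psf:,.0f} psf")
# ~3,100 lb over ~23 sq ft -> ~130 psf of sustained load, versus the
# 50 psf live load a typical office slab is designed for.
```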

2. Leak containment

Dielectric fluid is expensive and loss of coolant is catastrophic for the servers it is cooling. Containment curbs, leak-detection cable trays and drain-to-sump designs are code-adjacent infrastructure that now belongs in the architectural drawings, not only the mechanical drawings.
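The containment arithmetic itself is simple once it is on the drawings. A minimal sketch using the common secondary-containment rule of thumb of holding at least 110% of the largest vessel; the tank volume and room areas are assumptions, and the governing code sets the real requirement:

```python
# Curb height needed to contain a spill from the largest immersion tank.
# Common secondary-containment practice: hold >= 110% of the largest
# vessel's volume within the curbed area. Verify against governing code.

largest_tank_l = 700.0          # liters of dielectric fluid (assumed)
required_m3 = 1.10 * largest_tank_l / 1000

room_area_m2 = 60.0             # curbed zone floor area (assumed)
equip_footprint_m2 = 18.0       # area displaced by tanks and skids (assumed)

curb_height_m = required_m3 / (room_area_m2 - equip_footprint_m2)
print(f"minimum curb height: {curb_height_m*1000:.0f} mm")
# ~18 mm here -- trivially shallow, which is exactly why it is cheap at
# concept stage and expensive as a retrofit saw-cut.
```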

3. Piping and routing

Running a chilled-water loop through an active server hall is non-trivial. Welded stainless-steel loops, pressure-test protocols for welded and bolted joints, and coordinated penetrations through rated partitions must be designed before the slab is poured. Retrofitting an existing air-cooled hall to liquid is possible but expensive, precisely because the building was not sized for it.
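The main's diameter is set by physics, not preference, which is why it is a concept-stage decision. A first-pass sizing sketch under assumed values for hall load, loop temperature rise and a conventional velocity cap:

```python
# First-pass chilled-water main sizing for a liquid-cooled hall.
# Assumptions: 5 MW of heat rejection, 12 K loop delta-T, and a
# 2.5 m/s velocity cap to limit erosion and noise (common practice).
import math

Q, DT = 5_000_000.0, 12.0            # watts, kelvin
RHO_CP = 998.0 * 4182.0              # water, J/(m^3*K)
V_MAX = 2.5                          # m/s

v_dot = Q / (RHO_CP * DT)            # volumetric flow, m^3/s
area = v_dot / V_MAX                 # minimum pipe cross-section, m^2
diameter_mm = 2 * math.sqrt(area / math.pi) * 1000
print(f"flow: {v_dot*1000:,.0f} L/s -> minimum pipe ID ~{diameter_mm:,.0f} mm")
# ~100 L/s calls for roughly a 225 mm (9 in) inside diameter -- a welded
# steel main, not a hose, and its routing must exist before the slab does.
```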

4. Thermal insulation and envelope

Every BTU not coming in from the outside is a BTU you don’t have to pay to remove. High-performance wall assemblies, vapor-tight membranes, thermally broken glazing at service entries and light-colored roof assemblies move directly into the core construction scope, not the aesthetic one.
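The governing arithmetic is Q = U × A × ΔT. A rough sketch of the annual cooling penalty of a mediocre wall versus a good one; the areas, U-values, energy price and COP are all assumptions, and the same arithmetic extends to roofs, doors and penetrations across a campus:

```python
# Annual cooling penalty of envelope heat gain: Q = U * A * dT.
# Assumptions: 4,000 m^2 of exterior wall, a 10 K mean indoor-outdoor
# difference over the year, $0.08/kWh, and a cooling plant COP of 4.

WALL_M2, MEAN_DT, HOURS = 4_000.0, 10.0, 8_760
PRICE_KWH, COP = 0.08, 4.0

def annual_cost(u_value: float) -> float:
    gain_kwh = u_value * WALL_M2 * MEAN_DT * HOURS / 1000   # thermal kWh/yr
    return gain_kwh / COP * PRICE_KWH                       # electric $/yr

for label, u in (("code-minimum wall, U=0.5", 0.5),
                 ("high-performance wall, U=0.15", 0.15)):
    print(f"{label:32s} -> ${annual_cost(u):,.0f}/yr")
# Modest per wall, but it recurs every year for the life of the building
# and compounds across every surface -- and it is purchased exactly once,
# at construction time.
```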

The short version: an AI-era data center is closer to a cryogenic facility or a steelmaking cooling tower than to a conventional office building. The civil and structural work follows from the thermal requirement, not the other way around.

Waterside economizers and the grid

Owners who design these buildings well can pay roughly half as much for cooling as owners who do not. The difference is dominated by waterside economizer hours: the hours during which outside air is cool enough to reject heat directly through a cooling tower without running a compressor. Northern climates win this variable easily. Florida and the Gulf Coast, notably, do not.
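Economizer hours fall straight out of hourly weather data. A minimal sketch of the counting logic; the wet-bulb series here is synthetic stand-in data, and a real study would use TMY weather files and the plant's actual setpoints and approach:

```python
# Estimate waterside-economizer hours: hours when outdoor wet-bulb is low
# enough for the cooling tower to meet the loop setpoint without compressors.
import random

random.seed(1)
# Stand-in for 8,760 hourly wet-bulb readings (deg C) from a weather file.
hourly_wet_bulb = [random.gauss(12.0, 8.0) for _ in range(8_760)]

LOOP_SETPOINT_C = 20.0      # facility water supply temperature (assumed)
TOWER_APPROACH_C = 4.0      # tower approach to wet-bulb (assumed)

free_hours = sum(1 for t in hourly_wet_bulb
                 if t + TOWER_APPROACH_C <= LOOP_SETPOINT_C)
print(f"economizer hours: {free_hours:,} of 8,760 "
      f"({free_hours/8760:.0%} of the year compressor-free)")
# Shift the same arithmetic from Minneapolis to Miami and the
# compressor-free fraction collapses -- that is the siting argument.
```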

That matters because U.S. data center siting is being reshaped around electricity and water availability. The Electric Reliability Council of Texas (ERCOT) and PJM have both issued public warnings about data center interconnection queues that exceed current generation capacity. Water rights in the desert Southwest are becoming a constraint on new construction. The hyperscalers already know this; the regional developers are learning.

What this means for American manufacturing

There is a national-policy dimension to all of this. The CHIPS Act and follow-on incentives are pushing semiconductor manufacturing back to the United States; the AI buildout is creating almost uncapped demand for facilities to house that silicon; and the resulting projects require the full stack of U.S. industrial construction, from slabs and structural steel to high-purity mechanical, from fire suppression to UPS and standby generation.

This is squarely adjacent to the steel-mill and cooling-tower work I have done in Brazil. Cooling a blast furnace and cooling a data hall are not the same task, but they are the same discipline: moving enormous quantities of heat with precision, predictability and no downtime.

If the next decade of American construction has a defining project type, it will not be the office tower or the single-family home. It will be the thermal facility — data center, chip fab, power plant — and the engineers who understand heat as a first-class structural input will be the ones building it.

References

  1. Goldman Sachs Research, AI, Data Centers and the Coming US Power Demand Surge, April 2024.
  2. International Energy Agency, Electricity 2024: Analysis and Forecast to 2026, January 2024 (data center and AI electricity demand scenarios).
  3. Uptime Institute, Global Data Center Survey 2024.
  4. American Society of Heating, Refrigerating and Air-Conditioning Engineers (ASHRAE), Thermal Guidelines for Data Processing Environments, 5th ed.

Building a thermally intense facility?

We provide civil, structural and thermal coordination for data centers, industrial cooling systems and manufacturing retrofits across the United States.

Let’s talk →