Ok follow up question here. Is there cause to be concerned that releasing tons and tons of steam into the environment that was not there before will cause other environmental impact beyond just the reduced water supply? Like… If the ambient air is cooling all that water back into rain or something will that tangibly impact temperatures, or will average humidity change? Or is that part at least too small of an impact to be particularly material?
Comment on What's the deal with AI datacenters using water for cooling?
litchralee@sh.itjust.works 1 day ago
Other commenters correctly describe the cost analysis for using evaporative cooling, but I’ll add one more reason why it’s the preferred method when water is available: evaporating water can dissipate truly outlandish amounts of heat with very few moving parts.
Harkening back to high school physics class, water – like all other substances – has a specific heat capacity, meaning the energy needed to increase the temperature of 1 kg of it by 1 degree C. Water's specific heat capacity is already quite high, at 4184 J/(kg*C), besting all the common metals and losing only to a handful of substances such as hydrogen and ammonia. In nature, this means that large bodies of water are natural moderators of temperature, because water can absorb an entire day's worth of sunlight energy without substantially changing its own temperature.
But where water really trounces the competition is its “heat of vaporization”. This is the extra energy needed for liquid water to become vapor; simply bringing water to 100 C is not sufficient to make it airborne. Water has a value of 2146 kJ/kg. Simplifying to where 1 kg of water is 1 liter of water, we can convert this unit into something more familiar: 0.596 kWh/L.
What these two physical properties of water tell us is that if our city water comes out of the pipe at 20 C, then to get it to 100 C to boil, we need the difference (80 C) times the specific heat capacity (4184 J/(kg*C)), which is 334,720 J/kg. Using the same simplification from earlier, that comes out to 0.093 kWh/L. And then to actually make the boiling liquid become a vapor (so that it'll float away), we need another 0.596 kWh/L on top of that.
Let that sink in for a moment: the energy to turn water into vapor (0.596 kWh/L) is roughly six times the energy (0.093 kWh/L) needed to raise liquid water from 20 C to 100 C. That's truly incredible for a non-toxic, life-compatible substance that we can (but should we?) safely dump into the environment. Totaling the two values, one liter of water can dissipate 0.69 kWh of energy. Nice!
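If you want to sanity-check that arithmetic yourself, here's a minimal back-of-envelope sketch using the same figures and the same 1 kg ≈ 1 L simplification as above:

```python
# Back-of-envelope check of the per-litre figures above, taking 1 kg of water as 1 L.
C_P_WATER = 4184          # J/(kg*C), specific heat capacity of liquid water
H_VAP = 2_146_000         # J/kg, the heat-of-vaporization figure used above
J_PER_KWH = 3_600_000     # joules in one kilowatt-hour

sensible = C_P_WATER * (100 - 20) / J_PER_KWH   # heat 20 C tap water up to 100 C
latent = H_VAP / J_PER_KWH                      # then actually vaporize it
print(round(sensible, 3), round(latent, 3), round(sensible + latent, 2))
# -> 0.093 0.596 0.69   (all in kWh per litre)
```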
In the context of a 100 megawatt data center (which apparently is what the industry considers the smallest “hyperscale data center”), if that facility used only evaporative cooling, the water requirement would be 144,927 L/hour. That is an Olympic-size swimming pool every 6.9 seconds. Not nice!
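The flow rate follows from the same numbers, under the simplifying assumption that evaporation carries away every watt of heat:

```python
# Extending the per-litre figure above to the 100 MW facility, assuming
# evaporation removes all of the heat (a simplification).
ENERGY_PER_LITRE_KWH = 0.69     # kWh dissipated per litre evaporated, from above
facility_kw = 100_000           # 100 MW expressed in kW

litres_per_hour = facility_kw / ENERGY_PER_LITRE_KWH   # kWh per hour divided by kWh per litre
print(round(litres_per_hour))   # -> 144928, roughly 145,000 L every hour
```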
And AI datacenters are only getting larger, with some reaching into the low single digits of gigawatts. But what is the alternative for cooling the more modest data center from earlier? The reality is that the universe only provides three forms of heat transfer: conduction, convection, and radiation. The heat from data centers cannot be concentrated into a laser and radiated into space, and we don't have some sort of underground granite mountain that the data centers can conduct their heat into. Convection is precisely the idea of storing the heat in a substance (e.g. water, air) and then jettisoning that substance.
So if we don't want to use water, then we have to use air. But on the two qualities that make water such an excellent substance for evaporative cooling, air doesn't come close – a specific heat capacity of about 1003 J/(kg*C) and no heat of vaporization, because air is already gaseous. That means we need to move ungodly amounts of air to dissipate 100 megawatts. But humanity has already invented the means to do this, using a clever structure that naturally encourages air to flow through it.
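For a rough feel of just how much air that is, here's a sketch; note the 10 C air temperature rise is my own assumed figure for illustration, not a real design number:

```python
# Mass and volume flow of air needed to carry 100 MW away sensibly (no phase
# change), assuming the exhaust air leaves 10 C warmer than it entered.
C_P_AIR = 1003        # J/(kg*C), specific heat capacity of air
AIR_DENSITY = 1.2     # kg/m^3, roughly, near sea level
power_w = 100e6       # 100 MW
delta_t = 10          # C, assumed temperature rise of the air (my assumption)

mass_flow = power_w / (C_P_AIR * delta_t)    # ~10,000 kg of air per second
volume_flow = mass_flow / AIR_DENSITY        # ~8,300 cubic metres per second
print(round(mass_flow), round(volume_flow))
```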
The only caveat is that the clever structure is a cooling tower, the kind most characteristic of nuclear power stations. It's also used for non-nuclear power station cooling, but it's most famous in the nuclear context, where generators are well into the gigawatt range. Should AI datacenters use nuclear-sized air cooling towers instead of water evaporation? It would work, but even as someone who's not anti-nuclear, I think the optics of raising a cooling tower in rural America just to cool a datacenter would be untenable.
danciestlobster@lemmy.zip 1 day ago
drmoose@lemmy.world 11 hours ago
The atmosphere is very, very, very big, and water is a very, very, very normal thing to have in it.
litchralee@sh.itjust.works 1 day ago
There is almost certainly an impact somewhere, but I don’t have the data to know where it is. My conjecture is that a localized mass of steam would cause convection currents and drive microweather phenomena, especially downwind of such an air cooled facility. I’m not sure rain is necessarily the result, unless there’s a sizable mountain downwind, since although hot air will rise, it might run out of steam (pun intended) before cooling down enough to fully condense out. So it might just be adding a layer of humidity that floats a few hundred meters above the surface.
But even that could be devastating, if said layer blocks natural convection currents over a downwind town or city. It could act as a thermal cap, making that town warmer at night, because heat rising from the city would meet that humid layer and get absorbed by the water. The thermal capacity of water comes into play again, but this time against the city.
iocase@lemmy.zip 23 hours ago
Not meaningfully, no. In the middle of a dry desert far from other bodies of water you could theoretically form cumulus clouds downwind of your site (I have heard of this happening), but it would be teeny tiny.
The amount of water evaporation is just orders of magnitude too small. The earth gets about 1 kW of solar energy per square meter, so a 9 GW data center's waste heat is approximately what the sun delivers to 9 million square meters, which is 900 hectares.
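Spelling that comparison out (the 1 kW/m² is just a round figure for peak sunlight at the surface):

```python
# Solar-flux comparison from the comment above, spelled out.
SOLAR_FLUX = 1_000        # W/m^2, round figure for peak sunlight at the surface
datacenter_w = 9e9        # 9 GW of waste heat

area_m2 = datacenter_w / SOLAR_FLUX    # 9,000,000 m^2
print(area_m2 / 10_000)                # -> 900.0 hectares
```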
BlackLaZoR@lemmy.world 1 day ago
I doubt that it has a meaningful impact on climate. Evaporation from plants and oceans is many orders of magnitude greater. The issue is pretty much always about fresh water availability in the given region.
SorteKanin@feddit.dk 1 day ago
So is air cooling actually feasible but we don’t do it cause it would make data centers look like nuclear reactors? Or is it just not feasible?
BlackLaZoR@lemmy.world 1 day ago
AFAIK it's feasible for most data centers, except those where power density is so huge that you just can't do it with air cooling. That issue is most common for large-scale AI data centers.
A modern CPU consumes ~150 W; a modern AI chip can eat 700 W, and they're packed as densely as possible, with multiple cards slotted into every motherboard.
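A rough illustration of the density difference; every server and rack count here is just an assumption for the sketch (only the 150 W and 700 W per chip come from above), not a figure from any vendor:

```python
# Illustrative only: why an AI rack is so much harder to air-cool than a
# conventional one. All counts below are assumptions for the sketch.
cpu_server_w = 2 * 150 + 200            # assumed: 2 CPUs plus ~200 W of other parts
ai_server_w = 8 * 700 + 2 * 150 + 500   # assumed: 8 accelerators, 2 CPUs, overhead
SERVERS_PER_RACK = 10                   # assumed

print(SERVERS_PER_RACK * cpu_server_w / 1000)   # -> 5.0 kW per conventional rack
print(SERVERS_PER_RACK * ai_server_w / 1000)    # -> 64.0 kW per AI rack
```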
litchralee@sh.itjust.works 1 day ago
Air cooling is feasible, as evidenced by existing power stations that use it. A lot of newer nuclear generation uses water cooling, being sited along the ocean and in the multi-gigawatt range. But we can also find examples of inland power stations that have no water connection and therefore need massive cooling towers. Here is one in Germany that has a 2.2 GW rating and a 200 meter tall tower: en.wikipedia.org/wiki/Niederaussem_Power_Station
This is, as you can imagine, rather expensive to build, but it's doable. Cooling a coal fire is not substantially different from cooling compute loads in a data center, as it's all just a matter of moving heat around. Will there be differences due to the base temperature of coal versus GPUs? Yes, since the ratio of input to ambient temperature matters. But on the flip side, this should make the data center version easier to construct, as the plumbing for lower temperatures is simpler.
Mechanical engineers can chime in on feasibility for AI data centers, but seeing as it hasn't been done, it's probably still cost-related.
blarghly@lemmy.world 1 day ago
Followup: what are the impediments to using, say, seawater instead?
SpacetimeMachine@lemmy.world 1 day ago
Salt water is a huge pain to work with. The salt would quickly corrode any cooling systems.
morbidcactus@lemmy.ca 1 day ago
And even for fresh water, you have biofouling to worry about, plus the question of what to do with the water after you've used it; you can't just dump it into the environment untreated.
HobbitFoot@thelemmy.club 1 day ago
There are already heat exchanging systems that do this with brackish water; you don't need to treat water if all you are doing to it is making it hotter or colder.
BussyCat@lemmy.world 23 hours ago
People mentioned corrosion, which is true of all seawater systems, but in evaporative systems you also have salt forming on all the evaporative surfaces, which can drastically increase corrosion beyond normal seawater exposure and cause fouling.
So to do this properly you would want a reverse osmosis (RO) system producing fresh water upstream of the cooler, at which point it would make more sense to just have a separate company doing desalination.
litchralee@sh.itjust.works 1 day ago
Very similar problems arise with desalination plants, which I wrote about here: sh.itjust.works/comment/14613302
degenerate_neutron_matter@fedia.io 1 day ago
You mean 6.9 hours? You're definitely off by a few orders of magnitude there.
litchralee@sh.itjust.works 1 day ago
Darn, you’re right, the hours fell off in my dimensional analysis. Corrected, although 6.9 hours for a pool isn’t much time for swimming at all.