Is there any reason the water can’t be safely consumed later? It’s not toxic or nuclear is it? The cooling water didn’t just up and disappear did it?
Other commenters correctly describe the cost analysis for using evaporative cooling, but I’ll add one more reason why it’s the preferred method when water is available: evaporating water can dissipate truly outlandish amounts of heat with very few moving parts.
Hearkening back to high school physics class, water, like every other substance, has a specific heat capacity: the energy needed to raise the temperature of 1 kg of it by 1 degree C. Water's specific heat is already quite high, at 4184 J/(kg*C), besting all the common metals and losing only to a handful of substances such as hydrogen, helium, and liquid ammonia. In nature, this is why large bodies of water are natural moderators of temperature: a lake can absorb an entire day's worth of sunlight energy without its temperature changing much.
But where water really trounces the competition is its "heat of vaporization". This is the extra energy needed for liquid water to become vapor; simply bringing water to 100 C is not sufficient to make it airborne. For water at 100 C this is about 2257 kJ/kg. Simplifying so that 1 kg of water is 1 liter of water, we can convert this into something more familiar: about 0.63 kWh/L.
What these two physical properties of water tell us is that if our city water comes out of the pipe at 20 C, then to get it to 100 C to boil, we need the temperature difference (80 C) times the specific heat (4184 J/(kg*C)), which is 334,720 J/kg. Using the same simplification from earlier, that comes out to about 0.093 kWh/L. And then to actually make the boiling liquid become a vapor (so that it'll float away), we need a further 0.63 kWh/L on top of that.
Let that sink in for a moment: the energy to turn water into vapor (0.63 kWh/L) is nearly seven times the energy (0.093 kWh/L) needed to raise liquid water from 20 C to 100 C. That's truly incredible for a non-toxic, life-compatible substance that we can (but should we?) safely dump into the environment. Totaling the two values, one liter of water can dissipate about 0.72 kWh of energy. Nice!
In the context of a 100 megawatt data center (which apparently is what the industry considers the smallest "hyperscale data center"), if that facility used only evaporative cooling, the water requirement would be roughly 139,000 L/hour. That is an Olympic-size swimming pool (about 2.5 million liters) every 18 hours, or well over 3 million liters a day. Not nice!
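The arithmetic above can be checked in a few lines of Python. This uses the standard latent-heat value of about 2257 kJ/kg at 100 C; the 2.5-million-liter Olympic pool volume is an illustrative assumption:

```python
# Energy to boil away 1 kg (~1 L) of 20 C tap water
c_p = 4184          # specific heat of water, J/(kg*C)
h_vap = 2_257_000   # latent heat of vaporization at 100 C, J/kg

sensible = c_p * (100 - 20)      # heating 20 C -> 100 C: 334,720 J/kg
total_j = sensible + h_vap       # ~2.59 MJ per kg
kwh_per_liter = total_j / 3.6e6  # ~0.72 kWh per liter

# Water demand of a 100 MW data center cooled only by evaporation
liters_per_hour = 100_000 / kwh_per_liter    # 100,000 kWh/h -> ~139,000 L/h
pool_liters = 2_500_000                      # Olympic pool volume (assumed)
hours_per_pool = pool_liters / liters_per_hour

print(f"{kwh_per_liter:.3f} kWh/L, {liters_per_hour:,.0f} L/h, "
      f"one pool every {hours_per_pool:.1f} h")
```

Running this gives about 0.720 kWh/L and roughly one pool every 18 hours.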
And AI data centers are only getting larger, with some reaching into the low single digits of gigawatts. But what is the alternative for cooling the more modest data center from earlier? The reality is that the universe only provides three forms of heat transfer: conduction, convection, and radiation. The heat from data centers cannot be concentrated into a laser and radiated into space, and we don't have some underground granite mountain for the data centers to conduct their heat into. Convection is precisely the idea of storing the heat in a substance (e.g. water, air) and then jettisoning the substance.
So if we don't want to use water, we have to use air. But on the two qualities that make water an excellent substance for evaporative cooling, air doesn't come close: a specific heat of about 1003 J/(kg*C) and no heat of vaporization to exploit, because air is already a gas. That means we need to move ungodly amounts of air to dissipate 100 megawatts. But humanity has already invented the means to do this, by a clever structure that naturally encourages air to flow through it.
The only caveat is that the clever structure is a cooling tower, the kind most characteristic of nuclear power stations. Cooling towers are also used at non-nuclear power stations, but they're most famous in the nuclear context, where generators are well into the gigawatt range. Should AI data centers use nuclear-scale air cooling towers instead of water evaporation? It would work, but even as someone who's not anti-nuclear, I suspect the optics of raising a cooling tower in rural America just to cool a data center would be untenable.
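For a sense of scale, the "ungodly amounts of air" mentioned above can be estimated with the same back-of-envelope approach. The 10 C allowable air temperature rise and the 1.2 kg/m^3 air density are illustrative assumptions, not figures from any real facility:

```python
# Airflow needed to carry away 100 MW with sensible heating of air alone
power = 100e6      # W
c_p_air = 1003     # specific heat of air, J/(kg*C)
delta_t = 10       # allowable air temperature rise, C (assumed)
rho_air = 1.2      # air density near sea level, kg/m^3 (assumed)

mass_flow = power / (c_p_air * delta_t)  # ~10,000 kg of air per second
volume_flow = mass_flow / rho_air        # ~8,300 m^3 of air per second

print(f"{mass_flow:,.0f} kg/s, {volume_flow:,.0f} m^3/s")
```

Eight thousand cubic meters of air per second is several Olympic pools' worth of air volume every second, which is why a structure the size of a cooling tower is what it takes.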
Wxfisch@lemmy.world 1 day ago
It evaporates, that’s how it cools. The water is sprayed over a heat exchanger and gets turned to essentially steam and then new water is pumped in and thus the water is “gone”. It will fall as rain somewhere but likely not near where it was taken from.
A closed loop system could be used but they are more expensive and require more maintenance so large data centers don’t usually use them unless required to.
over_clox@lemmy.world 1 day ago
I am still learning. Thank you for your educational comment.
I loathe AI anyways, I just wanna better understand why I loathe AI…
qupada@fedia.io 1 day ago
Further to this, as well as the source of the water often being the local city's drinking-water supply (and we've found this puts a strain on that supply), evaporative cooling systems concentrate the minerals and contaminants in the water, meaning a smaller volume (relative to what is evaporated) of now highly concentrated blowdown water also has to be constantly disposed of. This likely goes into the city's wastewater system as well.
Radiators for closed-loop systems do also occupy more space (for the same cooling capacity) versus evaporative cooling towers, and are more limited in the range of climates they can be deployed in.
On balance though, the closed-loop cooling should always be the first choice; if it works for the deployment it will never be the wrong choice on a long-term / total cost of ownership basis.
ace_garp@lemmy.world 1 day ago
This was the most comprehensive list I've seen of the negative aspects of LLM/datacentre use. Very informative.
lemmy.cafe/comment/16350045
troybot@piefed.social 1 day ago
Ok so what you’re telling me is power plants generate electricity by burning fossil fuels which power a turbine with steam, then the data center uses all that electricity to produce even more steam?
Railcar8095@lemmy.world 1 day ago
And all that is to… Produce steamy furry porn
UniversalBasicJustice@quokk.au 1 day ago
I have news for you; it's all steam. EVERYTHING is steam. 🌍🧑🚀🔫🧑🚀🌚
Even you and I are just steam in liquid and solid phases.
SaveTheTuaHawk@lemmy.ca 22 hours ago
Every car in the world uses a closed loop cooling system that does not consume water.
BussyCat@lemmy.world 19 hours ago
Car cooling systems are stupidly expensive, run at temperatures that would damage computer CPUs, sit out in the open air, and have one really nice advantage over computers: at higher heat loads the car also tends to be moving faster, which pushes more air through the radiator and cools it off faster.
Now imagine you redlined a dozen cars for days on end in a garage in the middle of the summer do you think you might damage some components?
It is still very possible to use closed-loop cooling on data centers, but any system you build needs to work in summer temperatures, which can reach 35-40 C, without letting the computers exceed 60 C. An air-cooled system handling that much heat is going to be very expensive and use a ton of power (and power generation also uses water).
over_clox@lemmy.world 18 hours ago
You’re almost right, but there do exist air cooled engines with no conventional radiator or water/antifreeze pump…
en.wikipedia.org/…/Volkswagen_air-cooled_engine
Many motorcycles also use air cooling.
PineRune@lemmy.world 1 day ago
That’s the real reason right there.