Apparently data centers routinely burn through water at a rate of about 1.9 liters per kWh of energy spent computing. Yet I can 🎮 HARDCORE GAME 🎮 on my hundreds-of-watts GPU for several hours without pouring any of my Mountain Dew into the computer? Even if the PC is water cooled, the cooling loop's water stays inside the computer, except in exceptional circumstances.
Meanwhile, water comes out of my A/C unit and makes the ground around it all muddy.
How am I running circles around the water efficiency of a huge AI data center, with an overall negative water consumption?
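For scale, here's a rough back-of-the-envelope on the numbers in the question. The GPU wattage and session length below are illustrative assumptions, not measurements; the 1.9 L/kWh figure is the one quoted above.

```python
# Rough arithmetic on the 1.9 L/kWh figure quoted in the post.
# gpu_power_w and hours are illustrative assumptions, not measurements.

WATER_PER_KWH_L = 1.9    # liters per kWh, figure quoted in the post
gpu_power_w = 400        # assumed draw for a "hundreds-of-watts" gaming rig
hours = 4                # assumed session length

energy_kwh = gpu_power_w / 1000 * hours           # 1.6 kWh
implied_water_l = energy_kwh * WATER_PER_KWH_L    # ~3.0 liters

print(f"{energy_kwh:.1f} kWh of gaming -> {implied_water_l:.1f} L of water at data-center rates")
```

So a long gaming session at data-center rates would imply a few liters of water, which is the gap the question is asking about.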
BlameThePeacock@lemmy.ca 1 day ago
The simple answer is that your A/C dumps heat outside using big metal fins. It's not terribly efficient, but it works well enough at that scale.
Dissipating heat into the air doesn't cut it for the amount some data centers need to get rid of, so they use evaporative coolers.
The phase change from liquid water to gas absorbs approximately 7x more heat energy than it takes to bring room-temperature water up to the boiling point in the first place (rough numbers below).
Essentially they stick their large metal fins from the AC into a large pool of water and boil it off. This gets rid of the energy with a much smaller and cheaper system, but uses up water.
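A quick sanity check of that ~7x figure, using textbook properties of water (specific heat ≈ 4.186 kJ/kg·°C, latent heat of vaporization ≈ 2257 kJ/kg); the 20 °C "room temperature" starting point is an assumption:

```python
# Sanity check of the ~7x claim using textbook properties of water.
# The 20 degC starting temperature is an assumption.

SPECIFIC_HEAT_KJ_PER_KG_C = 4.186   # specific heat of liquid water
LATENT_HEAT_KJ_PER_KG = 2257        # latent heat of vaporization at 100 degC
start_temp_c = 20
boil_temp_c = 100

heat_to_boiling = SPECIFIC_HEAT_KJ_PER_KG_C * (boil_temp_c - start_temp_c)  # ~335 kJ/kg
heat_to_evaporate = LATENT_HEAT_KJ_PER_KG                                   # ~2257 kJ/kg

print(f"Heating to boiling point: {heat_to_boiling:.0f} kJ/kg")
print(f"Evaporating the water:    {heat_to_evaporate:.0f} kJ/kg")
print(f"Ratio: {heat_to_evaporate / heat_to_boiling:.1f}x")  # ~6.7x
```

The ratio works out to roughly 6.7x, in line with the ~7x cited above: evaporating a kilogram of water carries away far more heat than merely warming it up, which is why evaporative cooling is so effective per liter consumed.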
Brkdncr@lemmy.world 1 day ago
Home HVAC units don't let in fresh air all of the time; they recycle air.
BlameThePeacock@lemmy.ca 1 day ago
The HVAC system, no; the home itself, yes.
Depending on how old your home is, of course; newer homes tend to have lower air exchange rates.
Also, datacenters don't have windows, or even doors constantly letting people in and out between cooled areas and the outside.
sartalon@lemmy.world 1 day ago
My HVAC system has an inside and outside air source.