Yeah, take heat pumps for example, or even CPU water coolers: the heat is carried away from where it’s hot to somewhere it can be radiated off, until the heat-conducting material and its surroundings reach equilibrium.
You can bet your ass that these US data centers are just brute-forcing heat exchange via evaporation instead, to make the initial investment cheaper. It’s the equivalent of burning coal instead of going straight for the renewable but initially more costly option when it comes to energy production.
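For a sense of scale, here’s a rough back-of-envelope sketch of why evaporative cooling is so thirsty; the 100 MW heat load is an assumed figure for illustration, not anything from the thread:

```python
# Rough estimate of water evaporated to reject a data center's heat.
# Assumes all heat is rejected as latent heat (pure evaporation).
HEAT_LOAD_MW = 100            # assumed total heat load, MW (hypothetical)
LATENT_HEAT_MJ_PER_KG = 2.26  # latent heat of vaporization of water

kg_per_second = HEAT_LOAD_MW / LATENT_HEAT_MJ_PER_KG  # 1 MW = 1 MJ/s
liters_per_day = kg_per_second * 86_400               # 1 kg of water ~ 1 L

print(f"{kg_per_second:.1f} kg/s, about {liters_per_day / 1e6:.1f} million L/day")
# -> 44.2 kg/s, about 3.8 million L/day
```

Evaporating a kilogram of water soaks up roughly 2.26 MJ for free, which is exactly why it’s the cheap option up front.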
Darkassassin07@lemmy.ca 5 days ago
Yeah, thermodynamics are a thing. I’m not trying to claim some free-energy system and saying you could power the whole data center; but if you could recapture some of the waste heat and convert it back into electricity, putting that energy to work instead of just venting it to the atmosphere, it could potentially help offset some of the raw electrical needs. An efficiency improvement, that’s all.
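To put a hedged number on why that recapture is only a marginal win, here’s a quick Carnot-limit sketch; the 45 °C exhaust and 25 °C ambient temperatures are assumptions, not measured figures:

```python
# Carnot ceiling on turning low-grade waste heat back into electricity.
T_HOT_K = 318.15   # assumed server exhaust temperature, ~45 C
T_COLD_K = 298.15  # assumed ambient (heat sink) temperature, ~25 C

carnot_limit = 1 - T_COLD_K / T_HOT_K
print(f"Carnot limit: {carnot_limit:.1%}")  # ~6.3%
# Real machines (e.g. an organic Rankine cycle) only manage a fraction
# of that, so maybe a few percent of the waste heat comes back as power.
```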
dubyakay@lemmy.ca 5 days ago
SaltSong@startrek.website 4 days ago
I’m an actual engineer, with a degree and everything. Although this is not my area of expertise, it’s one I’m familiar with.
They could do something like you suggest, but every step becomes more expensive and less effective. The exhaust from a coal-fired power plant is still plenty hot, and more energy could be extracted from it, but it takes more and more equipment to extract less and less energy.
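A toy illustration of that diminishing-returns curve, with made-up stage temperatures: each successively cooler exhaust stage has a lower thermodynamic ceiling on what any recovery equipment could ever get back.

```python
# Toy illustration: the maximum (Carnot) fraction of exhaust heat that
# can be converted to work shrinks as the exhaust cools toward ambient.
T_AMBIENT_K = 298.0  # assumed heat-sink temperature, K

for t_exhaust_k in (500, 400, 350, 320, 305):  # hypothetical stage temps, K
    eta = 1 - T_AMBIENT_K / t_exhaust_k
    print(f"exhaust at {t_exhaust_k} K -> at most {eta:.1%} of its heat as work")
# 500 K -> 40.4%, 400 K -> 25.5%, 350 K -> 14.9%, 320 K -> 6.9%, 305 K -> 2.3%
```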
The curse of every engineer is to see a way to turn every waste stream into a useful product, but not being able to do so profitably. (Which means no one will approve the project.)