Comment on Tankie
Deceptichum@quokk.au 1 day ago
It throttles my GPU less than playing a game does. ‘Spose we should ban gaming too?
athatet@lemmy.zip 1 day ago
AI uses large servers and data farms. It doesn’t have to throttle your machine because it isn’t using your machine.
Deceptichum@quokk.au 1 day ago
No, it does not. When I run an offline model on my local machine, it uses my 4-year-old GPU and runs for ~50 seconds.
starelfsc2@sh.itjust.works 1 day ago
90% of people do not use offline models, especially everyone doing AI code and video. The offline models are undeniably worse and slower. These AI companies didn’t just magic billions out of thin air; most people are using the massive data farms. Also, people are generally not gaming maxed out 14 hours a day, whereas with AI they might use it all day during work.
brucethemoose@lemmy.world 7 minutes ago
I am late to this argument, but data center imagegen is typically batched so that many images are made in parallel. And (from the providers that aren’t idiots), the models likely use more sparsity or “tricks” to reduce compute.
Per-task energy per image is waaay less than on a desktop GPU. We probably burnt more energy in this thread than in an image, or a few.
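To make the batching point concrete, here is a rough sketch of the arithmetic. All figures are illustrative assumptions, not measurements: suppose a datacenter GPU draws ~700 W and produces a batch of 16 images in ~10 s, versus a desktop GPU drawing ~300 W for ~50 s per single image (roughly the local run described above).

```python
# Back-of-envelope energy per generated image.
# All power/time/batch numbers below are illustrative assumptions,
# not measured figures for any real model or GPU.

def energy_per_image_wh(power_w: float, seconds: float, batch: int = 1) -> float:
    """Watt-hours per image for a run drawing `power_w` watts for
    `seconds` seconds while producing `batch` images in parallel."""
    return power_w * seconds / 3600 / batch

# Hypothetical datacenter GPU: 700 W, 10 s, batch of 16 images.
dc = energy_per_image_wh(700, 10, batch=16)

# Hypothetical desktop GPU: 300 W, 50 s, one image at a time.
local = energy_per_image_wh(300, 50)

print(f"datacenter: {dc:.2f} Wh/image, local: {local:.2f} Wh/image")
```

Under these assumed numbers the batched datacenter run comes out an order of magnitude cheaper per image than the single-image local run, which is the gist of the argument; the exact ratio depends entirely on the real hardware and batch sizes.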
And this is getting exponentially better with time, in spite of what morons like Sam Altman preach.
There’s about a billion reasons image slop is awful, but the “energy use” one is way overblown.
ClamDrinker@lemmy.world 23 hours ago
The existence of offline models highlights a nuance that some people deny even exists, though, causing people to talk past one another. I wish it were more widely acknowledged, as it would make some conversations around AI easier.