So I did some more research, and evidently if you’re going to use AI at all and use it often, running it offline probably increases your energy usage (unless you’re on renewables), since data centers generally have cards designed specifically for AI. I think it might just be a case of everyone needing to use it significantly less. It’s like if 4K gaming were something the average Joe was doing: if everyone did that 10 hours a day, we would have a big problem. (Rough numbers sketched below.)
It’s kinda like saying it’s not immoral to go for a pleasure drive, but if you’re driving around 10 hours a day, that’s probably not good and you should minimize it as much as you can.
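To make that concrete, here’s a rough back-of-the-envelope sketch. Every number in it (GPU wattage, seconds per response, queries per day, data-center overhead) is a made-up illustrative assumption, not a measurement, so it only shows how the comparison scales:

```python
# Back-of-the-envelope energy per AI response: local GPU vs data-center card.
# All numbers are illustrative assumptions, NOT measurements.

LOCAL_GPU_WATTS = 300             # assumed draw of a consumer GPU while generating
LOCAL_SECONDS_PER_RESPONSE = 30   # assumed time for a local model to answer

DC_ACCELERATOR_WATTS = 700        # assumed draw of a dedicated AI accelerator
DC_SECONDS_PER_RESPONSE = 2       # assumed per-request share with batching
DC_OVERHEAD_FACTOR = 1.3          # assumed cooling/networking overhead

def wh_per_response(watts: float, seconds: float, overhead: float = 1.0) -> float:
    """Energy in watt-hours for a single response."""
    return watts * seconds / 3600 * overhead

local = wh_per_response(LOCAL_GPU_WATTS, LOCAL_SECONDS_PER_RESPONSE)
cloud = wh_per_response(DC_ACCELERATOR_WATTS, DC_SECONDS_PER_RESPONSE, DC_OVERHEAD_FACTOR)

queries_per_day = 200  # the "using it often" case
print(f"local: {local:.2f} Wh/response, {local * queries_per_day:.0f} Wh/day")
print(f"cloud: {cloud:.2f} Wh/response, {cloud * queries_per_day:.0f} Wh/day")
```

With those assumptions the specialized hardware comes out several times more efficient per response, but heavy use scales either number up fast, which is the point.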
bbboi@feddit.uk 22 hours ago
Does the physical location of the hardware really matter?
ClamDrinker@lemmy.world 20 hours ago
It does not, but that wasn’t my point. My point was that not all forms of AI usage are the same, the same way someone driving around an EV they charge with solar power isn’t the same as someone driving a 1969 gas guzzler (or something equivalent). Local usage more often than not means efficient models, low energy consumption, and little difference from other computer tasks like gaming or video editing. But when the conversation is about AI, there is always the explicit expectation that it’s the worst of the worst, all the time.
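For the gaming comparison, here’s a similarly hand-wavy sketch. Again, every number (GPU draw while gaming, draw while a small quantized model generates, idle draw, query count) is an assumption for illustration only:

```python
# Illustrative comparison (assumed numbers): an evening of gaming vs an evening
# of occasional local-model queries on the same consumer GPU.

GPU_WATTS_GAMING = 250     # assumed sustained draw during a gaming session
GPU_WATTS_INFERENCE = 200  # assumed draw while a small quantized model generates
GPU_WATTS_IDLE = 40        # assumed idle/desktop draw between queries

HOURS = 3.0                # length of the evening in both cases
QUERIES = 20               # assumed number of local queries that evening
SECONDS_PER_QUERY = 15     # assumed generation time per query

gaming_wh = GPU_WATTS_GAMING * HOURS

inference_hours = QUERIES * SECONDS_PER_QUERY / 3600
local_ai_wh = (GPU_WATTS_INFERENCE * inference_hours
               + GPU_WATTS_IDLE * (HOURS - inference_hours))

print(f"gaming evening:   {gaming_wh:.0f} Wh")
print(f"local-AI evening: {local_ai_wh:.0f} Wh")
```

Under those assumptions, casual local use sits well inside the range of an ordinary gaming or video editing session, which is all I was getting at.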