Actually, if you think about it, AI might help turn climate change into an actual catastrophe.
Comment on Black Mirror AI
IndiBrony@lemmy.world 16 hours ago
All the while, we roast to death because all of this will take more resources than the entire energy output of a medium-sized country.
Zozano@aussie.zone 15 hours ago
I’ve been thinking about this for a while. Consider how quick LLMs are.
If the energy spent powering your device for the time a task would otherwise take is more than the energy of using an LLM for it, then the LLM is probably saving energy.
In all honesty, I’ve probably saved 50 hours or more since I started using it about two months ago.
Coding has become incredibly efficient, and I’m not suffering through search-engine hell any more.
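As a rough back-of-envelope sketch of that break-even argument (the ~100 W desktop draw and ~3 Wh per query are assumptions, not figures from this thread):

```python
# Break-even sketch: energy saved by finishing tasks faster vs. energy spent on LLM queries.
# Assumptions (not from the thread): a dev machine draws ~100 W while in use,
# and a single LLM query costs ~3 Wh on the server side (a commonly cited rough estimate).

DESKTOP_WATTS = 100.0      # W, assumed average draw of a dev machine
QUERY_WH = 3.0             # Wh per LLM query, assumed

def net_savings_wh(hours_saved: float, queries_used: int) -> float:
    """Energy no longer spent running the PC, minus energy spent on the queries."""
    saved = hours_saved * DESKTOP_WATTS    # Wh saved by not running the machine
    spent = queries_used * QUERY_WH        # Wh consumed by the queries
    return saved - spent

# The ~50 hours claimed above, assuming ~1000 queries over two months:
print(net_savings_wh(50, 1000))  # 50*100 - 1000*3 = 2000.0 Wh net savings
```

Under these assumptions the queries would need to be wildly more expensive per call before they outweighed the saved machine time.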
xthexder@l.sw0.com 7 hours ago
Just writing code uses almost no energy. Your PC should be clocking down when you’re not doing anything. 1GHz is plenty for text editing.
Does ChatGPT reduce the number of times you hit build? Because that’s where all the electricity goes.
Aux@feddit.uk 5 hours ago
What kind of code are you writing that your CPU goes to sleep? If you follow any good practices like TDD, atomic commits, etc, and your code base is larger than hello world, your PC will be running at its peak quite a lot.
Example: linting on every commit + TDD. You’ll be making loads of commits every day, linting a decent code base will definitely push your CPU to 100% for a few seconds. Running tests, even with caches, will push CPU to 100% for a few minutes. Plus compilation for running the app, some apps take hours to compile.
In general, text editing is a small part of the developer workflow. Only junior devs spend a lot of time typing stuff.
Zozano@aussie.zone 7 hours ago
Except that half the time I don’t know what the fuck I’m doing. It’s normal for me to spend hours trying to figure out why a small config file isn’t working.
That’s not just text editing, that’s browsing the internet, referring to YouTube videos, or wallowing in self-pity.
That was before I started using GPT.
xthexder@l.sw0.com 7 hours ago
It sounds like it does save you a lot of time then. I haven’t had the same experience, but I did all my learning to program before LLMs.
Personally I think the amount of power saved here is negligible, but it would actually be an interesting study to see just how much it is.
ryannathans@aussie.zone 15 hours ago
Are you using your PC fewer hours per day?
Zozano@aussie.zone 15 hours ago
Yep, more time for doing home renovations.
Eyekaytee@aussie.zone 14 hours ago
we’re rolling out renewables at like 100x the rate of ai electricity use, so no need to worry there
Serinus@lemmy.world 14 hours ago
Yeah, at this rate we’ll be just fine. (As long as this is still the Reagan administration.)
Eyekaytee@aussie.zone 13 hours ago
yep the biggest worry isn’t AI, it’s India
www.worldometers.info/…/india-co2-emissions/
The West is lowering its CO2 output while India is slurping up all the CO2 we’re saving:
This doesn’t include China, of course, the most egregious of the CO2 emitters.
AI is not even a tiny blip on that radar, especially since AI runs in data centres and on devices powered by electricity, so the more your country moves to renewables, the less CO2 impact it has over time.
Semjaza@lemmynsfw.com 12 hours ago
Could you add the US to the graphs, as EU and West are hardly synonymous - even as it descends into Trumpgardia.
zedcell@lemmygrad.ml 7 hours ago
Now break that shit down per capita, and also try and account for the fact that China is a huge manufacturing hub for the entire world’s consumption, you jackass.
m532@lemmygrad.ml 6 hours ago
India has extremely low historical co2 output, crakkker
vivendi@programming.dev 9 hours ago
[Image: graph comparing the water use of an AI query with other everyday activities]
lipilee@feddit.nl 7 hours ago
water != energy, but i’m actually here for the science if you happen to find it.
vivendi@programming.dev 7 hours ago
This particular graph exists because a lot of people freaked out over “AI draining the oceans”; that’s why the original paper made it. (I’ll look for the paper when I have time, I have an exam tomorrow. Fucking higher ed, man.)
xthexder@l.sw0.com 7 hours ago
Asking ChatGPT a question doesn’t take 1 hour like most of these… this is a very misleading graph
vivendi@programming.dev 7 hours ago
This is actually misleading in the other direction: ChatGPT is a particularly intensive model. You can run a GPT-4o-class model on a consumer mid-to-high-end GPU, which would then use something in the ballpark of gaming in terms of environmental impact.
You can also run a cluster of 3090s or 4090s to train the model, which is actually what people do, in which case it’s still in the same range as gaming. (And more productive than 8 hours of WoW grind while chugging a warmed-up Nutella glass as a drink.)
Models like Google’s Gemma (NOT Gemini these are two completely different things) are insanely power efficient.
xthexder@l.sw0.com 7 hours ago
I didn’t even say which direction it was misleading, it’s just not really a valid comparison to compare a single invocation of an LLM with a continuous task.
You’re comparing Volume of Water with Flow Rate. Or if this was power, you’d be comparing Energy (Joules or kWh) with Power (Watts)
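To make the power analogy concrete: a one-off query is an amount of energy, while a continuous activity is a rate, so you have to pick a duration and integrate before the two are comparable (the 3 Wh and 100 W figures here are illustrative assumptions, not from the graph):

```python
# A single LLM query is an amount of energy (Wh); watching TV is a power draw (W).
# Comparing them directly is a category error; you must convert power -> energy
# over a chosen duration first.

QUERY_WH = 3.0       # Wh per query, assumed rough figure
TV_WATTS = 100.0     # W, assumed draw of a TV

def energy_wh(power_w: float, hours: float) -> float:
    """Convert a continuous power draw into energy over a given duration."""
    return power_w * hours

# One hour of TV vs one query: only now are the units comparable.
print(energy_wh(TV_WATTS, 1.0))   # 100.0 Wh for the TV hour
print(QUERY_WH)                   # 3.0 Wh for the query
```

The comparison flips entirely if you choose a different duration, which is exactly why graphs mixing the two quantities mislead.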
Sorse@discuss.tchncs.de 7 hours ago
What about training an AI?
vivendi@programming.dev 7 hours ago
According to arxiv.org/abs/2405.21015
The absolute most monstrous, energy-guzzling model tested needed 10 MW of power to train.
Most models need less than that, and non-frontier models can even be trained on gaming hardware with comparatively little energy consumption.
That paper, by the way, says there is a 2.4x YoY increase in model training compute, BUT it doesn’t mention DeepSeek, which rocked the Western AI world with comparatively little training cost (2.7M GPU hours in total).
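For scale, a GPU-hours budget can be converted into total energy with an assumed per-GPU draw (the ~400 W average used here is my assumption for a data-centre accelerator, not a figure from the paper):

```python
# Convert a training budget in GPU-hours into total energy.
# 2.7M GPU-hours is the DeepSeek figure cited above; 400 W per GPU is assumed.

GPU_HOURS = 2.7e6
WATTS_PER_GPU = 400.0   # assumed average draw per accelerator

total_wh = GPU_HOURS * WATTS_PER_GPU   # watt-hours
total_gwh = total_wh / 1e9             # gigawatt-hours
print(round(total_gwh, 2))  # prints 1.08
```

Roughly 1 GWh under these assumptions, i.e. about what a 10 MW cluster would draw in a few days of continuous training.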