Comment on AI Training Slop
trungulox@lemm.ee 4 weeks ago
Don't know. Don't really care, honestly. I don't pay for hydro, and whatever energy expenditures were involved in training the model I fine-tuned are more than offset by the fact that I don't and never will drive.
utopiah@lemmy.world 4 weeks ago
That's some strange logic. Either you do know, and you can estimate that the offset will indeed "balance it out", or you don't, in which case you can't say one way or the other.
jfrnz@lemm.ee 4 weeks ago
Running a 500W GPU 24/7 for a full year uses less than a quarter of the energy consumed by the average automobile in the US (in 2000). I don't know how many GPUs this person has or how long it took to fine-tune the model, but it's clearly not creating an ecological disaster. Please understand there is a huge difference between the power consumed by companies training cutting-edge models at massive scale/speed and a locally deployed model doing only fine-tuning and inference.
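The comparison above can be sanity-checked with rough figures. The GPU wattage and the "quarter" ratio are from the comment; the driving distance, fleet fuel economy, and energy content of gasoline below are my own ballpark assumptions, not numbers from the thread:

```python
# Back-of-envelope check: 500W GPU running 24/7 vs. an average US car.
# Assumed figures (not from the thread): ~11,500 miles driven per year,
# ~22 mpg fleet average circa 2000, ~33.7 kWh of energy per gallon of gasoline.

GPU_WATTS = 500
HOURS_PER_YEAR = 24 * 365
gpu_kwh = GPU_WATTS * HOURS_PER_YEAR / 1000  # watt-hours -> kWh

MILES_PER_YEAR = 11_500   # assumption
MPG = 22                  # assumption
KWH_PER_GALLON = 33.7     # assumption

car_kwh = MILES_PER_YEAR / MPG * KWH_PER_GALLON

print(f"GPU: {gpu_kwh:,.0f} kWh/year")
print(f"Car: {car_kwh:,.0f} kWh/year")
print(f"Ratio: {gpu_kwh / car_kwh:.2f}")
```

With these assumptions the GPU comes out around 4,400 kWh/year against roughly 17,600 kWh/year for the car, a ratio of about 0.25, consistent with the "less than a quarter" claim.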
utopiah@lemmy.world 4 weeks ago
I specifically asked about the training part, not the fine-tuning, but thanks for clarifying.
jfrnz@lemm.ee 4 weeks ago
The point is that OP (most probably) didn't train it — they downloaded a pre-trained model and only did fine-tuning and inference.
trungulox@lemm.ee 4 weeks ago
Herpa Derpa flurbidy
utopiah@lemmy.world 4 weeks ago
I see. Well, I checked your post history because I thought, "Heck, they sound smart, maybe I'm the problem," and my conclusion, based on the florid language you often use with others, is that you are clearly provoking on purpose.
Unfortunately I don't have the luxury of time to argue this way, so I'll just block you; that way we won't have to interact in the future.
Take care and may we never speak again.
trungulox@lemm.ee 4 weeks ago
Erby glerby skeibledee thought terminating cliches groppily boop