A GPT-4 level language model and the current Flux dev image model can easily run on a standard M3 MacBook Air via ggml.
Comment on "Not everything needs to be Art"
cows_are_underrated@feddit.org 5 weeks ago
That's why it's always good to run them locally (if you use them for fun).
merari42@lemmy.world 5 weeks ago
PeriodicallyPedantic@lemmy.ca 5 weeks ago
That's almost certainly more wasteful. The machines datacenters run them on are going to be far more efficient.
Running it locally is still better because of all the other data mining that goes along with capitalism.
cows_are_underrated@feddit.org 5 weeks ago
But usually you don't cool your PC with freshwater. That's something a lot of datacenters do.
PeriodicallyPedantic@lemmy.ca 5 weeks ago
That’s true.
But in terms of power and emissions, data centers are far, far better. The waste of potable water could be addressed if we made them address it, but the inefficiency of running locally cannot be.
I still prefer to run locally anyway, because fuck the kind of people who are trying to sell AI, but it is absolutely more inherently wasteful.