A small one would still be plenty of GPU for most people.
And the big operators could still need to liquidate older hardware once it draws too much power relative to newer gear, even if it still works (though someone in this thread claimed data centers burn through GPUs too fast for that)
jarfil@beehaw.org 4 days ago
Depending on how much is “too much power”, people might still want to purchase them at a discount for self-hosting purposes. The future most likely involves a decentralization of AI services: higher-efficiency large providers, combined with lower-efficiency edge nodes for local and less demanding usage.