Comment on AI Is A Money Trap
orca@orcas.enjoying.yachts 17 hours ago
Like Zitron says in the article, we’re 3 years into the AI era and there is not a single actually profitable company. It’s all smoke and mirrors and sketchy accounting. Even if/when the AI hype settles and perhaps the tech finds its true (profitable) calling, the tech itself is insanely expensive. It’s going to boil down to Microsoft and/or X owning nuclear power plants, and everyone else renting usage from them.
People are making money in AI, but like always, it’s the founders and C-suite, while the staff are kicked to the curb. It’s all a shell game, and everyone who has integrated AI into their lives and company workflows is gonna get the rug pulled out from under them.
Powderhorn@beehaw.org 16 hours ago
I don’t think it’s going to come down to these absurd datacentres. We’re only a few years off from platform-agnostic local inference at mass-market prices. Could I get a 5090? Yes. Legally? No.
Feyd@programming.dev 5 hours ago
We’re only a few years off from platform-agnostic local inference at mass-market prices.
What makes you confident in that? What will change?
Powderhorn@beehaw.org 3 hours ago
There are already large local models. It’s a question of having the hardware, which has historically gotten more powerful with each generation. I don’t think it’s going to be phones for quite some time, but on desktop, absolutely.
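For a sense of what that already looks like on a desktop: a minimal sketch, assuming the llama-cpp-python bindings and a quantized GGUF model you’ve downloaded yourself (the path, model choice, and settings are placeholders, not recommendations).
```python
# Minimal local-inference sketch: everything runs on the local machine.
# Assumes `pip install llama-cpp-python` and a quantized GGUF file on disk;
# the path and parameters below are illustrative placeholders.
from llama_cpp import Llama

llm = Llama(
    model_path="./models/example-8b.Q4_K_M.gguf",  # local weights, nothing leaves the device
    n_gpu_layers=-1,  # offload all layers to the GPU if it has the VRAM
    n_ctx=4096,       # context window
)

out = llm.create_chat_completion(
    messages=[{"role": "user", "content": "Can you run entirely on a desktop GPU?"}]
)
print(out["choices"][0]["message"]["content"])
```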
Feyd@programming.dev 3 hours ago
For business use, laptops without powerful graphics cards have been the norm for quite some time. Do you see businesses switching to desktops to accommodate the power local models need? I think it’s pretty optimistic to expect laptops to be that powerful in the next 5 years. The advancement in chip capability has slowed dramatically, and to put those chips in laptops they’d also need to be far more power-efficient.
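To put rough numbers on the gap, here’s a back-of-the-envelope calc for weight memory alone; it ignores the KV cache and activations, so real requirements are higher, and the model sizes are just illustrative.
```python
# Weight memory alone: parameter count x bytes per weight.
def weight_memory_gb(params_billions: float, bits_per_weight: int) -> float:
    # billions of params * bytes per weight = gigabytes (decimal)
    return params_billions * (bits_per_weight / 8)

for params in (8, 70):
    for bits in (16, 4):
        print(f"{params}B params @ {bits}-bit: ~{weight_memory_gb(params, bits):.0f} GB")
# 8B  @ 16-bit: ~16 GB  (already past most business laptops' GPUs)
# 8B  @  4-bit:  ~4 GB  (plausible on integrated graphics / NPUs)
# 70B @ 16-bit: ~140 GB
# 70B @  4-bit: ~35 GB  (still desktop or workstation territory)
```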
HakFoo@lemmy.sdf.org 14 hours ago
I have to think that most people won’t want to do local training.
It’s like Gentoo Linux. Yeah, you can compile everything with the exact optimal set of options for your kit, but it’s hugely inefficient when most use cases are mostly served by two or three pre-built options.
If you’re just running pre-made models, plenty of them will run on a 6900XT or whatever.
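The pre-made route is roughly what a local runtime like Ollama gives you. A minimal sketch, assuming the Ollama daemon is installed and running and a model has already been pulled (the model name here is just an example):
```python
# Sketch of running a pre-built, already-quantized model via a local Ollama server.
# Assumes something like `ollama pull llama3.1` has been run beforehand.
import ollama

reply = ollama.chat(
    model="llama3.1",  # example name; any locally pulled model works
    messages=[{"role": "user", "content": "Do you fit on a 6900XT?"}],
)
print(reply["message"]["content"])
```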
Powderhorn@beehaw.org 13 hours ago
I don’t expect anyone other than … I don’t even know what the current term is … geeks? batshit billionaires? to be doing training.
I’m very much of the belief that our next big leap in LLMs is local processing. Once my interactions stay on my device, I’ll jump in.
t3rmit3@beehaw.org 15 hours ago
This is a little misleading, because obviously FAANG (and others) are all building AI systems, and are all profitable. There are also tons of companies applying machine learning to various areas that are doing well from a profitability standpoint (mostly B2B SaaS that are enhancing extant tools). This statement is really only true for the glut of “AI companies” that do nothing but produce LLMs to plug into stuff.
My personal take is that this just reveals how disconnected VCs are from the tech industry; they’re the ones buying into this hype and burning billions of dollars on (as you said) smoke-and-mirrors companies like Anthropic and OpenAI.
megopie@beehaw.org 6 hours ago
The thing is, companies like Google, Facebook, Amazon and Microsoft are already profitable, so AI could lose them huge amounts of money, with no real meaningful benefit to user retention or B2B sales, but the companies as a whole would still be profitable. It could be a huge money black hole, but they continue to chase it out of unjustified FOMO and in an attempt to keep share prices high through misplaced investor confidence.
Apple’s share price has taken a pretty big hit from the perception that they’re “falling behind” on AI, even though they’ve mostly just backed away from it because users didn’t like it when it was shoved in their face. Other companies are probably looking at that and saying “hey, we’d rather keep the stock market happy and our share prices high than stop wasting money on this”.
orca@orcas.enjoying.yachts 7 hours ago
I should reframe what I said: there is not a single profitable AI-focused company. There are tons of already profitable companies that are now deeply embedding AI into everything they do.
Feyd@programming.dev 7 hours ago
The FAANG companies that are in on the LLM hype are still lighting money on fire in their LLM endeavors, so I fail to see how the point that they may be otherwise profitable is relevant.
Powderhorn@beehaw.org 14 hours ago
This is an interesting take in that doing one thing and doing it well has been, historically, how businesses thrived. This vertical integration thing, and startups looking to be bought out instead of trying to make it on their own (obviously, VCs play a role in this), have led to jacks of all trades.