Comment on I'm sorry, little one
MotoAsh@lemmy.world 3 months ago
Why use commercial graphics accelerators to run a highly limited “AI”-unique work set? There are specific cards made to accelerate machine learning things that are highly potent with far less power draw than 3090’s.
mergingapples@lemmy.world 3 months ago
Because those specific cards are fuckloads more expensive.
d00ery@lemmy.world 3 months ago
What are you recommending? I’d be interested in something that’s similar in price to a 3090.
Diabolo96@lemmy.dbzer0.com 3 months ago
It’s for inference, not training.
MotoAsh@lemmy.world 3 months ago
Even better, because those are cheap as hell compared to 3090s.
Diabolo96@lemmy.dbzer0.com 3 months ago
But can they run Crysis?
VeganCheesecake@lemmy.blahaj.zone 3 months ago
Would you link one? Because the only things I know of are the small coral accelerators that aren’t really comparable, and specialised data centre stuff you need to request quotes for to even get a price, from companies that probably aren’t much interested in seeing one direct to customer.
GBU_28@lemm.ee 3 months ago
Huh?
Stuff like llama.cpp really wants a GPU, a 3090 is a great place to start.
ShadowRam@fedia.io 3 months ago
Well yeah, but 10x the price...
MotoAsh@lemmy.world 3 months ago
Not if it’s for inference only. What do you think the “AI accelerators” they’re putting in phones now are?
ShadowRam@fedia.io 3 months ago
Ok,
Show me a PCIe board that can do inference calculations as fast as a 3090 but is less expensive than a 3090.
RandomlyRight@sh.itjust.works 3 months ago
I’d be interested (and surprised) too
RandomlyRight@sh.itjust.works 3 months ago
Yeah, show me a phone with 48GB RAM. It’s a big factor to consider. Actually, some people are recommending a Mac Studio because you can get it with 128GB of RAM or more, and it’s shared with the AI/GPU accelerator. Very energy efficient, but it sucks as soon as you want to do literally anything other than inference.
Fuzzypyro@lemmy.world 3 months ago
I wouldn’t say it particularly sucks. It could be used as a powerhouse hosting server; Docker makes that very easy to do no matter the OS nowadays. Really though, I’d say its competition is more along the lines of Ampere systems in terms of power to performance. It even beats Ampere’s 128-core ARM CPU on a power-to-performance ratio, which is extremely impressive in the server/enterprise world. Not to say you’re gonna see them in data centers, because price to performance is a thing as well. I just feel like it fits right into the niche it was designed for.
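The memory argument above (48GB phone vs. 24GB 3090 vs. 128GB unified memory) comes down to simple arithmetic: inference needs roughly the weight count times the quantization width, plus some headroom for the KV cache and activations. A rough back-of-the-envelope sketch, where the 1.2× overhead factor is my own assumption, not a measured figure:

```python
def model_mem_gb(params_billion: float, bits_per_weight: int,
                 overhead: float = 1.2) -> float:
    """Rough inference footprint in GB: weights at the given quantization
    width, times an assumed ~20% overhead for KV cache and activations."""
    return params_billion * bits_per_weight / 8 * overhead

# A 70B-parameter model at 4-bit quantization:
need = model_mem_gb(70, 4)
print(f"{need:.0f} GB")  # ~42 GB: over a 24 GB 3090, under 128 GB unified memory
```

By this estimate a 4-bit 70B model doesn’t fit on a single 24GB 3090 but fits comfortably in a 128GB Mac Studio, which is the whole appeal of the shared-memory setup.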