lotide

I'm sorry, little one

347 likes

Submitted 10 months ago by RandomlyRight@sh.itjust.works to [deleted]

https://sh.itjust.works/pictrs/image/8f44d4e6-36c5-4e4f-9754-487172cd0b6f.jpeg


Comments

  • henfredemars@infosec.pub 10 months ago

    Forget the board – can your wimpy-ass power supply handle the load?

    • RandomlyRight@sh.itjust.works 10 months ago

      No :(

      I have a separate gaming PC and am considering just using that hardware for my NAS and creating a VM for gaming

      • saltesc@lemmy.world 10 months ago

        You have put yourself into this black hole lol.

        “I might just get a- Oh god my gaming rig is now my secondary PC and my credit card hurts. How did this happen?!”

        3090s snicker evilly in the background

      • MentalEdge@sopuli.xyz 10 months ago

        Didn’t someone just make a post about a game stream server that would let multiple gamers use the same machine? Not with VMs, but with multiple users and virtual displays.

        You’d connect to it via any moonlight client, and it creates the environment for you to use the machine for whatever.

      • pleb_maximus@feddit.de 10 months ago

        Look at it this way: not only can you run your own AI stuff yourself, you can have your own cloud gaming too!

  • MotoAsh@lemmy.world 10 months ago

    Why use commercial graphics accelerators to run a highly limited “AI”-specific workload? There are specific cards made to accelerate machine learning that are highly potent with far less power draw than 3090s.

    • ShadowRam@fedia.io 10 months ago

      Well yeah, but 10x the price....

      • MotoAsh@lemmy.world 10 months ago

        Not if it’s for inference only. What do you think the “AI accelerators” they’re putting in phones now are?

    • mergingapples@lemmy.world 10 months ago

      Because those specific cards are fuckloads more expensive.

    • d00ery@lemmy.world 10 months ago

      What are you recommending? I’d be interested in something that’s similar in price to a 3090.

    • Diabolo96@lemmy.dbzer0.com 10 months ago

      It’s for inference, not training.

      • MotoAsh@lemmy.world 10 months ago

        Even better, because those are cheap as hell compared to 3090s.

    • VeganCheesecake@lemmy.blahaj.zone 10 months ago

      Would you link one? Because the only things I know of are the small Coral accelerators, which aren’t really comparable, and specialised data centre stuff you need to request quotes for to even get a price, from companies that probably aren’t much interested in selling direct to customers.

    • GBU_28@lemm.ee 10 months ago

      Huh?

      Stuff like llama.cpp really wants a GPU, and a 3090 is a great place to start.
