Comment on Large Language Models in Video Games?
simple@lemm.ee 4 weeks ago
Not anytime soon. Nvidia tried, and nobody liked it. LLMs still suck at creative writing and need a ton of RAM/VRAM just to work. They also often get confused or trail off in any discussion/roleplay.
The only game that sort of made it work was Suck Up!, where you’re a vampire who has to convince an AI to let you into their house so you can suck their blood. It’s a fun concept, but even that game gets repetitive quickly, and the LLM is very stupid and random.
icecreamtaco@lemmy.world 4 weeks ago
They don’t need a ton of RAM if you use a tiny LLM customized for the game’s use cases, and that’s what games would be doing.
embed_me@programming.dev 4 weeks ago
The downside is that the tinier the model, the stupider it will be.
icecreamtaco@lemmy.world 4 weeks ago
Tiny models only get stupid like that because you’re compressing a general-purpose model that knows everything in the world further than it can handle. If you start with a model that only knows how to speak basic English plus information about a few hundred things in the game world, it can be much smaller.
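To put rough numbers on the RAM argument above: a quick back-of-envelope sketch of weight-only memory footprints. The model sizes and quantization levels here are illustrative assumptions, not measurements from any specific game, and the estimate ignores KV cache and activation memory.

```python
def weight_memory_gb(params_billion: float, bits_per_weight: int) -> float:
    """Approximate GB needed just to hold the weights:
    parameter count x bytes per weight (ignores KV cache / activations)."""
    return params_billion * 1e9 * bits_per_weight / 8 / 1e9

# A general-purpose 7B model at 16-bit vs a tiny game-specific 0.5B model at 4-bit:
print(f"7B @ 16-bit:  {weight_memory_gb(7, 16):.1f} GB")   # ~14.0 GB
print(f"7B @ 4-bit:   {weight_memory_gb(7, 4):.1f} GB")    # ~3.5 GB
print(f"0.5B @ 4-bit: {weight_memory_gb(0.5, 4):.2f} GB")  # ~0.25 GB
```

So a heavily quantized sub-billion-parameter model fits comfortably alongside a game's own VRAM budget, which is the scenario the comment is describing.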
Poopfeast420@discuss.tchncs.de 4 weeks ago
NVIDIA didn’t just try, they’re still doing it, and apparently you’ll soon be playing with these NVIDIA ACE NPCs in PUBG and a few other games.
www.youtube.com/watch?v=wEKUSMqrbzQ
Stovetop@lemmy.world 4 weeks ago
I don’t know what sounds more robotic, the AI or the script read for the player.