Comment on Large Language Models in Video Games?
simple@lemm.ee 15 hours ago
Not anytime soon. Nvidia tried, and nobody liked it. LLMs still suck at creative writing and need a ton of RAM/VRAM just to run. They also often get confused or trail off in any discussion/roleplay.
The only game that sort of made it work was Suck Up!, where you’re a vampire who has to convince an AI to let you into their house so you can suck their blood. It’s a fun concept, but even that game gets repetitive quickly, and the LLM is very stupid and random.
icecreamtaco@lemmy.world 9 hours ago
They wouldn’t need a ton of RAM if you used a tiny LLM customized for the game’s use cases, and that’s what games would be doing.
Poopfeast420@discuss.tchncs.de 15 hours ago
NVIDIA didn’t just try, they’re still doing it, and apparently you’ll soon be playing with these NVIDIA ACE NPCs in PUBG and a few other games.
www.youtube.com/watch?v=wEKUSMqrbzQ
Stovetop@lemmy.world 11 hours ago
I don’t know which sounds more robotic, the AI or the scripted lines read for the player.