OpenAI's first open source language model since GPT-2
No thanks.
Submitted 4 days ago by ryujin470@fedia.io to technology@beehaw.org
https://www.theverge.com/openai/718785/openai-gpt-oss-open-model-release
paywall free version?
nathan@piefed.alphapuggle.dev 4 days ago
*if you have a laptop with 16GB of VRAM. Otherwise you'll be watching ollama hit your CPU for 5 minutes with no output
sefra1@lemmy.zip 4 days ago
Isn’t that true for most models until someone distils and quantises them so they can run on common hardware?
fuckwit_mcbumcrumble@lemmy.dbzer0.com 4 days ago
This is the internet, we’re only allowed to be snarky here.
CyberSeeker@discuss.tchncs.de 4 days ago
Yes, but 20 billion parameters is too much for most GPUs, regardless of quantization. You would need at least 14GB, and even that’s unlikely without offloading major parts to the CPU and system RAM (which kills the token rate).
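That 14GB figure can be sanity-checked with back-of-envelope math: weight memory is roughly parameter count times bits-per-weight, plus some allowance for KV cache and activations. A minimal sketch (the 2GB overhead figure is an assumption for illustration, not a measurement):

```python
# Rough VRAM estimate for running a quantized LLM locally.
# All figures are back-of-envelope assumptions, not benchmarks.

def vram_estimate_gb(params_billion: float, bits_per_weight: int,
                     overhead_gb: float = 2.0) -> float:
    """Weight memory plus a flat allowance for KV cache/activations."""
    weight_bytes = params_billion * 1e9 * bits_per_weight / 8
    return weight_bytes / 1e9 + overhead_gb

# A 20B-parameter model at common quantization levels:
for bits in (16, 8, 4):
    print(f"{bits}-bit: ~{vram_estimate_gb(20, bits):.0f} GB")
```

Even at 4-bit quantization this lands around 12GB, so a 20B model only fits on fairly high-end consumer GPUs unless you offload layers to system RAM.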
Ghoelian@lemmy.dbzer0.com 4 days ago
I mean yeah, but that doesn’t make the title any more true.