PeterPoopshit@lemmy.world 10 months ago

Yeah, just use llama.cpp, which runs on the CPU instead of the GPU. Any model you see on huggingface.co with “GGUF” in the name is compatible with llama.cpp, as long as you build llama.cpp from source from the GitHub repository.
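(Not part of the original comment, just a rough sketch of the same idea: if you’d rather call it from Python than build the CLI yourself, the llama-cpp-python bindings compile llama.cpp under the hood and can load any GGUF file on the CPU. The model filename below is a placeholder for whatever GGUF you download from huggingface.co.)

```python
# Minimal sketch using the llama-cpp-python bindings (pip install llama-cpp-python),
# which wrap llama.cpp and run GGUF models entirely on the CPU.
from llama_cpp import Llama

# Placeholder path: point this at any GGUF model downloaded from huggingface.co.
llm = Llama(model_path="./models/some-model.Q4_K_M.gguf", n_ctx=2048)

# Generate a short completion on the CPU and print the text.
output = llm("Q: Name one use for a local LLM. A:", max_tokens=64)
print(output["choices"][0]["text"])
```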

There is also GPT4All, which runs on top of llama.cpp and has a UI, but I’ve had trouble getting it to work.
