Comment on Your CV is not fit for the 21st century
Megaman_EXE@beehaw.org 4 days ago
Sorry if this is a stupid question, but is there a good place to figure out how to run LLMs locally? It seems safer than entering personal data onto a server somewhere.
ulo@beehaw.org 4 days ago
As a start, you could take a look at Ollama, which seems to be available in many package managers if you use one. I’ve done some experimenting with mistral-nemo, but you should pick a model size appropriate to your hardware and use case. I believe there are GUIs and extensions for Ollama, but as someone with a low interest in LLMs, I’ve only used the bare bones features through my terminal, and I haven’t used it for any projects or tasks.
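For reference, the bare-bones terminal workflow looks roughly like this (assuming Ollama is already installed and its service is running; `mistral-nemo` is just the model mentioned above — substitute whatever fits your hardware):

```shell
# Download a model from the Ollama registry (this can be several GB)
ollama pull mistral-nemo

# Start an interactive chat session with the model in your terminal
ollama run mistral-nemo

# Or pass a one-off prompt non-interactively
ollama run mistral-nemo "Explain what a local LLM is in one sentence."

# See which models you have downloaded locally
ollama list
```

Everything here runs on your own machine, which is the whole point: the prompt and the response never leave your computer.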
You definitely shouldn’t trust it to teach you anything (I’ve seen some highly concerning errors in my tests), but it might be useful to you if you can verify the outputs.
Also check out the PrivacyGuides page on LLMs.
Megaman_EXE@beehaw.org 4 days ago
Thank you for the information! Yeah, I don’t really trust them. They feel flimsy and unreliable for most things, though they have their moments where they actually seem helpful.
I hate their usage overall; I just figure that if I need one to help me land a job at some point, I should probably have some extra options ready.