Comment on "Ollama bug allows drive-by attacks - patch now"

TehPers@beehaw.org 1 week ago

This makes me less enthusiastic about local models. I mean, nothing on the internet is inherently secure and the patch came quickly, but local LLMs being hackable in the first place opens a new can of worms.

Everything downloaded from the internet is hackable. Web browsers are the most notorious targets and regularly have to patch exploitable vulnerabilities. What matters is how quickly a project fixes a vulnerability and how it prevents similar ones in the future.

Personally, when I do run Ollama, it’s always from within a container. I mostly do this because I find it more convenient to run it this way, but it also adds a degree of separation between its runtime environment and my personal files. Note that a container is not a sandbox (especially since it still uses my GPU and executes code locally); it’s just a small extra layer of protection.
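For reference, a minimal sketch of that setup using the official ollama/ollama Docker image. The --gpus flag assumes the NVIDIA container toolkit is installed, and binding the published port to 127.0.0.1 keeps the API off the local network (relevant here, since the patched bug was exploitable through a locally exposed endpoint):

```sh
# Run Ollama in a container with GPU access (assumes Docker + NVIDIA container toolkit).
# Binding to 127.0.0.1 keeps the API unreachable from other machines on the network.
docker run -d \
  --name ollama \
  --gpus=all \
  -v ollama:/root/.ollama \
  -p 127.0.0.1:11434:11434 \
  ollama/ollama

# Then pull and chat with a model inside the container (llama3 is just an example):
docker exec -it ollama ollama run llama3
```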
