Comment on Deepseek when asked about sensitive topics
fmstrat@lemmy.nowsci.com 10 months ago
Ollama’s version is distilled with Qwen or Llama depending on parameter size, so it’s going to behave very differently from the original model; it really is a different model.
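(If you want to check which base a given Ollama tag is distilled from, you can ask the local API for the model details. A minimal sketch in Python, assuming a default Ollama install listening on localhost:11434; the field names shown are what the /api/show endpoint typically returns, so treat them as an assumption.)

```python
import json
import urllib.request

# Query a local Ollama server (default port 11434) for model metadata.
# /api/show returns a "details" object with the base family and size,
# which for deepseek-r1:14b should report a Qwen-derived distill.
OLLAMA_SHOW_URL = "http://localhost:11434/api/show"  # assumes default local install


def show_model(name: str) -> dict:
    payload = json.dumps({"model": name}).encode("utf-8")
    req = urllib.request.Request(
        OLLAMA_SHOW_URL,
        data=payload,
        headers={"Content-Type": "application/json"},
    )
    with urllib.request.urlopen(req) as resp:
        return json.load(resp)


if __name__ == "__main__":
    details = show_model("deepseek-r1:14b").get("details", {})
    print("family:", details.get("family"))
    print("parameter size:", details.get("parameter_size"))
    print("quantization:", details.get("quantization_level"))
```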
codexarcanum@lemmy.dbzer0.com 10 months ago
Except if you look at the top of OP’s picture, they are also running deepseek-r1:14B through ollama. I downloaded my copy on Sunday, so these should be fairly comparable situations.
fmstrat@lemmy.nowsci.com 10 months ago
I… replied to a comment instead of the OP. Doh!