Comment on You probably shouldn't trust the info anyway.
possiblylinux127@lemmy.zip 1 month ago
Use LLMs running locally. Mistral is pretty solid and isn’t a surveillance tool or censorship-heavy. It will happily write a poem about obesity.
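For anyone wondering what “running locally” looks like in practice, here’s a minimal sketch using the Ollama Python client. It assumes you’ve done `pip install ollama`, the local Ollama server is running on its default port, and the `mistral` model has already been pulled; the prompt is just an example.

```python
# Minimal sketch: chat with a locally hosted Mistral model through Ollama.
# Assumes a local Ollama server and that `mistral` has been pulled already.
import ollama

response = ollama.chat(
    model="mistral",  # any locally pulled model tag works here
    messages=[{"role": "user", "content": "Write a short poem about obesity."}],
)

# The request only goes to the local Ollama server; nothing is sent to a third party.
print(response["message"]["content"])
```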
BaroqueInMind@lemmy.one 1 month ago
Hermes3 is better in every way.
bruhduh@lemmy.world 1 month ago
Hermes 8B is better than Mixtral 8x7B?
BaroqueInMind@lemmy.one 1 month ago
Hermes 3 is based on Llama 3.1; Mixtral 8x7B is based on Llama 2. Take a guess which one is better.
MystikIncarnate@lemmy.ca 1 month ago
Okay, but fucking pages sounds like a good way to get papercuts in places I don’t want papercuts.
possiblylinux127@lemmy.zip 1 month ago
ollama.com/library/hermes3
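If you want to try that tag in the same local setup, a rough sketch with the Ollama Python client (assuming the daemon is running and `hermes3` is the tag from that page):

```python
# Rough sketch: pull the hermes3 tag from the Ollama library and query it locally.
# Assumes `pip install ollama` and a running local Ollama server.
import ollama

ollama.pull("hermes3")  # downloads the model listed at ollama.com/library/hermes3

reply = ollama.chat(
    model="hermes3",
    messages=[{"role": "user", "content": "Write a short poem about obesity."}],
)
print(reply["message"]["content"])
```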
BaroqueInMind@lemmy.one 1 month ago
What are you talking about? It follows Meta’s Llama 3 license, and pretty much every LLM that isn’t a dogshit copyright-stealing Alibaba Qwen model uses it.
possiblylinux127@lemmy.zip 1 month ago
Mistral is licensed under the Apache License, version 2.0. That license is recognized as free software by the GNU Project and as open source by the Open Source Initiative, because it protects your freedom.
Meanwhile, the Meta license places restrictions on use and imposes arbitrary requirements. Those requirements are why I choose not to use it. The question of LLM licensing is still open, but I certainly don’t want a EULA-style license full of rules and restrictions.