Any M-series Mac will do too
linearchaos@lemmy.world 1 month ago
I send a lot of stuff through Copilot and I’ve never gotten anything remotely sexual with it set to precise mode.
Do keep in mind that running your personal thoughts, questions, and ideas through an AI will probably get your personal thoughts, questions, and ideas fed into their marketing system.
If you have a half-decent Nvidia card and some spare time to wait for responses to come back, self-hosting Ollama isn’t that difficult.
You could actually set your own guardrails up: in the config you can add some post-instructions to every prompt, specifically instructing it not to get emotional.
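In Ollama that kind of per-prompt steering usually lives in a Modelfile’s SYSTEM block. A minimal sketch — the model name and guardrail wording here are just placeholders, tune them to taste:

```
FROM llama3
# Hypothetical guardrail wording: keep responses flat and factual
SYSTEM "You are a neutral technical assistant. Do not use emotional language, endearments, or roleplay."
PARAMETER temperature 0.2
```

You’d build and run it with `ollama create guarded -f Modelfile` and then `ollama run guarded`.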
The YouTuber NetworkChuck recently did a decent video covering the setup and guardrail security.
lepinkainen@lemmy.world 1 month ago
linearchaos@lemmy.world 1 month ago
You can pretty much use anything that’s not Intel Iris, with varying levels of work.
The real fight is throwing enough VRAM at it.
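A rough back-of-envelope for that VRAM fight: weights take roughly (parameters × bits ÷ 8), plus some headroom for the KV cache and activations. This is just a sketch — the 20% overhead factor is an assumption, not anything Ollama actually reports:

```python
def vram_gb(params_billion: float, bits: int = 16, overhead: float = 1.2) -> float:
    """Rough VRAM estimate in GB: weight size plus ~20% for KV cache/activations."""
    return params_billion * (bits / 8) * overhead

# A 7B model: ~16.8 GB at fp16, but only ~4.2 GB at 4-bit quantization
print(round(vram_gb(7), 1))          # fp16
print(round(vram_gb(7, bits=4), 1))  # 4-bit quant
```

Which is why quantized models are usually the difference between "fits on my card" and "doesn’t".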
Emerald@lemmy.world 1 month ago
God thanks for the IT class PTSD flashback. Idk why but Network Chuck is so annoying for me. The way he just shills his coffee. Sorry Chuck
linearchaos@lemmy.world 1 month ago
LOL, he’s corny and tries to be a little too flashy. But at least he gets to the point quickly and tends to cover all the basics without shooting off into a useless tangent. (sans the coffee)
This is the first time I think I’ve ever mentioned him as reference material, but it was mostly because he covered the guardrail concept that no one else I’ve watched talked about.