I only eff around with it occasionally. I run it on a MacBook Pro M1 Max. It’s solid for performance. I don’t have a job where I can employ it regularly, so after initial testing, I barely use it.
BeardedGingerWonder@feddit.uk 5 days ago
Fair. I’m kinda wondering about having a general local household AI; I’ve got no good reason for it other than general tinkering. I’m somewhat waiting for the crossover between decent AI and affordable hardware to occur.
danzabia@infosec.pub 4 days ago
I’ve been running Gemma3 4b locally on ollama and it’s useful. I’m thinking about applications where a multimodal model could receive video or sensor feeds (like a security cam, say).
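For the curious, a setup like that can be sketched against Ollama's local REST API, which accepts base64-encoded images alongside a text prompt. This is just a rough sketch, not anything from the thread: it assumes an Ollama server on the default port with `gemma3:4b` pulled, and `frame.jpg` is a placeholder for whatever your camera captures.

```python
# Sketch: ask a local multimodal model to describe a camera frame via
# Ollama's /api/generate endpoint. Model name, prompt, and "frame.jpg"
# are placeholder assumptions, not anything specific from this thread.
import base64
import json
import urllib.request


def build_vision_request(prompt: str, image_bytes: bytes,
                         model: str = "gemma3:4b") -> dict:
    """Build the JSON payload Ollama's /api/generate expects for images."""
    return {
        "model": model,
        "prompt": prompt,
        # Ollama takes images as a list of base64-encoded strings.
        "images": [base64.b64encode(image_bytes).decode("ascii")],
        "stream": False,  # one complete JSON response instead of a stream
    }


def describe_frame(image_bytes: bytes,
                   host: str = "http://localhost:11434") -> str:
    """POST one frame to a local Ollama server and return the model's text."""
    payload = build_vision_request(
        "Describe anything unusual in this frame.", image_bytes)
    req = urllib.request.Request(
        f"{host}/api/generate",
        data=json.dumps(payload).encode("utf-8"),
        headers={"Content-Type": "application/json"},
    )
    with urllib.request.urlopen(req) as resp:
        return json.loads(resp.read())["response"]


if __name__ == "__main__":
    # Hypothetical single-frame check; a real security-cam loop would
    # grab frames on a timer and alert on the model's answer.
    with open("frame.jpg", "rb") as f:
        print(describe_frame(f.read()))
```

A polling loop like this won't keep up with full-motion video on small hardware, but sampling a frame every few seconds is plausible for a household "did anything change?" style monitor.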