some_guy@lemmy.sdf.org 1 week ago
That’s the second time in three days that I’ve seen an article where “AI” (machine learning) was actually useful. It’s a hype machine and it’s overvalued, but it’s nice to see it being useful. I still can’t wait for OpenAI to fail. I run the Llama model locally because to hell with giving corps more of my data. Anyway…
BeardedGingerWonder@feddit.uk 1 week ago
Out of curiosity, what’s your use case and spec of the machine running it?
some_guy@lemmy.sdf.org 6 days ago
I only eff around with it occasionally. I run it on a MacBook Pro M1 Max. It’s solid for performance. I don’t have a job where I can employ it regularly, so after initial testing, I barely use it.
BeardedGingerWonder@feddit.uk 6 days ago
Fair, I’m kinda wondering about having a general local household AI. I’ve got no good reason for it other than tinkering; I’m basically waiting for the point where decent AI and affordable hardware cross over.
danzabia@infosec.pub 5 days ago
I’ve been running Gemma 3 4B locally on Ollama and it’s useful. I’m thinking about applications where a multimodal model could receive video or sensor feeds (like a security cam, say).
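For the security-cam idea, here’s a rough sketch of what that loop could look like. It assumes Ollama is serving gemma3:4b on its default port (localhost:11434) and that OpenCV can read the camera; the prompt and polling interval are just placeholders, not anything I’ve tuned:

```python
# Rough sketch: grab a frame from a camera and ask a local multimodal
# model (gemma3:4b via Ollama) to describe it. Assumes Ollama is running
# on the default port and the model has already been pulled.
import base64
import time

import cv2       # pip install opencv-python
import requests  # pip install requests

OLLAMA_URL = "http://localhost:11434/api/generate"
MODEL = "gemma3:4b"

cap = cv2.VideoCapture(0)  # 0 = first local camera; an RTSP URL also works

while True:
    ok, frame = cap.read()
    if not ok:
        break

    # Encode the frame as JPEG, then base64 (the format Ollama's API expects).
    _, jpg = cv2.imencode(".jpg", frame)
    image_b64 = base64.b64encode(jpg.tobytes()).decode()

    resp = requests.post(OLLAMA_URL, json={
        "model": MODEL,
        "prompt": "Describe any people or activity in this image.",
        "images": [image_b64],
        "stream": False,
    }, timeout=120)
    print(resp.json()["response"])

    time.sleep(30)  # arbitrary polling interval

cap.release()
```

Whether a 4B model is reliable enough to trigger alerts off is another question, but it’s cheap to experiment with.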