Not a word on Chinese models being censored in the article. What an odd omission.
It should also be pretty obvious that this is following the usual Chinese MO of using massive state subsidies to destroy the international competition with impossibly low dumping prices. We are seeing this in all sorts of sectors.
MagicShel@lemmy.zip 5 days ago
I went to install it this morning to check it out, but I had to decline once I read the privacy policy. I might check it out on my desktop, where I have a lot more tools to ensure my anonymity, but I’m not installing it on my phone. There is not one scrap of data you generate that they aren’t going to hoover up, combine with data they get from anyone who will sell it to them, and then turn around and resell it.
I’m sure other apps are just as egregious, which is one reason I’ve been deliberately moving away from native apps to PWAs. Yes, everything you can possibly do on the internet is a travesty for privacy, but I’m not going to be on the leading edge of giving myself up to be sold.
qprimed@lemmy.ml 5 days ago
kudos on poking at the app privacy statement. the real interest in this is going to be running it locally on your own server backend.
so, yeah - as usual, apps bad, bad, bad. but the backend is what really matters.
pupbiru@aussie.zone 4 days ago
it’s actually pretty easy to run locally as well. obviously not as easy as just downloading an app, but it’s gotten relatively straightforward and the peace of mind is nice
check out ollama, and find an ollama UI
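as a rough idea of what that looks like once ollama is installed and serving locally, here’s a minimal sketch using its Python client. the model tag `deepseek-r1:7b` is an assumption for illustration; pull whatever size your hardware can actually handle with `ollama pull` first.

```python
# Minimal sketch: chatting with a locally hosted model via the ollama Python client.
# Assumes the ollama daemon is running on localhost and the model tag below has
# already been pulled (the tag "deepseek-r1:7b" is an assumption -- swap in a
# smaller or larger variant depending on your GPU/CPU).
import ollama

response = ollama.chat(
    model="deepseek-r1:7b",  # assumed model tag, adjust to what you've pulled
    messages=[
        {"role": "user", "content": "Explain why running a model locally helps with privacy."},
    ],
)

# The reply text is under message -> content in the response.
print(response["message"]["content"])
```

nothing leaves your machine in that setup, which is the whole point; any of the ollama web UIs just wrap calls like this.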
MagicShel@lemmy.zip 4 days ago
That’s not the monster model, though. But yes, I run AI locally (barely on my 1660). What I can run locally is pretty decent in limited ways, but I want to see the o1 competitor.