Comment on What’s next for Mozilla?

taanegl@beehaw.org 11 months ago
Say it with me now: local AI, local AI… or fuck off.

InfiniWheel@lemmy.one 11 months ago
I mean, so far their most recent attempt at AI is a local AI based on PrivateGPT, called MemoryCache.
etrotta@beehaw.org 11 months ago
The vast majority of consumer devices, both mobile and laptops/desktops, are not yet powerful enough to run local AI with a good user experience, and even if they were, a lot of users would still prefer having it run in the cloud rather than drain their phone’s battery.
averyminya@beehaw.org 11 months ago
I’ve been hopeful for an external hardware device, something akin to MythicAI’s analog hardware. It essentially offloads the heavy-duty work done by the GPU, with far lower power consumption and about 98-99% accuracy, then sends the output data back to the computer to be digitized. Adding more tensor cores just increases power consumption, which is already an issue.
That company in particular was using this method for real-time AI tracking in cameras, but I feel like it could be easily adapted to effectively eliminate the work in AI that NVIDIA is doing for GPUs. Why brute-force AI with power and tensor cores when a couple of wires and some voltage can sift through the same or larger models at the same or faster speeds with, well okay, about 98-99% accuracy? It could be a simple hardware attachment via PCIe, or hell, even USB with a small bottleneck for conversion times. I just used an app to upscale a photo locally on my phone; it took about 14m (Xperia 1IV). I could easily have offloaded that work to an analog AI device. We are nearly to the point where we can just run “AI*” on a phone at nearly PC speeds.
All this to say - local AI indeed. The only way AI works is when everyone has access to it. Give full, free access to everybody and the fear of corporate interference drops drastically. There are plenty of models available online that aren’t made by Google or Microsoft, pushing whatever agenda or harvesting data back (remember to firewall your programs if you run them locally). Ideally tagsets could be open sourced, but in the capitalist world I could also see independent artists selling models of their work under a license.
/* Of course, AI here is a broad-spectrum term encompassing model-based projects, LLMs for assistants and generative imaging, and not actual AI as a semi-autonomous intelligence
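On the “firewall your programs” tip: a proper OS-level firewall rule is the real answer, but one crude in-process sketch (hypothetical, Python, not tied to any particular model runner) is to replace the socket class before loading untrusted code, so any attempt to phone home fails loudly:

```python
import socket

def block_network():
    """Crude in-process guard: any outbound connect() raises.
    A real firewall rule is stronger; this is just a demo."""
    class _NoNetSocket(socket.socket):
        def connect(self, address):
            raise RuntimeError(f"blocked outbound connection to {address}")
    socket.socket = _NoNetSocket

block_network()

# Anything imported after this point that tries to open a
# connection through socket.socket will fail loudly.
try:
    s = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
    s.connect(("example.com", 443))
except RuntimeError as e:
    print("blocked:", e)
```

Libraries that grab a reference to `socket.socket` before you patch it, or that use lower-level APIs, will slip past this, which is why it complements rather than replaces a firewall.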
jarfil@beehaw.org 11 months ago
Neuromorphic hardware seems to be best suited.
averyminya@beehaw.org 11 months ago
Oh yeah, Intel’s version of that was looking promising too.
bilb@lem.monster 11 months ago
Local by default, option to go remote. Even the privacy-first types might want to offload that to a more powerful local machine.
They could even sell access to a Mozilla provided AI server like they do with the VPN service.
taanegl@beehaw.org 11 months ago
Maybe some “Folding@Home” kind of thing to offload public AI projects, i.e. decentralised processing.
Tau@sopuli.xyz 11 months ago
They use local models for Firefox Translations, so I would expect they would do something similar.