Comment on What’s next for Mozilla?

averyminya@beehaw.org ⁨5⁩ ⁨months⁩ ago

I’ve been hopeful for an external hardware device, something akin to Mythic AI’s analog accelerators. They offload the heavy matrix work normally done by the GPU, at far lower power consumption and roughly 98-99% accuracy, then send the results back to the computer to be digitized. Adding more tensor cores just adds more power draw, which is already a problem.
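To get a feel for that accuracy trade-off, here’s a minimal sketch (my own toy model, not Mythic AI’s actual design) that injects ~1% device variation into a matrix multiply, the way imperfect analog conductances would, and measures how far the result drifts from the exact digital answer:

```python
import numpy as np

rng = np.random.default_rng(0)

# Exact digital matrix multiply (what a GPU / tensor core computes).
W = rng.standard_normal((256, 256))
x = rng.standard_normal(256)
exact = W @ x

# Crude stand-in for an analog multiply: each weight is stored as a
# physical conductance with ~1% device-to-device variation.
noisy_W = W * (1 + 0.01 * rng.standard_normal(W.shape))
approx = noisy_W @ x

# Relative error lands in the low single-digit percent range,
# consistent with the "about 98-99% accuracy" figure above.
rel_err = np.linalg.norm(approx - exact) / np.linalg.norm(exact)
print(f"relative error: {rel_err:.3%}")
```

The point of the toy model: the noise is per-weight and uncorrelated, so it largely averages out across each dot product, which is why analog arrays can be "close enough" for neural nets while being useless for exact arithmetic.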

That company in particular was using this method for real-time AI tracking in cameras, but I feel like it could easily be adapted to replace the AI work NVIDIA is doing on GPUs. Why brute-force AI with power-hungry tensor cores when a few wires and some voltage can sift through the same or larger models at the same or faster speeds, with, well, about 98-99% accuracy? It could be a simple hardware attachment via PCIe, or hell, even USB, with a small bottleneck for conversion times. I just used an app to upscale a photo locally on my phone; it took about 14 minutes (Xperia 1 IV). I could easily have offloaded that work to an analog AI device. We’re nearly at the point where we can run “AI*” on a phone at close to PC speeds.
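A quick back-of-envelope check on whether USB transfer would really be the bottleneck for that kind of offload (all numbers here are my hypothetical assumptions: an uncompressed 12 MP RGB photo, a 2x upscale, and a USB 3 link at ~80% effective throughput):

```python
# Hypothetical offload: send one 12 MP RGB photo to an external
# accelerator, get a 2x-upscaled (4x the pixels) result back.
image_bytes = 12e6 * 3                 # ~36 MB uncompressed input
result_bytes = 4 * image_bytes         # ~144 MB upscaled output
usb3_bytes_per_s = 5e9 / 8 * 0.8       # 5 Gbps link, ~80% effective

transfer_s = (image_bytes + result_bytes) / usb3_bytes_per_s
print(f"round-trip transfer: {transfer_s * 1000:.0f} ms")
```

Under those assumptions the round trip is well under a second, so even a USB attachment would be a rounding error next to the ~14 minutes the phone spent computing on-device.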

All this to say: local AI indeed. The only way AI works is when everyone has access to it. Give full, free access to everybody and the fear of corporate interference drops drastically. There are plenty of models available online not made by Google or Microsoft, not pushing anything or harvesting data (remember to firewall your programs if you run them locally). Ideally tagsets would be open-sourced, but in a capitalist world I could also see independent artists selling models of their work under a license.
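On the firewalling point: the robust approach is OS-level per-app rules (or a sandbox that drops network access entirely), but here’s a minimal in-process sketch of the idea, patching Python’s socket factory so a supposedly "local only" model run can’t quietly phone home:

```python
import socket

_real_socket = socket.socket

def _blocked(*args, **kwargs):
    # Any attempt to open a socket from this process fails loudly.
    raise RuntimeError("network access blocked for local AI run")

# Install the guard before loading/running the model code.
socket.socket = _blocked

blocked = False
try:
    socket.create_connection(("example.com", 80), timeout=1)
except RuntimeError:
    blocked = True  # the guard fired before any packet left the machine
finally:
    socket.socket = _real_socket  # restore for the rest of the program

print("connection blocked:", blocked)
```

This only guards pure-Python code in the same process; native extensions can still open sockets directly, which is exactly why a real firewall rule on the OS side is the stronger version of the same advice.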

* Of course, “AI” here is the broad umbrella term covering model-based projects, LLMs for assistants, and generative imaging, not actual AI in the sense of a semi-autonomous intelligence.
