This makes a lot of sense though, from the description it sounds like they’re trying to build an NPU out of memristors. We’ve been expecting memristors to show up for this kind of math for a while: if we can make them small, fast, and reliable enough, they’d cut a lot of redundant computation out of the layered matrix multiplication that NPUs are optimized for.
And it’s not just for “AI”. A lot of problems, like physics modeling or speech recognition, can be reduced to matrix math. An analog, programmable memristor network can do that kind of math almost passively.
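To make the “almost passively” part concrete: in a memristor crossbar, each crosspoint’s programmed conductance acts as a matrix weight, and Ohm’s law plus Kirchhoff’s current law do the multiply-accumulate for free. Here’s a rough numpy sketch of what the circuit computes in a single analog step (the conductance and voltage values are made up for illustration, not from any real device):

```python
import numpy as np

# Each crosspoint stores a conductance G[i][j]. Driving the row wires with
# voltages V makes every column wire sum the currents flowing into it
# (Kirchhoff's current law), so the column currents ARE the matrix-vector
# product I = G^T @ V -- no multiply-accumulate loop, no clocked ALU.

G = np.array([[1.0, 0.5],      # conductances, arbitrary units (siemens)
              [0.2, 0.8],
              [0.3, 0.1]])
V = np.array([0.1, 0.2, 0.3])  # input voltages on the three row wires

# What the analog crossbar computes "for free":
I = G.T @ V                    # per-column current, Ohm's law summed per wire

# Same result, spelled out as the sum of per-crosspoint currents:
I_explicit = np.array([sum(G[i][j] * V[i] for i in range(3))
                       for j in range(2)])
assert np.allclose(I, I_explicit)
print(I)  # column currents = the matrix-vector product
```

That one physical step replaces the O(n·m) multiply-adds a digital NPU would spend on the same product, which is the whole appeal (modulo the small/fast/reliable caveats above).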
XLE@piefed.social 15 hours ago
To be fair, nuclear energy need not be “for AI” either. I just find it ironic that these technologies and explorations only come up because the industry demands them… Sort of. AI companies are hoarding GPUs they haven’t even installed, and rapid improvements just make them lose value even faster.
In a better world, the whole “for AI” thing wouldn’t even be a factor.
knightly@pawb.social 15 hours ago
Absolutely. But at least that bubble is popping. Soon we won’t have to worry about AI mania anymore. XD