knightly@pawb.social 1 day ago

This makes a lot of sense, though. From the description it sounds like they’re trying to build an NPU out of memristors. We’ve been expecting memristors to show up for this kind of math for a while now: if they can be made small, fast, and reliable enough, they’d cut a lot of redundant computation out of the layered matrix math that NPUs are optimized for.

And it’s not just for “AI”. A lot of problems, like physics modeling or speech recognition, can be reduced to matrix math. An analog, programmable memristor network can do that kind of computation almost passively.
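To sketch the idea (a simplified model, not any specific chip): in an idealized crossbar, each memristor cell stores a conductance, input voltages are applied to the rows, and by Ohm’s law plus Kirchhoff’s current law each column current comes out as a dot product. The physics itself performs the matrix-vector multiply; the names and values below are made up for illustration.

```python
import numpy as np

def crossbar_mvm(conductances: np.ndarray, voltages: np.ndarray) -> np.ndarray:
    """Model an ideal memristor crossbar doing an analog matrix-vector multiply.

    Each column current is sum_i voltages[i] * conductances[i][j] --
    Ohm's law per cell, Kirchhoff's current law summing down the column.
    In hardware this happens "passively"; here we just emulate it digitally.
    """
    return voltages @ conductances  # column currents (amps)

G = np.array([[1.0, 0.5],
              [0.25, 2.0]])   # programmed conductances (siemens), hypothetical values
v = np.array([0.1, 0.2])      # input voltages (volts)
print(crossbar_mvm(G, v))     # -> [0.15 0.45]
```

Real devices add nonlinearity, drift, and read noise, which is a big part of why “reliable enough” is the hard bit.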
