Thermodynamic Computing Promises Energy-Efficient AI Images
Thermodynamic computing uses physical circuits that change in response to noise, such as random thermal fluctuations in the environment, to perform low-energy computations. A recent spate of experiments, theories, and prototype hardware has shown it's especially good at randomization tasks, and it may prove equally good at diffusion-model tasks (e.g., image generation) in the future.
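To see why randomization hardware maps onto image generation, here is a minimal sketch (my own illustration, not from the article) of a toy reverse-diffusion loop in Python. The `denoise` function is a hypothetical stand-in for a trained network; the point is that every step draws fresh Gaussian noise, which is the kind of workload a noise-driven analog circuit could supply cheaply.

```python
import numpy as np

def denoise(x, t):
    # Hypothetical denoiser: in practice a trained neural network;
    # here just a placeholder that nudges samples toward zero.
    return -0.1 * x

def reverse_diffusion(shape=(8, 8), steps=50, step_size=0.05, rng=None):
    rng = np.random.default_rng() if rng is None else rng
    x = rng.normal(size=shape)          # start from pure noise
    for t in range(steps, 0, -1):
        noise = rng.normal(size=shape)  # fresh Gaussian noise every step --
                                        # the part thermodynamic hardware
                                        # would generate "for free"
        x = x + step_size * denoise(x, t) + np.sqrt(2 * step_size) * noise
    return x

if __name__ == "__main__":
    sample = reverse_diffusion()
    print(sample.shape, sample.mean())
```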
XLE@piefed.social 1 day ago
Just imagine if a hundredth of this effort was put into making things better for the average person
knightly@pawb.social 1 day ago
This makes a lot of sense, though: from the description it sounds like they're trying to build an NPU out of memristors. We've been expecting memristors to show up for this kind of math for a while now; if they can be made small, fast, and reliable enough, they'd cut a lot of redundant computation out of the layered matrix calculus that NPUs are optimized for.
And it’s not just for “AI”. A lot of problems, like physics modeling or speech recognition, can be reduced to matrix math. An analog, programmable memristor network can do that kind of calculus almost passively.
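A quick sketch of that idea (my own illustration, not from the comment): in an idealized memristor crossbar, each cell's conductance stores a weight, and applying input voltages while summing row currents yields a matrix-vector product in a single analog step, by Ohm's and Kirchhoff's laws. The simulation below just does the equivalent arithmetic digitally.

```python
import numpy as np

def crossbar_matvec(conductances: np.ndarray, voltages: np.ndarray) -> np.ndarray:
    """Simulate an idealized crossbar: output currents = conductances @ voltages."""
    return conductances @ voltages

# Toy example: a 3x4 "programmed" array multiplying a 4-element input vector.
G = np.array([[1.0, 0.5, 0.0, 2.0],
              [0.2, 1.5, 0.7, 0.0],
              [0.9, 0.0, 1.1, 0.3]])    # conductances (stored weights)
V = np.array([0.1, 0.2, 0.05, 0.15])    # applied voltages (inputs)
I = crossbar_matvec(G, V)               # row currents (outputs), one analog step
print(I)
```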
XLE@piefed.social 12 hours ago
To be fair, nuclear energy need not be “for AI” either. I just find it ironic that these technologies and explorations only come up because the industry demands it… Sort of. AI companies are hoarding GPUs they haven’t even installed, and rapid improvements just cause them to lose value even faster.
In a better world, the whole “for AI” thing wouldn’t even be a factor.