Is there any way to make it use less power as it gets more advanced, or will there be huge power plants dedicated to AI all over the world soon?
My understanding is that traditional AI essentially takes a brute-force approach to learning, and because it is hardwired, its ability to learn and make logical connections is impaired.
Newer technologies like organic computers built from neurons can change and adapt as they learn, forming new pathways for information to travel along, which reduces processing requirements and, in turn, power requirements.
ImplyingImplications@lemmy.ca 7 hours ago
It’s mostly the training/machine learning that is power hungry.
AI is essentially a giant equation that is generated via machine learning. You give it a prompt with an expected answer, it gets run through the equation, and you get an output. That output gets an error score based on how far it is from the expected answer. The variables of the equation are then modified so that the prompt will lead to a better output (one with a lower error).
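The loop described above can be sketched in a few lines of Python. This is a deliberately tiny toy with a single tunable variable and a made-up model (`output = w * x`); real models have billions of variables, but the train-score-adjust cycle has the same shape:

```python
# Toy sketch of the training loop: run input through the "equation",
# score the error, nudge the variable to shrink it (gradient descent).
def train(pairs, steps=1000, lr=0.01):
    w = 0.0  # the single tunable variable of our one-term "equation"
    for _ in range(steps):
        for x, expected in pairs:
            output = w * x             # run the prompt through the equation
            error = output - expected  # distance from the expected answer
            w -= lr * 2 * error * x    # adjust the variable to lower the error
    return w

# Learn that outputs should be 3x the inputs: w converges toward 3.0
w = train([(1.0, 3.0), (2.0, 6.0)])
print(w)
```

Every one of those nudges is a calculation, which is where the next comment's arithmetic comes from.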
The issue is that current AI models have billions of variables and are trained on billions of prompts. Each variable gets tuned based on each prompt, so that’s billions times billions of calculations. It takes a while. AI researchers are of course looking for ways to speed up this process, but so far it’s mostly come down to dividing those calculations over millions of computers. Powering millions of computers is where the energy costs come from.
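To put rough numbers on that, here is a back-of-envelope estimate. All the figures below (1 billion variables, 1 billion prompts, 10^12 updates per second per machine) are illustrative assumptions, not real measurements, and each "update" in real training involves far more arithmetic than one operation:

```python
# Back-of-envelope scale estimate (illustrative assumptions only)
variables = 1_000_000_000           # assumed: 1 billion tunable variables
prompts = 1_000_000_000             # assumed: 1 billion training prompts
updates = variables * prompts       # every variable touched per prompt: 10**18

rate = 1_000_000_000_000            # assumed: 10**12 updates/sec on one machine
single_machine_days = updates / rate / 86_400       # ~12 days for one pass
million_machine_secs = updates / (rate * 1_000_000) # ~1 second when distributed

print(updates, round(single_machine_days), million_machine_secs)
```

Even with these generous assumptions, a single pass over the data ties up one machine for days, and real training makes many passes with much heavier per-update work, which is why it gets spread across huge clusters.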
Unless AI models can be trained in a way that doesn’t require running a billion squared calculations, they’re only going to get more power-hungry.