How much do you know about transformers?
Have you ever programmed an interpreter for interactive fiction / MUDs? It’s a great example of what even super tiny models can accomplish.
Also consider that Firefox or Electron apps require more RAM and CPU, and waste more energy, than small language models do. A Gemma SLM can translate text into English using less energy than it takes to open a modern browser.
itkovian@lemmy.world 1 week ago
I am not implying that transformer-based models have to be huge to be useful. I am only talking about LLMs. I am questioning the purported goal of LLMs, i.e., to replace humans in as many creative fields as possible, in the context of their cost, both environmental and social.