someacnt_@lemmy.world 5 months ago
Yeah, and improvements will require paradigm changes. I don’t see that from GPT.
Comment on: `answer = sum(n) / len(n)`
rbesfe@lemmy.ca 5 months ago
Moore’s law died a long time ago, and AI isn’t getting any more efficient
Daxtron2@startrek.website 5 months ago
GPT is not the be-all and end-all of LLMs
someacnt_@lemmy.world 5 months ago
Are there LLMs with different paradigms?
Daxtron2@startrek.website 5 months ago
GPT is not a paradigm; it’s a specific model family developed by OpenAI. You’re thinking of the transformer architecture. Check out a project like RWKV if you want to see a unique approach.
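For intuition on why RWKV counts as a different approach: a toy numpy sketch (my own simplification, not RWKV’s actual code) of the core difference. Standard attention re-reads the entire key/value cache for every new token, so per-step cost grows with sequence length; an RWKV-style linear recurrence folds the whole past into a fixed-size state, so each step is O(1).

```python
import numpy as np

def attention_step(q, ks, vs):
    # Standard attention: each new token scores against ALL cached keys,
    # so per-step work and memory grow with sequence length T.
    scores = np.exp(ks @ q)                       # (T,)
    return (scores[:, None] * vs).sum(axis=0) / scores.sum()

def rwkv_like_step(state_num, state_den, k, v, decay):
    # RWKV-flavored linear recurrence (toy version): the past is folded
    # into a running exponentially-decayed numerator/denominator, so the
    # state size is constant no matter how long the sequence gets.
    state_num = decay * state_num + np.exp(k) * v
    state_den = decay * state_den + np.exp(k)
    return state_num, state_den, state_num / state_den
```

Both compute a softmax-like weighted average of past values; the recurrent form just never has to revisit old tokens, which is what makes linear-time inference possible.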
Daxtron2@startrek.website 5 months ago
Then you haven’t been paying attention. There have been huge strides in the field of small open language models, which can run inference locally on a phone with low enough power consumption to be practical.
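A back-of-envelope sketch (my own numbers, using a hypothetical 3B-parameter model as a stand-in for the “small open model” class) of why quantization is what makes on-device inference fit:

```python
def model_size_gb(n_params, bits_per_weight):
    # Rough weight-memory footprint only; ignores activations,
    # KV cache, and runtime overhead.
    return n_params * bits_per_weight / 8 / 1e9

# Hypothetical 3B-parameter model:
fp16 = model_size_gb(3e9, 16)  # ~6 GB of weights: a stretch for most phones
q4   = model_size_gb(3e9, 4)   # ~1.5 GB: fits comfortably in phone RAM
```

Fewer bytes moved per token also translates fairly directly into lower energy per token, which is the other half of the efficiency claim.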