Comment on 'LLM-free' is the new '100% organic' - Creators Are Fighting AI Anxiety With an ‘LLM-Free’ Movement
Zaktor@sopuli.xyz 6 months ago
I'm comparing ChatGPT's initial benchmarks to its capabilities today. Observable improvements have been made in less than two years. Even if you only want to track time from the development of modern LLM transformers ("Attention Is All You Need"/BERT), it's still a short history with major gains (AlexNet isn't really meaningfully related). These haven't been incremental changes on a slow and steady march to AI sometime in the sci-fi-scale future.
PeteBauxigeg@lemm.ee 6 months ago
AlexNet is related; it was the first use of consumer GPUs to train neural networks, no?
Zaktor@sopuli.xyz 5 months ago
No, not even remotely. And that's kind of like citing "the first program to run on a CPU" as the start of development for any new algorithm.
PeteBauxigeg@lemm.ee 5 months ago
As far as I can find out, there was only one use of GPUs for CNNs prior to AlexNet, and it certainly didn't have the impact AlexNet had. Besides, running this stuff on GPUs rather than CPUs is a relevant technological breakthrough; imagine how slow ChatGPT would be running on a CPU. And it's not at all as obvious as it seems: most weather forecasts still run on CPU clusters despite being obvious targets for GPUs.
Zaktor@sopuli.xyz 5 months ago
What? AlexNet wasn't a breakthrough in that it used GPUs; it was a breakthrough for its depth and performance on image recognition benchmarks.
We knew GPUs could speed up neural networks back in 2004, and I'm not sure that was even the first demonstration.