Submitted 5 hours ago by cm0002@sh.itjust.works to technology@lemmy.zip
https://technode.com/2025/10/06/huawei-zurich-labs-new-open-source-tech-lets-llms-run-on-consumer-gpus/
wizzor@sopuli.xyz 3 hours ago
A little sparse on detail. I regularly run LLMs on 5-year-old CPUs, so no problem there; I wonder how the approach compares in memory requirements to existing quantization methods.
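For context on the quantization comparison (my own back-of-envelope numbers, not from the article): a rough weights-only memory estimate at common bit widths looks like this. Real usage is higher once you add the KV cache and activations.

```python
# Rough weights-only memory footprint at common quantization levels.
# Runtime usage is larger: KV cache and activations come on top.

def weight_memory_gib(n_params: float, bits_per_weight: float) -> float:
    """GiB needed to hold n_params weights at the given precision."""
    return n_params * bits_per_weight / 8 / 2**30

for name, bits in [("FP16", 16), ("INT8", 8), ("4-bit", 4)]:
    print(f"7B model @ {name}: {weight_memory_gib(7e9, bits):.1f} GiB")
# FP16 ≈ 13 GiB, INT8 ≈ 6.5 GiB, 4-bit ≈ 3.3 GiB
```

So a 4-bit quant of a 7B model already fits comfortably in consumer VRAM; the interesting question is whether the new method beats those numbers or just matches them with less quality loss.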