Comment on AI hallucinations are getting worse – and they're here to stay

lvxferre@mander.xyz, 2 days ago

Yes, it is expensive. But most of that cost is not because of simple applications, like in my example with grammar tables. It’s because those models have been scaled up to a bazillion parameters and “trained” on a gorillabyte of scraped data, in the hope that they’ll magically reach sentience and stop telling you to put glue on pizza. The cost comes from meaning (semantics and pragmatics), not grammar.

Also, natural languages don’t really have nonsensical rules; sure, sometimes you see some weird stuff (like Italian genderbending plurals or English question formation), but even those are procedural: “if X, do Y”. LLMs are actually rather good at reproducing those procedural rules from examples in the data.
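To make the “if X, do Y” point concrete, here’s a toy Python sketch (my own illustration, not from any NLP toolkit) of English yes/no question formation as a procedural rule: invert the subject and the auxiliary if there is one, otherwise insert do-support.

```python
# Toy example only: English yes/no question formation as "if X, do Y".
# Assumes the clause is already split into (subject, verb, rest).

AUXILIARIES = {"is", "are", "was", "were", "can", "will", "have", "has"}

def to_yes_no_question(subject: str, verb: str, rest: str) -> str:
    """Apply the procedural rule for forming a yes/no question."""
    if verb.lower() in AUXILIARIES:
        # Rule: if the verb is an auxiliary, move it in front of the subject.
        return f"{verb.capitalize()} {subject.lower()} {rest}?"
    # Rule: otherwise, add do-support and keep the main verb in place.
    # (A real rule would also handle "does" + bare verb for 3rd person singular.)
    return f"Do {subject.lower()} {verb} {rest}?"

print(to_yes_no_question("She", "is", "reading"))    # Is she reading?
print(to_yes_no_question("They", "like", "pizza"))   # Do they like pizza?
```

The point isn’t that this covers English; it’s that the rule is a finite, mechanical procedure, exactly the kind of pattern a model can pick up from examples.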

> But I wish it had some broader use that would justify its cost.

I hope for the opposite: that they cut the costs down to match the current uses. Small models for specific applications, dirt cheap in both training and running costs.
