Comment on GOG job listing for a Senior Software Engineer notes "Linux is the next major frontier"

Goodeye8@piefed.social 1 day ago

None of the things you brought up as positives are things an LLM does. Most of them existed before modern transformer-based LLMs were even a thing.

LLMs are glorified text-prediction engines, and nothing about their nature makes them excel at formal languages. An LLM doesn't know any rules and has no internal logic. For example, if the training data consistently exhibits the same flawed piece of code, an LLM will spit out the same flawed piece of code, because that's the most likely continuation of its current "train of thought". You would have to fine-tune the model around all those flaws and then hope that no combination of prompts leads the model back into that flawed data.
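To make that concrete, here's a toy frequency-only next-token predictor in Python. Everything here is invented for illustration (the `strcpy`/`strncpy` tokens, the 9-to-1 counts), but it shows the mechanism: a statistical predictor has no notion of correctness, so whatever pattern dominates the training data wins.

```python
from collections import Counter, defaultdict

# Toy training corpus where a flawed snippet (no bounds check) appears
# nine times for every one occurrence of the safer variant.
corpus = (
    ["strcpy(", "buf,", "input)"] * 9 +        # flawed pattern, 9 occurrences
    ["strncpy(", "buf,", "input,", "n)"]       # safe pattern, 1 occurrence
)

# Count bigram frequencies: for each token, how often each next token follows.
bigrams = defaultdict(Counter)
for prev, nxt in zip(corpus, corpus[1:]):
    bigrams[prev][nxt] += 1

def predict(token: str) -> str:
    """Greedy pick: the single most frequent continuation in the training data."""
    return bigrams[token].most_common(1)[0][0]

# The predictor reproduces the flawed majority pattern, not the correct one:
print(predict("buf,"))  # 'input)' -- frequency wins, correctness never enters
```

A real LLM is vastly more sophisticated than a bigram counter, but the objective is the same shape: predict the likely continuation, not the correct one.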

I've used LLMs to generate SQL, which according to you is something they should excel at, and I've had to fix literal syntax errors that would have prevented the statement from executing. A regular SQL linter would instantly flag the statement as invalid, but an LLM can't catch those errors because it doesn't understand the syntax.
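For contrast, here's a minimal sketch of that kind of deterministic check, using Python's built-in sqlite3 module as the parser. The broken statement is a hypothetical stand-in (a stray comma before FROM) for the sort of error described above:

```python
import sqlite3

# Hypothetical LLM-generated statement with a literal syntax error:
# the trailing comma before FROM.
llm_sql = "SELECT id, name, FROM users WHERE active = 1"

def syntax_ok(statement: str) -> bool:
    """Feed the statement to SQLite's parser via EXPLAIN; the query never runs."""
    conn = sqlite3.connect(":memory:")
    try:
        # The table only exists so the parser doesn't fail on a missing name.
        conn.execute("CREATE TABLE users (id INTEGER, name TEXT, active INTEGER)")
        conn.execute(f"EXPLAIN {statement}")
        return True
    except sqlite3.OperationalError as err:
        print(f"rejected: {err}")
        return False
    finally:
        conn.close()

print(syntax_ok(llm_sql))  # False -- 'near "FROM": syntax error'
```

The parser rejects the statement every single time, because it actually applies the grammar; an LLM will happily emit it whenever the token probabilities line up that way.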
