Comment on Researchers show that training on “junk data” can lead to LLM “brain rot”

jarfil@beehaw.org 3 weeks ago

Using a complex GPT-4o prompt, they sought to pull out tweets that focused on “superficial topics”

Wait a moment… They asked an LLM to tell them what was “junk”, and another LLM, trained on what the first LLM marked as junk, turned out to be a junk LLM?

It talks about model collapse, but this smells like research collapse.
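For the curious, this is roughly the kind of LLM-as-judge filter the quoted line describes. A sketch only, assuming the official OpenAI Python client; the prompt wording, labels, and criteria here are illustrative, not the paper's.

```python
# Sketch of LLM-as-judge tweet filtering (illustrative, not the paper's actual prompt).
# Assumes `pip install openai` and an API key in the OPENAI_API_KEY environment variable.
from openai import OpenAI

client = OpenAI()

# Hypothetical judging prompt; the study's real prompt and criteria are not reproduced here.
JUDGE_PROMPT = (
    "You are labeling tweets for a data-quality study. "
    "Reply with exactly one word: JUNK if the tweet is clickbait, engagement bait, "
    "or about a superficial/sensationalist topic; KEEP otherwise.\n\nTweet: {tweet}"
)

def is_junk(tweet: str) -> bool:
    """Ask GPT-4o to label a single tweet as junk or not."""
    response = client.chat.completions.create(
        model="gpt-4o",
        messages=[{"role": "user", "content": JUDGE_PROMPT.format(tweet=tweet)}],
        temperature=0,
    )
    return response.choices[0].message.content.strip().upper().startswith("JUNK")

# Toy usage: keep only the tweets the judge model does not flag as junk.
tweets = [
    "You won't BELIEVE what this celebrity did next!!!",
    "New measurements tighten the bounds on the muon g-2 anomaly.",
]
kept = [t for t in tweets if not is_junk(t)]
print(kept)
```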
