
lvxferre@mander.xyz 1 week ago

[Replying to myself as this is a tangent]

I think the “bots can generate misinfo even if you just feed them correct info” point deserves its own example.

Let’s say you’re making a model. It looks at the preceding word and tries to predict the next one. And you feed it the following sentences, both true:

1. Humans are apes.
2. Cats are felines.

From those two sentences the bot “learnt” five words, and also how to connect them: for example, “are” can be followed by either “apes” or “felines”, both with the same weight. Then, when you ask the bot to generate sentences, it may produce the following:

3. Humans are felines.
4. Cats are apes.

And you got bullshit!

What large models do is a far more complex version of the above, looking at much more than just the immediately preceding word, but it’s the same in spirit.
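For anyone curious, the toy model above fits in a few lines of Python. This is just a sketch of the word-pair (bigram) idea described here, not how real large models are implemented:

```python
import random

# Two true sentences as training data.
corpus = ["Humans are apes.", "Cats are felines."]

# Build a bigram table: each word maps to the list of words seen after it.
table = {}
for sentence in corpus:
    words = sentence.rstrip(".").split()
    for prev, nxt in zip(words, words[1:]):
        table.setdefault(prev, []).append(nxt)

# table == {"Humans": ["are"], "Cats": ["are"], "are": ["apes", "felines"]}
# Note: "are" can be followed by "apes" or "felines", with equal weight.

def generate(start):
    """Generate a sentence by repeatedly picking a random observed successor."""
    words = [start]
    while words[-1] in table:
        words.append(random.choice(table[words[-1]]))
    return " ".join(words) + "."

# generate("Humans") returns either "Humans are apes." (true)
# or "Humans are felines." (bullshit), each half the time.
```

Even though every training sentence was true, half of what this thing generates is false, because all it ever stored was which word can follow which.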
