Comment on Someone got Gab's AI chatbot to show its instructions

sweng@programming.dev 6 months ago

No. Consider a model that has been trained on a bunch of inputs, where the corresponding output for each has been either “yes” or “no”. Why would it suddenly produce something completely different that coincidentally happens to be the input?
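A minimal sketch of the kind of model the comment describes, assuming a toy PyTorch binary classifier (all names and sizes here are illustrative, not from the thread): the output layer has exactly two logits, so the only things the model can ever emit are the labels “yes” and “no”. Echoing the input back is structurally impossible for it.

```python
import torch
import torch.nn as nn

LABELS = ["yes", "no"]  # the entire output space of this model

class YesNoClassifier(nn.Module):
    """Hypothetical toy classifier: maps any token sequence to one of two labels."""

    def __init__(self, vocab_size: int = 10_000, dim: int = 64):
        super().__init__()
        self.embed = nn.EmbeddingBag(vocab_size, dim)  # toy text encoder
        self.head = nn.Linear(dim, len(LABELS))        # exactly 2 logits

    def forward(self, token_ids: torch.Tensor) -> str:
        # (1, num_tokens) -> (1, dim) -> (1, 2): pick the higher-scoring label
        logits = self.head(self.embed(token_ids.unsqueeze(0)))
        return LABELS[int(logits.argmax())]

model = YesNoClassifier()
tokens = torch.randint(0, 10_000, (12,))  # stand-in for any input prompt
print(model(tokens))  # always "yes" or "no"; it cannot reproduce the prompt
```

The design point this illustrates: with a two-logit head there is no decoding path back to the input tokens, which is the comment's argument for why such a model would not leak its input verbatim.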
