Comment on Someone got Gab's AI chatbot to show its instructions

<- View Parent
sweng@programming.dev ⁨7⁩ ⁨months⁩ ago

I’m confused. How does the input for LLM 1 jailbreak LLM 2 when LLM 2 does mot follow instructions in the input?

The Gab bot is trained to follow instructions, and it did. It’s not surprising. No prompt can make it unlearn how to follow instructions.

It would be surprising if a LLM that does not even know how to follow instructions (because it was never trained on that task at all) would suddenly spontaneously learn how to do it.

source
Sort:hotnewtop