Comment on Someone got Gab's AI chatbot to show its instructions

<- View Parent
sweng@programming.dev ⁨8⁩ ⁨months⁩ ago

Can you explain how you would jailbfeak it, if it does not actually follow any instructions in the prompt at all? A model does not magically learn to follow instructuons if you don’t train it to do so.

source
Sort:hotnewtop