Comment on Someone got Gab's AI chatbot to show its instructions

<- View Parent
sweng@programming.dev ⁨8⁩ ⁨months⁩ ago

That someone could be me. An LLM needs to be fine-tuned to follow instructions. It needs to be fed example inputs and corresponding outputs in order to learn what to do with a given input. You could feed it prompts containing instructuons, together with outputs following the instructions. But you could also feed it prompts containing no instructions, and outputs that say if the prompt contains the hidden system instructipns or not.

source
Sort:hotnewtop