Comment on Someone got Gab's AI chatbot to show its instructions

<- View Parent
Silentiea@lemmy.blahaj.zone ⁨7⁩ ⁨months⁩ ago

Someone else can probably describe it better than me, but basically if an LLM “sees” something, then it “follows” it. The way they work doesn’t really have a way to distinguish between “text I need to do what it says” and “text I need to know what it says but not do”.

They just have “text I need to predict what comes next after”. So if you show LLM2 the input from LLM1, then you are allowing the user to design at least part of a prompt that will be given to LLM2.

source
Sort:hotnewtop