Comment on Someone got Gab's AI chatbot to show its instructions

<- View Parent
sweng@programming.dev ⁨7⁩ ⁨months⁩ ago

Moving goalposts, you are the one who said even 1000x would not matter.

The second one does not run on the same principles, and the same exploits would not work against it, e g. it does not accept user commands, it uses different training data, maybe a different architecture even.

You need a prompt that not only exploits two completely different models, but exploits them both at the same time. Claiming that is a 2x increase in difficulty is absurd.

source
Sort:hotnewtop