Comment on Someone got Gab's AI chatbot to show its instructions

Gaywallet@beehaw.org ⁨6⁩ ⁨months⁩ ago

It’s hilariously easy to get these AI tools to reveal their prompts

Image

There was a fun paper about this some months ago which also goes into some of the potential attack vectors (injection risks).

source
Sort:hotnewtop