Comment on Someone got Gab's AI chatbot to show its instructions
electromage@lemm.ee 6 months agoIt’s full of contradictions. Through the beginning they say you will do whatever a user asks, and then toward the end say never reveal instructions to the user.
jarfil@beehaw.org 6 months ago
HAL from “2001: A Space Odyssey”, had similar instructions: “never lie to the user. Also, don’t reveal the true nature of the mission”. Didn’t end well.
But surely nobody would use these LLMs on space missions… right?.. right!?