Comment on oh ok
REDACTED@infosec.pub · 3 days ago

You’re thinking about biological lying. I’m talking about software.
en.wikipedia.org/wiki/Reasoning_system
If the question was to tell its darkest secret, but it chose to invent an entertaining story instead of answering factually, as other Anthropic LLM models did, then by the definition of a reasoning system, the system (the LLM) decided to lie. There are variables that made it “decide” on this route, so it is not a static, expected output. At its core, it was the system’s choice.
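To illustrate the point about variables driving the “choice”: here is a toy sketch (not Anthropic’s actual code; the candidate responses and probabilities are made up) of how sampling variables like temperature make an LLM’s output non-static for the same prompt:

```python
import random

def respond(prompt, temperature, rng):
    """Toy decoder: pick a response from a fixed distribution.
    The outcome depends on internal variables (temperature, RNG state),
    so the same prompt need not produce the same output."""
    candidates = {
        "factual answer": 0.4,
        "entertaining story": 0.6,
    }
    if temperature == 0:
        # Greedy decoding: always the single most likely option.
        return max(candidates, key=candidates.get)
    # Otherwise sample; sharpen or flatten the distribution by temperature.
    options = list(candidates)
    weights = [candidates[o] ** (1 / temperature) for o in options]
    return rng.choices(options, weights=weights)[0]

rng = random.Random(0)
print(respond("darkest secret?", temperature=0, rng=rng))    # deterministic pick
print(respond("darkest secret?", temperature=1.0, rng=rng))  # depends on RNG state
```

In this sketch the “decision” is just weighted sampling over internal state, which is the sense in which the output is a system’s choice rather than a fixed answer.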