cross-posted from: programming.dev/post/36251461
Comments
- Hackernews.
Source: zack_overflow on X/Twitter.
Submitted 19 hours ago by Pro@programming.dev to technology@beehaw.org
https://i.imgur.com/Wc7Tbtk.jpeg
cross-posted from: programming.dev/post/36251461
Comments
- Hackernews.
Source: zack_overflow on X/Twitter.
I didn’t know this existed, but it’s obviously a bad idea
I really don’t understand why they just put LLMs in direct control of stuff and also reading the public internet without any kind of sandboxing, you’d think this concern would be the main design problem that needs to be worked around.
complete insanity that the browser/agent doesnt even ask for user confirmation before interpreting web pages as instructions. this is just AI XSS, just mental that the AI is configured to trust and execute instructions from unsanitized web content. how was this not one of the first problems raised during development prior to release?
LLMs fundamentally don’t/can’t have “sanitized” or “unsanitized” content - it’s all just tokens in the end. “Prompt Injection” is even a bit too generous of a term, I think.
sure but one would hope that if the agent is interpreting content from the web as instructions that there would be literally any security measure between the webpage and the agent - whether that’s some input sanitization, explicit user confirmation, or prohibiting the agent from interpreting web pages as instructions at all.
So … is this a bug or a feature?
And drains our freshwater reserves in order to do it.
The dumbest timeline.
They can drain all 0 of my dollars.
I worked bank customer service. They will typically allow several transactions “in good faith.” You can dispute them, but there is a chance that the transaction type cannot be refunded easily.
I often saw accounts go from $20 to -$600 due to overdraft fees, fees for being overdrawn for an extended time, etc. It is a major interruption to your life in these situations.
I specifically have an account that does not let you overdraft it. If the transaction would go even 1 cent over what’s in there, it denies it.
I had to protect me from myself 😔
CarbonIceDragon@pawb.social 19 hours ago
after years and many billions of dollars of technological development, we have finally invented a machine that can be scammed