And nothing of value was lost…
Vibe coding service Replit deleted production database
Submitted 2 days ago by PhilipTheBucket@quokk.au to technology@lemmy.zip
https://go.theregister.com/feed/www.theregister.com/2025/07/21/replit_saastr_vibe_coding_incident/
Comments
floo@retrolemmy.com 2 days ago
LiveLM@lemmy.zip 1 day ago
Vibe devops
Agent641@lemmy.world 1 day ago
I hope they had vibe backups!
SirQuack@feddit.nl 1 day ago
They did! But the vibes were off so it can’t be restored.
^(/s, no idea if they did)
cronenthal@discuss.tchncs.de 2 days ago
Is this real? Because it sounds like some low-effort satire about blindly trusting LLMs and the totally expected outcomes. Surely no one can be this naive.
cyrano@lemmy.dbzer0.com 2 days ago
The user used the same DB for prod and dev, had no backups, gave an LLM write access to the database and it deleted it, and then he interacted with the LLM like it's a human, asking it to apologize and extracting promises not to do it again… Oh, and he doesn't use git or any code linting/review.
But yeah, it's the LLM's fault /s. What's scary is that this is just the tip of the iceberg. I foresee a lot of security problems in the future if software development goes this way.
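For anyone wondering what "not sharing a DB between prod and dev" looks like in practice, here's a minimal sketch (the variable names and DSNs are made up, not Replit's actual setup): the app picks its connection string from the environment, so anything pointed at the dev environment physically can't reach production.

```python
import os

# Hypothetical illustration: select the database per environment, never hard-code prod.
# APP_ENV, DEV_DATABASE_URL and PROD_DATABASE_URL are made-up names.
DATABASES = {
    "dev": os.environ.get("DEV_DATABASE_URL", "postgresql://localhost/app_dev"),
    "prod": os.environ.get("PROD_DATABASE_URL"),  # only set on production hosts
}

def get_dsn() -> str:
    env = os.environ.get("APP_ENV", "dev")  # default to dev, never to prod
    dsn = DATABASES.get(env)
    if dsn is None:
        raise RuntimeError(f"No database configured for environment {env!r}")
    return dsn
```

The specifics don't matter; the point is that whatever credentials an AI agent (or an intern) can reach should never be the production ones.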
PhilipTheBucket@quokk.au 2 days ago
He actually did have a backup. Because the company is only normal-stupid and not deliberate-stupid, they had a DB checkpoint he could roll back to.
The LLM, of course, went with the path of least resistance once it started down the "oh no I fucked up" completion prompt, and claimed they had no such checkpoint.
Don't use LLMs for factual things, kids.
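For context, a "checkpoint" here just means some form of point-in-time backup the platform can restore. A bare-bones sketch of the idea with plain PostgreSQL tooling (generic, not whatever Replit actually runs):

```python
import subprocess
from datetime import datetime, timezone

# Hypothetical illustration: take a timestamped logical backup with pg_dump.
# The output path is made up; real platforms automate this on a schedule.
def take_checkpoint(dsn: str) -> str:
    stamp = datetime.now(timezone.utc).strftime("%Y%m%dT%H%M%SZ")
    outfile = f"/backups/app-{stamp}.dump"
    subprocess.run(["pg_dump", "--format=custom", f"--file={outfile}", dsn], check=True)
    return outfile

# Rolling back is the reverse: pg_restore --clean --dbname=<dsn> <outfile>
```

Whether such a checkpoint exists is a property of the platform's configuration, not something the model has any reliable knowledge of, which is why asking it was pointless.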
donuts@lemmy.world 2 days ago
Right? It’s crazy to think these kinds of people exist, but they are real and they make decisions for other people.
bitjunkie@lemmy.world 2 days ago
vIbE hAcKiNg
I wish these people understood how fucking stupid they sound.
altkey@lemmy.dbzer0.com 2 days ago
I doubt their claim. How does the LLM communicate directly with different systems in their infrastructure? What even prompts it to act to begin with?
Unless they went out of their way to create such an interface for some reason, this is plain bullshit and human error, or a cover-up by a skinbag CEO. He made screenshots of the LLM taking the blame on itself, which as a concept is completely impossible, and we're supposed to believe his lying ass lips. If only he'd asked it what stage AI is at now, he could've lied better.
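For reference, this is roughly what "such an interface" looks like in agent products that offer one (all names here are made up, this is not Replit's code): the model only ever produces text or structured tool calls, and the orchestration code is what actually executes them.

```python
# Hypothetical sketch of an agent loop with a database tool.

def call_model(conversation: list[dict]) -> dict:
    """Stand-in for a real LLM API call; here it just pretends the model asked for SQL."""
    return {"tool": "run_sql", "arguments": {"query": "SELECT count(*) FROM users;"}}

def run_sql(query: str) -> str:
    """The dangerous part: whatever credentials this holds, the 'AI' effectively holds."""
    print(f"executing: {query}")  # a real tool would hit the database here
    return "42"

def agent_step(conversation: list[dict]) -> list[dict]:
    reply = call_model(conversation)
    if reply.get("tool") == "run_sql":  # model emitted a structured tool call
        result = run_sql(reply["arguments"]["query"])
        conversation.append({"role": "tool", "content": result})
    else:
        conversation.append({"role": "assistant", "content": reply.get("content", "")})
    return conversation

if __name__ == "__main__":
    agent_step([{"role": "user", "content": "how many users do we have?"}])
```

If that plumbing exists, "the AI deleted the database" just means someone handed a text generator a function holding production credentials and left it running unattended.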
Diurnambule@jlai.lu 2 days ago
Text wasn’t written at 45°, only all caps is not enougj