ChatGPT creates phisher’s paradise by serving wrong URLs
Submitted 3 days ago by cm0002@lemmy.world to cybersecurity@infosec.pub
https://www.theregister.com/2025/07/03/ai_phishing_websites/
besselj@lemmy.ca 3 days ago
It’s called slopsquatting when it’s hallucinated libraries that vibe coders install, but I figure the same name works for hallucinated URLs
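To make the slopsquatting risk concrete: a minimal sketch (not from the article, and the script name and flow are my own assumption) of sanity-checking a package name an LLM suggests before you `pip install` it. A name that doesn’t resolve on PyPI is exactly the gap a squatter could register.

```python
# check_pkg.py - verify that LLM-suggested package names actually exist on PyPI
# before installing them. Uses PyPI's public JSON API (404 = unregistered name).
import sys
import urllib.error
import urllib.request


def package_exists_on_pypi(name: str) -> bool:
    """Return True if PyPI knows about this package name, False if it's unregistered."""
    url = f"https://pypi.org/pypi/{name}/json"
    try:
        with urllib.request.urlopen(url, timeout=10) as resp:
            return resp.status == 200
    except urllib.error.HTTPError as err:
        if err.code == 404:
            return False  # unregistered name -- a squatter could claim it
        raise


if __name__ == "__main__":
    for pkg in sys.argv[1:]:
        status = "exists" if package_exists_on_pypi(pkg) else "NOT FOUND (squattable)"
        print(f"{pkg}: {status}")
```

Existence alone doesn’t prove a package is safe, of course; it only catches the case where the name was never real in the first place.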