ChatGPT creates phisher’s paradise by serving wrong URLs
Submitted 1 month ago by cm0002@lemmy.world to cybersecurity@infosec.pub
https://www.theregister.com/2025/07/03/ai_phishing_websites/
besselj@lemmy.ca 1 month ago
It’s called slopsquatting when it’s hallucinated package names that vibe coders blindly install, but I figure the same term works for hallucinated URLs.
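The attack only works because nothing verifies whether an LLM-suggested package name or URL is real before it gets used. A minimal sketch of that kind of check in Python (the package name, URL, and domain allowlist below are hypothetical examples for illustration, not anything from the article):

```python
import urllib.error
import urllib.request
from urllib.parse import urlparse

def package_exists_on_pypi(name: str) -> bool:
    """Return True if the package name is actually registered on PyPI.

    A hallucinated (slopsquattable) name will 404 until an attacker
    registers it, at which point a blind `pip install` becomes dangerous.
    """
    url = f"https://pypi.org/pypi/{name}/json"
    try:
        with urllib.request.urlopen(url, timeout=10) as resp:
            return resp.status == 200
    except urllib.error.HTTPError:
        return False

# Hypothetical allowlist of official login domains -- the point is that an
# AI-suggested URL should be matched against a list you control, not trusted.
OFFICIAL_DOMAINS = {"login.example-bank.com", "www.example-bank.com"}

def url_is_trusted(suggested_url: str) -> bool:
    """Return True only if the URL's hostname is on the allowlist."""
    host = urlparse(suggested_url).hostname or ""
    return host in OFFICIAL_DOMAINS

if __name__ == "__main__":
    # Hypothetical hallucinated package name and lookalike phishing URL.
    print(package_exists_on_pypi("totally-real-auth-lib"))
    print(url_is_trusted("https://examp1e-bank-login.com/signin"))
```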