Well, writing “operator” or “human” or “transfer” or “what the @#$” or something similarly irritated may help.
gravitas_deficiency@sh.itjust.works 1 year ago
That is 100% a bot, and whoever made the bot just stuck in a custom regex to match “user@sld.tld” instead of using a standardized domain validation lib that actually handles cases like yours correctly.
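For illustration (the bot’s actual pattern isn’t known, so the regex and sample addresses below are purely assumptions), here’s a minimal Python sketch of how a hand-rolled “user@sld.tld” check ends up rejecting perfectly deliverable addresses:

```python
import re

# Hypothetical naive pattern of the kind described above: it only accepts
# "local@secondlevel.tld" with a dot-free domain label and a 2-4 letter TLD.
NAIVE_PATTERN = re.compile(r"^[A-Za-z0-9_.]+@[A-Za-z0-9-]+\.[A-Za-z]{2,4}$")

def naive_is_valid(address: str) -> bool:
    """Return True if the address matches the naive regex."""
    return NAIVE_PATTERN.fullmatch(address) is not None

# Perfectly valid, deliverable addresses the naive check rejects:
examples = [
    "jane.doe+tickets@mail.example.co.uk",  # plus tag, subdomain, multi-label TLD
    "user@example.technology",              # modern TLD longer than 4 letters
    "o'connor@example.com",                 # apostrophe is legal in the local part
]

for addr in examples:
    print(f"{addr}: {'accepted' if naive_is_valid(addr) else 'rejected'}")
```

A standards-aware validation library, or simply sending a confirmation mail to the address, avoids this whole class of false rejections.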
tory@lemmy.world 1 year ago
But using a standardized library would be 3PP and require a lot of paperwork for some reason.
doctorcrimson@lemmy.today 1 year ago
It might even be worse than that: imagine if they let one of those learning algorithms handle their customer service.
echodot@feddit.uk 1 year ago
There are loads that do. In this case it would be better, because it would actually understand what constitutes an email rather than running some standard script with no comprehension of what it’s doing.
The difference between AI and automated script responses is that AI is actually thinking at some level.
doctorcrimson@lemmy.today 1 year ago
I think AI generally tries to bullshit more often than it participates in what the user wants to accomplish. It would be like speaking with a customer support agent who doesn’t actually work for the company, is a pathological liar, and has a vested interest in making you give up as fast as possible.
echodot@feddit.uk 1 year ago
That’s not what AI is though.
An AI is pretty good at doing whatever it’s programmed to do; it’s just that you have to check that the thing it’s programmed to do is actually the thing you want it to do. Things like ChatGPT are general-purpose AI and essentially exist more or less as a product demonstration rather than an actual industry implementation.
When companies use AI they use their own version trained on their own data sets.
Syndic@feddit.de 1 year ago
I’d say it’s a bug in the design as it clearly fails to work with a completely fine email.
TheGreenGolem@lemm.ee 1 year ago
They meant that the company is intentionally trying NOT to help the customer, in the hope that they just give up at some point. (That’s why they redirect you to bots and not to an actual human.)
Trainguyrom@reddthat.com 1 year ago
I’ve encountered plenty of poor souls in equally poor countries, getting paid a pittance, who entirely seem like bots.
Deiv@lemmy.ca 1 year ago
Lol, why would that be true? They want to help; they just have a shitty bot.
TheAndrewBrown@lemm.ee 1 year ago
It’d be a lot easier to not make a bot at all if that were the case. They aren’t intentionally trying not to help; they’re intentionally spending as few resources as possible on helping while still doing enough to satisfy most customers. It’s shitty, but it’s not malicious like you guys are implying.