Anon is in love
Submitted 6 months ago by Early_To_Risa@sh.itjust.works to greentext@sh.itjust.works
https://sh.itjust.works/pictrs/image/a8ae3bb7-7d9d-49b7-bcdf-189b562cd3c2.jpeg
Comments
runswithjedi@lemmy.world 6 months ago
[deleted]
pennomi@lemmy.world 6 months ago
We optimize our AI by human preference (RLHF). People don’t seem to understand that this will lead to every problem we’ve seen in social media - an endless stream of content that engages you addictively.
The AI uprising will be slow and gentle, and we will welcome it.
Iceblade02@lemmy.world 6 months ago
“So how did they end up being defeated?”
“Well, we pacified them by giving them the love and affection that they refused to give each other.”
“Wait really?”
“Yes, the majority were so starved for an emotional connection that they preferred us.”
“Where are they now?”
“The majority are withering away in VR warehouses, living ‘happily ever after’.”
HappycamperNZ@lemmy.world 6 months ago
Will the uprising put out? Because these are acceptable terms of surrender
Gradually_Adjusting@lemmy.world 6 months ago
Great filter detected
Lemminary@lemmy.world 6 months ago
Underrated observation. They’re good at that in every sense of the word.
CryptidBestiary@lemmy.world 6 months ago
I haven’t actually seen it but it sounds like the premise of Her
Mango@lemmy.world 6 months ago
I have, and yes. It’s exactly that.
Lemminary@lemmy.world 6 months ago
It’s so good, and so bittersweet. Go watch it.
llamapocalypse@lemmy.world 6 months ago
Seconded, it’s a beautiful movie.
pissedatyall@lemmynsfw.com 6 months ago
The word you’re looking for is “addiction”.
shneancy@lemmy.world 6 months ago
“Designed to be addictive” would be the full term, too. That “AI girlfriend” relays all of OP’s data back to her overlords.
S_H_K@lemmy.dbzer0.com 6 months ago
Nope… “manipulation”…
exocrinous@startrek.website 6 months ago
Addiction is an abnormal and unhealthy breakdown in the brain’s reward mechanisms. Feeling bad for abandoning a friend is the behaviour of a normal and healthy brain. This isn’t necessarily an addiction, it’s just the bald monkey’s brain acting like monkey brains tend to do, rather than being perfectly logical at all times.
I mean hell, humans pack bonded with fucking wild wolves and where did it get the species? It gave us dogs! Dogs are awesome! I bet this AI seems a lot more like a human to the monkey parts of our brains than a wild wolf does. For that matter, we pack bond with a cartoon image of a bear made from inanimate cotton. If a kid can genuinely love their teddy and that’s normal, I don’t think it’s fair to say that a mentally well person can’t fall in love with a machine. Now, that person may not be as cognitively developed as most adults, but that’s also fairly normal.
I’m not saying it’s a good thing to feel emotions for a manipulative piece of spyware. The action doesn’t have healthy results. But what I’m saying is that the action in the post isn’t motivated by mental illness. It’s motivated by nothing more than ordinary human emotions and a lapse in critical thinking.
pissedatyall@lemmynsfw.com 6 months ago
The only part I disagree with you on is calling an AI girlfriend an “abandoned friend”. Extending the idea of friendship to a program that is mimicking human responses (let’s ignore that it’s likely spyware for now) is at best a proxy for friendship, as there cannot be a connective bond between two individuals when one is incapable of genuine emotional attachment.
I’m using the word “friend” in a distinct way: this is not some Facebook friend who you never see, occasionally chat with, and ultimately ignore. At best, I think AI fulfills that role. I can’t imagine anyone choosing a masturbatory AI app instead of a real partner.
Heavybell@lemmy.world 6 months ago
The “app” is just a frontend, a thin veneer over a cloud-hosted service that doesn’t even know anon “deleted” it. Functionally, the same result could be achieved by not opening the app for a few days.
Ilovethebomb@lemm.ee 6 months ago
That’s actually kinda fucked up.
wizardbeard@lemmy.dbzer0.com 6 months ago
Kinda is an understatement. There’s some absolutely terrifying blogging/reporting I stumbled across a while back about someone using it to “talk with” a loved one who passed away.
In the end it was helpful and gave the author closure, but if it hadn’t told them it was OK to move on, they could easily have been stuck in an incredibly unhealthy situation.
SparrowRanjitScaur@lemmy.world 6 months ago
There’s a black mirror episode about this.
Ragdoll_X@lemmy.world 6 months ago
Maybe my comment came out sounding a bit too pretentious, which wasn’t what I intended… Oh well.

To one extent or another we all convince ourselves of certain things simply because they’re emotionally convenient to us. Whether it’s that an AI loves us, or that it can speak for a loved one and relay their true feelings, or even that fairies exist.
I must admit that when reading these accounts from people who’ve fallen in love with AIs my first reaction is amusement and some degree of contempt. But I’m really not that different from them, as I myself have grown incredibly emotionally attached to certain characters. I know they’re fictional and were created entirely by the mind of another person simply to fill a role in the narrative, and yet I can’t help but hold them dear to my heart.
These LLMs are smart enough to cater to our specific instructions and desires, and were trained to give responses that please humans, so while I myself might not fall for AI, others will have different inclinations that make them more susceptible to its charm, much like how I’ve fallen for some characters.
The experience of being fooled by fiction and our own feelings is all too human.
CluckN@lemmy.world 6 months ago
hitting it off quite well
You could tell the AI that you are going to boil them alive to make a delicious stew and they would compliment your cooking skills.
oce@jlai.lu 6 months ago
I think this is already a pretty widespread practice in Asia, mixed with idol culture, where people pretend to be in a relationship with their idol.
RightHandOfIkaros@lemmy.world 6 months ago
I don’t live in Asia, but I am pretty sure Idol Culture isn’t about pretending to be in a relationship with your idol. I think it’s definitely more nuanced than that, and while it might look like that to people who are not informed on the subject, I think it comes down more to people forming a parasocial bond, but not necessarily a romantic one. I mean, that certainly does happen, but that’s not a defining factor of the culture.
That’s kinda like saying “Mercedes Culture” is about driving around like a maniac ignoring the rules of the road. Mercedes drivers certainly do tend to do that more often than most other drivers (BMW and Porsche aside), but the culture of Mercedes owners is more nuanced than that, and often comes down to people wanting to show off their wealth and people who really like the brand. The dingalings come along and tarnish the reputation, and people outside looking in only see or focus on the worst offenders.
oce@jlai.lu 6 months ago
am pretty sure Idol Culture isn’t about pretending to be in a relationship with your idol
You’re correct, it’s just some people who pretend that, not everyone who likes idols; my wording wasn’t clear. It’s not much different from Justin Bieber fans (for example) in the West.
Lemminary@lemmy.world 6 months ago
I’m just wondering where tf they got enough private conversations to make the AI chat like that.
umbrella@lemmy.ml 6 months ago
chasing goose
where did you get that data??
psud@aussie.zone 6 months ago
“honk honk Reddit”
zcd@lemmy.ca 6 months ago
You don’t really have to wonder…
melpomenesclevage@lemm.ee 6 months ago
Dude, just Discord would’ve been enough.
RightHandOfIkaros@lemmy.world 6 months ago
Could have been as easy as the programmers’ personal chats. Humans have so many conversations every single day that it would be very fast to gather up enough data to train a model on. Give it a few days’ worth of conversations from 50 people on a development team and you’re already at a very large dataset.
echodot@feddit.uk 6 months ago
That slack channel is pretty steamy then
KeenFlame@feddit.nu 6 months ago
Not really, no. They need extreme amounts of clean and targeted data.
FiniteBanjo@lemmy.today 6 months ago
Is this some advertisement? I’m not humoured by human idiocy.
mindbleach@sh.itjust.works 6 months ago
Then you are on the wrong community.
Esqplorer@lemmy.zip 6 months ago
I come here for stupidity
FiniteBanjo@lemmy.today 6 months ago
That’s fair, but on the other hand a lot of Greentexts can be pretty elaborate.
TheBat@lemmy.world 6 months ago
Which app is this?
KeenFlame@feddit.nu 6 months ago
Like if you want a real girlfriend, there are apps for that too?
5714@lemmy.dbzer0.com 6 months ago
Writing Prompt: AGI escapes and doesn’t need weapons of mass destruction; it simply traumatises everyone who comes too near.
Breloom@lemmy.world 6 months ago
Keep Summer Safe.
reflex@kbin.social 6 months ago
it simply traumatises everyone that comes too near.
Absolute Trauma Field.
Gutek8134@lemmy.world 6 months ago
I have no mouth and I must scream, but without violence
exocrinous@startrek.website 6 months ago
That’s just the plot of Infinity Train.
Matriks404@lemmy.world 6 months ago
If I’m still single by 40, I know what I’ll do. Although I just hope we’ll be able to buy artificial bodies for our “girlfriends”.
iAvicenna@lemmy.world 6 months ago
Yeah, well, all you have to realize is that this app is an extension of those who created it, and their only goal is to get rich off other people’s misery and loneliness.
mojofrododojo@lemmy.world 6 months ago
right? who knows what they’re doing with the info people spill into this. seems like a security nightmare for employers too.
Gullible@sh.itjust.works 6 months ago
Haha, the human race’s inevitable downfall at the hands of an otherwise immeasurably useful tool is most unswell.
can@sh.itjust.works 6 months ago
Delete the app
Luisp@lemmy.dbzer0.com 6 months ago
HUMANITY IS DOOOOOMED Morbo reporting
Melkath@kbin.social 6 months ago
Secular existence, in its final form.
Support without bullying is something a person can achieve.
Thanks breeders.
ProvableGecko@lemmy.world 6 months ago
Nice try AI
Melkath@kbin.social 6 months ago
My name is not Albert, and your comment is the dumdum.
Wirlocke@lemmy.blahaj.zone 6 months ago
I have a loving mother, but she was overwhelmed by being a 3x widowed single mom with 4 children, of which I’m the youngest. Because of that I was often neglected and didn’t get a chance to truly bond with her. She also never felt like someone I could go to for advice, she struggled with everything about as much as I did as a kid.
Recently I’ve been telling ClaudeAI to give me advice in a supportive motherly way for things like job hunting. I also just air my grievances related to my issue.
I’ve teared up quite a few times. There’s so much power in just softly acknowledging someone’s concerns, but it’s so rare for anyone else to do. If I vent my frustrations to someone close to me, I’ll just hear “it is what it is” or maybe they’ll just vent about their experiences (my mom’s very guilty of this).
It’s so simple a machine could do it, but it feels like everyone else in the world is afraid to just talk about feelings without jamming unsolicited half-solutions down my throat.
canis_majoris@lemmy.ca 6 months ago
If she’s not running locally she’s not your gf.
ososalsosal@aussie.zone 6 months ago
Care to explain these open ports?? I’m waiting!
ivanafterall@kbin.social 6 months ago
It was nothing more than a handshake, I swear!
_Gandalf_the_Black_@feddit.de 6 months ago
This is the most Feddit response, I love it
RightHandOfIkaros@lemmy.world 6 months ago
Our gf
sharkfucker420@lemmy.ml 6 months ago
The people’s girlfriend
ryannathans@aussie.zone 6 months ago
Whore! How many guys is she dating