Comment on He has cancer — so he made an AI version of himself for his wife after he dies
OsrsNeedsF2P@lemmy.ml 5 months ago
He posted online, telling his friends it was time to say goodbye. Then his friend called him up, saying he had an opportunity at his company Eternos.Life for Bommer to build an interactive AI version of himself.
It doesn’t get more tech bro than that
Zaktor@sopuli.xyz 5 months ago
But in this case it seems like an entirely good thing? The offer was made by an actual friend, the guy himself wanted this, his wife too, and they’re both pretty cognizant about what this is and isn’t.
averyminya@beehaw.org 5 months ago
Yeah, contrary to all the negativity about this in the thread, I think there are a lot of worthwhile reasons for this that aren’t centered on dwelling on the loss of a loved one. Think of how many family recipes could be preserved. Think of the stories that could be retold to you in 10 years. Think of the little things you’d easily forget as time passes. These are all ways of keeping someone with us without making their death the main focus.
Yes, death and moving on are a part of life, but we also always say to keep people alive in our hearts. There are plenty of ways to keep people alive for us without them being present. I don’t think an AI version of someone inherently keeps their spirit from continuing on, nor does it inherently keep your loved one from living in the moment.
Also I can’t help but think of the Star Trek computer but with this. When I was young I had a close gaming friend who we lost too soon, he was very much an announcer personality. He would have been perfect for being my voice assistant, and would have thought it to be hilarious.
Anyway, I definitely see plenty of downsides, don’t get me wrong. The potential for someone to wallow with this is high. But I also think there are quite a few upsides, as mentioned: these memories aren’t ephemeral, and I think it’s somewhat fair to pick and choose good memories to pass down and remember. Quite a few old philosophical thought experiments are coming to fruition with tech these days.
frog@beehaw.org 5 months ago
An AI isn’t going to magically know these things, because these aren’t AIs based on brain scans preserving the person’s entire mind and memories. They can only learn from the data they’re given. And fortunately, there’s a much cheaper way for someone to preserve family recipes and other memories that their loved ones would like to hold onto: they could write them down, or record a video. No AI needed.
godzilla_lives@beehaw.org 5 months ago
We have a box of old recipe cards from my grandmother that my wife cherishes. My parents gifted them to her because, of all their daughters-in-law, she is the one who loves to cook and explore recipes the most. I just can’t imagine wanting something like that in a sterile technological form like an “AI-powered” app.
“But Trev, what if you used an LLM to generate summaries-” no, fuck off (he said to the hypothetical techbro in his ear).
intensely_human@lemm.ee 5 months ago
We solved this problem long before we invented writing.
LLMs do not enable the keeping of family memories. That’s been going on a long time.
Zaktor@sopuli.xyz 5 months ago
This is a weirdly “you should only do things the natural way” comment section for a Tech-based community.
Humans also weren’t “meant” to be on social media, or recording videos of themselves, or even building shrines or gravesites for their loved ones. These are just practices that have sprung up as technology and culture change. This very well could be an impediment to her moving on without him, but that’s her choice to make, and all this appeal to tradition is patronizing and doesn’t mean tradition is the right path for any given individual. The only right way to process death is:
intensely_human@lemm.ee 5 months ago
This will be communicating with a dead person. Nobody has any idea what this is and what it isn’t.
It’s like planning to go to Morocco and thinking you know in advance what it’s gonna be like.
This is new technology. People who think they know the outcomes here are deluding themselves.