Leave it to Musk to make Sam Altman seem like a reasonable adult.
Elon Musk and Sam Altman clashed on X after Musk shared a post about a man who committed a murder-suicide following delusional conversations with ChatGPT
Submitted 3 weeks ago by Beep@lemmus.org to technology@beehaw.org
https://lemmus.org/pictrs/image/b08b254d-b678-4cc9-861b-17c169622ebd.webp
Comments
LibertyLizard@slrpnk.net 3 weeks ago
djsaskdja@reddthat.com 3 weeks ago
Altman is just Musk with better PR.
anachronist@midwest.social 3 weeks ago
Elon had similar PR until he fired them for not “keeping it real.”
LuckingFurker@lemmy.blahaj.zone 3 weeks ago
tal@lemmy.today 3 weeks ago
Never wrestle with a pig. You’ll just get dirty, and the pig enjoys it.
muhyb@programming.dev 3 weeks ago
Doesn’t matter since it’s pig vs pig in a mud championship.
Vodulas@beehaw.org 3 weeks ago
Altman already has plenty of mud on him.
Kissaki@beehaw.org 2 weeks ago
When the pig has a public platform it can be important to respond. Not for the other, but for the audience or community. Otherwise, public discourse becomes dominated by pigs.
(Staying in your analogy. I think pigs are better than that.)
doleo@lemmy.one 3 weeks ago
Also Grok is used by the Department of War. I wonder what they get up to.
Midnitte@beehaw.org 3 weeks ago
Hackworth@piefed.ca 3 weeks ago
Juice@midwest.social 2 weeks ago
ChatGPT will fail, Altman will take the fall, Microsoft will get the whole thing, and none of the harm will have been avoided.
prole@lemmy.blahaj.zone 2 weeks ago
“almost a billion people use it”
Doubt
irelephant@lemmy.dbzer0.com 2 weeks ago
I don’t 😔. It is really popular irl.
Not_mikey@lemmy.dbzer0.com 3 weeks ago
Surprised Grok hasn’t been tied to any deaths yet. I guess that’s one good thing about attaching it to a social media platform: someone can see your chat and tell you that drinking Clorox isn’t going to teleport you to an alien paradise.
Archangel1313@lemmy.ca 3 weeks ago
Meanwhile…Grok is busy posting CSAM all over Twitter.
Novocirab@feddit.org 3 weeks ago
It’s almost as if Elmo wants people to talk about something else.
nymnympseudonym@piefed.social 3 weeks ago
It’s past time for regulation. LLMs can generally detect when a conversation has turned delusional, psychotic, or toward self-harm. Those conversations should at minimum be logged and terminated.