Comment on Google Says AI Could Break Reality
niucllos@lemm.ee 4 months ago
Sure, but so do a lot of other things that aren’t as costly. If NFTs were the first secure way to authenticate things online, we wouldn’t have had online banking until very recently.
Imacat@lemmy.dbzer0.com 4 months ago
True but trust is hard to establish in decentralized platforms like the fediverse. As far as I’m aware the only decentralized banking is unfortunately cryptocurrency.
itslilith@lemmy.blahaj.zone 4 months ago
What NFTs (and crypto in general) do is very different from a web of trust style approach
Crypto creates one source of absolute truth, the blockchain, computed at great cost via consensus.
Web of trust, on the other hand, requires you to declare which accounts you trust. Via public-private key signing, you can always verify that a post was actually made by a specific person, and if you trust that person (e.g. because you’ve met them before and exchanged keys), you know it’s legit. You can then extend that system by also trusting accounts your trusted accounts have verified, and so on.
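The "extend that system" part is just graph traversal over endorsements. Here’s a toy sketch (all names and the `max_hops` cutoff are made up for illustration; a real system like PGP would pair this with actual signature verification against each account’s public key):

```python
from collections import deque

def trusted_accounts(my_trust, endorsements, max_hops=2):
    """Return every account reachable within max_hops endorsement links.

    my_trust:     accounts I trust directly (keys I verified in person)
    endorsements: dict mapping an account to the accounts it has verified
    """
    seen = set(my_trust)
    frontier = deque((acct, 0) for acct in my_trust)
    while frontier:
        acct, hops = frontier.popleft()
        if hops == max_hops:
            continue  # don't follow endorsements past the trust horizon
        for endorsed in endorsements.get(acct, ()):
            if endorsed not in seen:
                seen.add(endorsed)
                frontier.append((endorsed, hops + 1))
    return seen

# I met alice and bob and exchanged keys; alice has verified carol,
# and carol has verified dave.
endorsements = {"alice": {"carol"}, "carol": {"dave"}}
print(trusted_accounts({"alice", "bob"}, endorsements, max_hops=1))
```

With `max_hops=1` I trust carol (alice vouched for her) but not dave, who is two hops out. Raising `max_hops` trades reach for certainty, which is exactly the tension in any web of trust.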
intensely_human@lemm.ee 4 months ago
We need to get a lot better about this kind of thing now that the cost of generating fake but structurally believable content/information has dropped.
Web of trust has always seemed like it’s for geeks so far. We need to enter a new phase of our cultural history, where competent knowledge of cryptographic games is commonplace.
Either that, or the geeks need to figure out a way to preserve civilization like monks in the dark ages, trading accurate science and news among their tiny networks, while the majority of insecure networks are awash in AI-generated psyops, propaganda, and scamspeak.
Or, we might get lucky and AI turns out to be inherently more ethical as it gets more intelligent, as a rule of nature or something.
It’s nice to imagine speech, in general, being a natural environment the human brain is evolutionarily adapted to. And speech among other humans is an environment we’re adapted to. We implicitly assume certain limitations in people’s ability to spin bullshit while keeping it error-free, for instance, so we have an instinct to trust more as we hear more of what a person is saying. We trust longer stories more, and we trust people the longer we know them.
But AI, even if it’s not fundamentally different from humans - i.e. even if it’s still bounded by the same rules of generating bullshit vs. just reporting the truth - can still get outside our natural detection systems just by being ten times faster.
I guess what I’m saying is this is like that moment back in the Precambrian, the Great Oxidation Event, when all the oxygen got released, and most of the life just got fucked and that was the end of their story. Just because a niche has been stable for a long time doesn’t mean it’s always going to be there.
Like, imagine a sci fi story about the entire atmosphere being stripped off of Earth, and the subsequent struggle for survival. How it would alter humanity’s history fundamentally, even if we survived, and even if we got the atmosphere back the human culture we knew would be gone.
That’s the level of event we’re facing. We’re in a sci fi story where the air is turning off and we all need to learn to live in vacuum and the only things we get to keep are the parts we can transform into airtight containers.
It might be that way right now, except instead of airtight containers it’s cryptographically-secure enclaves of knowledge and culture that will survive through the now presumably-endless period of history called “Airless Earth”.
Like having the atmosphere was the intro level of the game. Like in Far Cry 2, when you go to the second area, and it’s drier and more barren and there’s less ammo and cover and now they have roadblocks.
Our era of instinctively-navigable information is over. We’re all in denial because atmospheres don’t just go away, so we can’t deal with it, so it can’t be happening, so it’s not happening. But soon the denial won’t be possible anymore.