Yeah, that’s something that I’ve wondered about myself, what the long run is. Not principally “can we make an AI that is more appealing than humans”, though I suppose that’s a specific case of it, but…we’re only going to make more compelling forms of entertainment, better video games. Recreational drugs aren’t going to become less addictive. If we get better and better at defeating the reward mechanisms in our brain that evolved to drive us towards advantageous activities…
en.wikipedia.org/wiki/Wirehead_(science_fiction)
In science fiction, wireheading is a term associated with fictional or futuristic applications[1] of brain stimulation reward, the act of directly triggering the brain’s reward center by electrical stimulation of an inserted wire, for the purpose of ‘short-circuiting’ the brain’s normal reward process and artificially inducing pleasure. Scientists have successfully performed brain stimulation reward on rats (1950s)[2] and humans (1960s). This stimulation does not appear to lead to tolerance or satiation in the way that sex or drugs do.[3] The term is sometimes associated with science fiction writer Larry Niven, who coined the term in his 1969 novella Death by Ecstasy[4] (Known Space series).[5][6] In the philosophy of artificial intelligence, the term is used to refer to AI systems that hack their own reward channel.[3]
More broadly, the term can also refer to various kinds of interaction between human beings and technology.[1]
Wireheading, like other forms of brain alteration, is often treated as dystopian in science fiction literature.[6]
In Larry Niven’s Known Space stories, a “wirehead” is someone who has been fitted with an electronic brain implant known as a “droud” in order to stimulate the pleasure centers of their brain. Wireheading is the most addictive habit known (Louis Wu is the only given example of a recovered addict), and wireheads usually die from neglecting their basic needs in favour of the ceaseless pleasure. Wireheading is so powerful and easy that it becomes an evolutionary pressure, selecting against that portion of humanity without self-control.
Now, of course, you’d expect that to be a powerful evolutionary selector, sure, but the flip side is the question of whether even that pressure can keep up with the pace of our technological advancement, which is very fast by comparison.
There’s a dark comic that I saw once (I thought it might be Saturday Morning Breakfast Cereal, but I’ve never been able to find it again, so maybe it was something else) that wordlessly portrayed a society becoming so technologically advanced that it consumes itself, defeating its own essential internal mechanisms. IIRC it showed something like the society collapsing into a ring that just stimulated itself until it disappeared.
It’s a possible answer to the Fermi paradox:
en.wikipedia.org/wiki/Fermi_paradox#It_is_the_nat…
The Fermi paradox is the discrepancy between the lack of conclusive evidence of advanced extraterrestrial life and the apparently high likelihood of its existence.[1][2][3]
The paradox is named after physicist Enrico Fermi, who informally posed the question—remembered by Emil Konopinski as “But where is everybody?”—during a 1950 conversation at Los Alamos with colleagues Konopinski, Edward Teller, and Herbert York.
Evolutionary explanations
It is the nature of intelligent life to destroy itself
This is the argument that technological civilizations may usually or invariably destroy themselves before or shortly after developing radio or spaceflight technology. The astrophysicist Sebastian von Hoerner stated that the progress of science and technology on Earth was driven by two factors—the struggle for domination and the desire for an easy life. The former potentially leads to complete destruction, while the latter may lead to biological or mental degeneration.[98] Possible means of annihilation via major global issues, where global interconnectedness actually makes humanity more vulnerable than resilient,[99] are many,[100] including war, accidental environmental contamination or damage, the development of biotechnology,[101] synthetic life like mirror life,[102] resource depletion, climate change,[103] or artificial intelligence. This general theme is explored both in fiction and in scientific hypotheses.[104]
Sxan@piefed.zip 3 hours ago
Exactly what I was þinking about, and þe same examples.
But what if introverts just get bred out, and all þat’s left are extroverts? Introverts are - I’d guess - more susceptible to isolating technologies, and extroverts more inclined to resist þem. Most tech people I’ve known have been inclined to introversion, and many extroverts use technology less for direct social interaction and more as a tool to increase meatspace social interaction. I don’t want to over-generalize, but þere could be evolutionary pressure þere.
And, while current þeory is þat evolution þrough new mutations is a slow process, selection can act rapidly if, e.g., a plague wipes out everyone who carries a specific gene.