Peanutbjelly
@Peanutbjelly@sopuli.xyz
- Comment on 🐇 🐇 🐇 4 days ago:
Bayesian analysis of complex intelligent systems via Friston's free energy principle and active inference? Or machine learning?
Personally, I love the work circling Michael Levin at Tufts University. I could also imagine there's a lot of unique model building happening in different biological/ecological niches.
- Comment on Microsoft CEO Admits That AI Is Generating Basically No Value 1 week ago:
I think it's a framing issue, and AI development is catching a lot of flak for the general failures of our current socio-economic hierarchy. Also, people have been shouting "superintelligence or bust" for decades now. I just keep watching it improve much faster than most people's estimates, and I understand the implications of that. I do appreciate discouraging idiot business people from shunting AI into everything that doesn't need it, whether because it's a buzzword or because they can use it to exploit something. Some likely just used it as an excuse to fire people, but again, that's not actually the AI's fault; that is this shitty system. I guess my issue is that people keep framing this as "AI bad" instead of "corpos bad."
If the loom had never been invented, we would still live in an oppressive society sliding towards fascism. People tend to miss the forest for the trees when looking at tech tools politically. People are also blind to the environment around a tool, which is often more important than the thing itself. And the loom is still useful.
Compression and polysemy growing your dimensions of understanding in a high-dimensional environment that is itself changing shape, comprehension growing as your blindspots are erased. Collective intelligence, and how diversity helps cover more blindspots. Predictive processing, and how we should embrace a lack of confidence while still understanding the strength of properly weighted predictions, even when a single blindspot can shift the entire landscape, making no framework flawless or perfectly reliable. And understanding that everything we know is just the best map of the territory we've figured out so far. If you want to judge how subtle yet in-our-face blindspots can be, look up how to test your literal blindspot: thirty seconds and a piece of paper with two small dots are enough to show how blind we are to our blindspots. Etc.
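To make the "properly weighted predictions" bit concrete, here is a minimal sketch of precision-weighted belief updating in the predictive-processing sense. This is purely illustrative Python of my own; the function name and the numbers are made up, and real active-inference models are far richer than a single scalar belief.

```python
def update_belief(belief, belief_precision, observation, obs_precision):
    """Combine a prior belief with new evidence, weighting by precision
    (inverse variance). Higher-precision evidence moves the belief more."""
    error = observation - belief                       # prediction error
    weight = obs_precision / (belief_precision + obs_precision)
    new_belief = belief + weight * error                # precision-weighted step
    new_precision = belief_precision + obs_precision    # confidence accumulates
    return new_belief, new_precision

# One overconfident, wrong observation (a "blindspot") can drag the whole
# estimate off course if its precision is overrated.
belief, precision = 0.0, 1.0
for obs, obs_prec in [(1.0, 1.0), (1.2, 1.0), (5.0, 10.0)]:
    belief, precision = update_belief(belief, precision, obs, obs_prec)
    print(f"belief={belief:.2f}, precision={precision:.2f}")
```

The last, overweighted observation makes the same point as above: no weighting scheme saves you from a blindspot you trust too much.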
Rather than fighting the new tools we could use, we need to claim them, and the rest of the world, back from those who ensure that all tools exist only to exploit us.
Am I shouting into the void? Wasting the breath of my digits? Will humanity ever learn to stop acting like dumb, angry monkeys?
- Comment on Microsoft CEO Admits That AI Is Generating Basically No Value 1 week ago:
Let's make another article that completely misrepresents opinions, trajectories, and the general state of things, because we know it'll sell, and it will get the ignorant fighting with those who actually have an idea of what's going on, because they read in an article that AI was eating the pets.
Please seek out media sources that actually try to inform, rather than provoke, or instigate confusion and division through misrepresentation and disinformation.
These days you can't even try to fix a category error introduced by the media without getting cussed out and blocked from the sites where people congregate, because you 'support the evil thing' that the article said was evil and that everyone in the group hates, without even an attempt to understand the context or which part of the thing is actually being discussed.
Also, can we talk more about breaking up the big companies so they don't have a hold on the technology, rather than getting mad at everyone who interacts with modern technology?
It legitimately feels as bad as fighting right-wing misinformation about migrant workers and trans people.
Just make people mad, and teach them that communication is a waste of energy.
We need to learn how to tell who is informing rather than obfuscating, through a track record of accuracy and consensus with other experts from diverse perspectives, not by building tribes around whoever agrees with us. And don't blame experts for not also managing a novel and virtually impossible level of compression when explaining their complex expertise, when you don't even want to learn a word or concept. It's like being asked to describe how cameras work, and then getting called an idiot because some analogy you used can be imagined in a less useful context that doesn't map 1:1 onto the complex subject being summarized. Outside of that, find better sources of information. Fuck this communication-disabling ragebait.
Because now just having a history of rebuking this garbage gets you dismissed, since a history of interacting with the topic on this platform counts as a good enough vibe check to skip any attempt at understanding and interaction.
TL;DR: the articles and conversation on this subject are so generally ill-informed that it hurts, and they are obviously trying to craft environments of angry engagement rather than inform.
Also, I wonder if anyone will actually engage with this topic rather than get angry, cuss me out, and not hear a single thing being communicated.
- Comment on Nick Clegg says asking artists for use permission would ‘kill’ the AI industry 4 weeks ago:
Or maybe the solution is dissolving the socio-economic class hierarchy, which can only exist as an epistemic paperclip maximizer, rather than also kneecapping useful technology.
I feel much of the critique and repulsion comes from people without much knowledge of art and art history, or of AI, or even of the problems and history of socio-economic policy.
Monkeys just want to be angry and throw poop at the things they don't understand. No conversation, no nuance, and no understanding of how such behaviour rolls out the red carpet for continued 'elite' abuses that shape every aspect of our lives.
The revulsion is justified, but misdirected. Stop blaming technology for the problems of the system, and start going after the system that is the problem.
- Comment on ChatGPT is shifting rightwards politically 2 months ago:
That argument was to be had with Apple twenty years ago as they built their walled garden, which intentionally frustrates people into going all-in on Apple. We still can't get anyone to care about dark patterns/deceptive design, or about Disney attacking the creative commons it parasitically grew out of. AI isn't, and has never been, the real issue. It just absorbs all the hate the corpos should be getting as they use it, along with every other tool at their disposal, to slowly fuck us into subservience. Honestly, AI is teaching us the importance of diverse perspectives in intelligent systems, and the dangers of overfitting, both of which exist in our own brains and social/economic systems.
Same issue, different social ecosystem being hoarded by the wealthy.
- Comment on Facial Recognition That Tracks Suspicious Friendliness Is Coming to a Store Near You 7 months ago:
Big fan of AI stuff. Not a fan of this. This definitely won't have issues with minorities and disabled people falling outside the distribution and triggering false positives that enable more harassment of people who already get unfairly harassed.
Let this die with the mind-reading tactics it spawned from.
- Comment on IMF predicts AI will impact 40% of jobs worldwide, exacerbating inequality and social tensions 1 year ago:
The main issue though is the economic system, not the technology.
My hope is that it shakes things up fast enough that they can’t boil the frog, and something actually changes.
Having capable AI is a more blatantly valid excuse to demand a change in economic balance and redistribution. The only alternative would be to destroy all technology and return to monkey. I'd rather we just fix the system, so that technological advancements don't seem negative simply because the wealthy have already hoarded all the gains of every new technology for the past handful of decades.
Such power is discreetly weaponized through propaganda, influence, and economic reorganizing to ensure the equilibrium holds until the world is burned to ash, in sacrifice to the lifestyle of the confidently selfish.
I mean, we could have just rejected the loom. I don't think we'd actually be better off, but I believe some of the technological gain should have been less hoardable by the existing elite. Almost like they used wealth to prevent any gains from slipping away to the poor. Fixing the issue before it got this bad was the proper answer. Now people don't even want to consider that option, or they say it's too difficult, so we should just destroy the loom.
There is a Markov blanket around the self-perpetuating lifestyle of modern aristocrats, obviously capable of surviving every perturbation. Every gain we have made as a society has made that reality more entrenched, entirely because of where new power gets distributed. People are afraid of AI turning into a paperclip maximizer, but that is already what happened to our abstracted social reality. Maxima being maximized and minima being minimized in the complex, chaotic system of billions of people leads to an inevitable accumulation of power and wealth wherever it has already been gathered. Unless we can dissolve the political and social barriers maintaining this trend, we will be stuck with our suffering regardless of whether or not we develop new technology.
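As a rough illustration of "accumulation wherever it has already been gathered," here is a toy yard-sale-style exchange simulation. It is my own sketch, not a model from anything cited here; the parameters are arbitrary, but the qualitative outcome, wealth condensing into few hands even under "fair" coin-flip trades, is the point.

```python
import random

random.seed(0)
wealth = [100.0] * 1000          # everyone starts equal

# Repeated pairwise trades: each trade risks a fixed fraction of the
# poorer party's wealth on a fair coin flip. Even though each trade is
# fair in expectation, the distribution drifts toward concentration.
for _ in range(500_000):
    a, b = random.sample(range(len(wealth)), 2)
    stake = 0.1 * min(wealth[a], wealth[b])
    if random.random() < 0.5:
        wealth[a] += stake; wealth[b] -= stake
    else:
        wealth[a] -= stake; wealth[b] += stake

wealth.sort(reverse=True)
print(f"top 1% share: {sum(wealth[:10]) / sum(wealth):.1%}")
```

No villain is needed inside the simulation for concentration to emerge; add deliberate barriers and lobbying on top, and the trend only hardens.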
Although it doesn't really matter where you are or what system you're in right now; odds are there is a set of rich assholes working as hard as possible to ensure you are kept from any piece of the pie that would destabilize the status quo.
I’m hoping AI is drastic enough that the actual problem isn’t ignored.