kayohtie
@kayohtie@pawb.social
- Comment on Is it just me or does this look more appetizing than a watermelon? 15 hours ago:
If it tasted like a kiwi I’d be all over it. Ripe kiwi are amazing.
- Comment on It's Not Being Anti-Abortion. It's Being Poor Choice 15 hours ago:
“I’m pro-life”
looks inside
Dead
- Comment on Chatbots can be manipulated through flattery and peer pressure 1 week ago:
Birds string together words they hear when they can repeat them, and end up with the short phrases they seem to make. It’s extremely rare for them to actually understand meaning; most often it’s simple association, which is why you often get nonsensical responses that still sort of make sense, or sounds out of them that resemble words but just…aren’t. The simulacrum of language without containing any. Words can get linked that way, and our own brain wants to find words, so sometimes we decipher some, similar to seeing shapes in noise – we just tend to realize that it’s actually just pattern recognition, not real.
What’s happening here is the equivalent of recording a bird and playing its own recording back to it to get a new response, as a chain. It’s predictive text feeding itself – a simplistic but not inaccurate description given how language models actually work at a technical level: tokenizing the input to train matrices of language vectors containing word fragments, which often loop back on themselves or into yet more matrices of options. This is the “beam size” option some models expose when run, selecting how many candidate continuations to explore simultaneously, keeping the ones whose probability values make sense.
Our own reasoning is far more complicated. Sometimes we think it’s just words, but our brain will seamlessly weave inner monologue into concepts and imagery or ideas without text and back again, sometimes into sounds or other things. We stitch together everything so seamlessly because it all actually has meaning for us.
LLMs having “reasoning” at all presumes the Sapir-Whorf hypothesis, which would imply there is no reasoning without language. And even animals can fucking reason without language. We absolutely did too. Sapir-Whorf was an infantile thought experiment turned theory of language that’s been patently proven wrong, even if it makes for great sci-fi (see Arrival).
This isn’t the difference between hearing a song live and played aloud, or midi/samples vs instruments. This is that part of our consciousness operates in some absolute wild ways that we can still only classify at a high level because the complexities are so far beyond what we can describe with models that, by comparison, are simplistic as hell.
Put another way, without transcriptions of “that’s right, the square hole”, if you showed two photos to a model and asked “where does this piece go” it’s just going to “see” the shape in both, recognize the image->word mapping and come up with a response fitting that, without ever being able to “realize” it can go into the square hole without being prompted, because it can’t invent.
Only parrot.
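The “beam size” mechanism mentioned above can be sketched with a toy next-token model. Everything here – the vocabulary, the probabilities, the `MODEL` table – is invented purely for illustration; real models score continuations with a neural network rather than a lookup table, but the pruning logic is the same idea:

```python
import math

# Toy next-token probabilities: each token maps to (next_token, prob) pairs.
# These values are made up for illustration only.
MODEL = {
    "the":    [("bird", 0.6), ("square", 0.4)],
    "bird":   [("sings", 0.7), ("talks", 0.3)],
    "square": [("hole", 0.9), ("peg", 0.1)],
    "sings":  [("<end>", 1.0)],
    "talks":  [("<end>", 1.0)],
    "hole":   [("<end>", 1.0)],
    "peg":    [("<end>", 1.0)],
}

def beam_search(start, beam_size=2, max_len=4):
    # Each beam is (log_prob, token_sequence); keep the best `beam_size`.
    beams = [(0.0, [start])]
    for _ in range(max_len):
        candidates = []
        for score, seq in beams:
            last = seq[-1]
            if last == "<end>" or last not in MODEL:
                # Finished sequences carry forward unchanged.
                candidates.append((score, seq))
                continue
            for tok, p in MODEL[last]:
                candidates.append((score + math.log(p), seq + [tok]))
        # Prune to the top `beam_size` hypotheses by total log-probability.
        beams = sorted(candidates, key=lambda b: b[0], reverse=True)[:beam_size]
    return [(math.exp(s), " ".join(seq)) for s, seq in beams]

for prob, sentence in beam_search("the"):
    print(f"{prob:.2f}  {sentence}")
```

With `beam_size=2`, both “the bird sings” and “the square hole” survive the pruning; with `beam_size=1` it degrades to greedy decoding and only the single most probable chain is ever explored.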
- Comment on FFmpeg 8 can subtitle your videos on the fly with Whisper 1 week ago:
But much more loosely, and not nearly as heavily. It was more of a seldom-used term suggesting it might be something like what machine learning actually was.
Now everything is being called that, heavily and forcefully, by corporations that started using it for capitalistic hype reasons. Hence the push for a strong distinction between a field of mathematical algorithms that’s been around for quite a while in a variety of types, and the lazy “just one more prompt bro” and “we can replace workers” slop. Even DLSS wasn’t called “AI” until the hype train started, and now Jensen Huang can’t call it that often enough, lest he be unable to afford yet another leather jacket, as if they’re disposable glasses wipes.
- Comment on FFmpeg 8 can subtitle your videos on the fly with Whisper 1 week ago:
It’s not AI; it’s neural network models, in the same way voice recognition in devices has been working for over a decade. Even Dragon has been utilizing language-model vectors for a very long time, just requiring voice training instead of a premade research or open-source dataset.
I hate generative AI and its slop too, but getting angry about neural network models in general is not only absurd, it plays exactly into what corporations want – conflation of the underlying basic technology with the capitalistic vampirism of art.
- Comment on Shit like this is why we need open source printers! 3 weeks ago:
TIL! Thank you for the added detail, I hadn’t read the full write up but had watched his presentation in English and it was wild to hear presented.
- Comment on Shit like this is why we need open source printers! 3 weeks ago:
You left out what I feel is the best part: even in the “uncompressed” mode, even when that was disabled, it was still happening sometimes.
- Comment on Florida ounces 4 weeks ago:
Ain’t called the “imperial” system for nothing.
Brits gave it to us and then we changed the spellings and decided it made us special I guess.
So, partial credit where partial credit is due. It’s just way the fuck older.
- Comment on After laying off 9,000 employees, Microsoft records $27.2 billion profit in latest quarter 1 month ago:
I’m getting the same vibes as when I spike taxes in Sim City for Palm OS just before the year rolls over and then drop it back after.
- Comment on Where are all the successful "red cities"? 1 month ago:
As someone living in a red state, every city here is referred to as a “blue city”, whether people are moving to it or not. State color has shit all to do with it and tends to come down to the assumption that land votes.
- Comment on Eksbawks 1 month ago:
I could just remap in the emulator too, just hadn’t felt up to such.
- Comment on I 🖤 LaTeX 1 month ago:
That’s how you can tell if someone is into latex (kink), they don’t feel comfortable calling LaTeX (tech) by the same pronunciation around people.
- Comment on Real milk proteins, no cows: Engineered bacteria pave the way for vegan cheese and yogurt 1 month ago:
A local friend has the casein allergy. It sucks, cause there’s a lot of baked goods and food in general she can’t have. I struggle sometimes in baking because even butter contains it, and plant butters never quite perfectly match real butter.
At least plant-based heavy cream works well enough. Lacks the dairy taste but it still works and whips nicely even in a cream whipper.
- Comment on Eksbawks 1 month ago:
I find myself having to use my Switch pro controller when playing Wii U games on PC, because if I use my Xbox controller I get fucked up on button locations. But somehow holding the slightly different controller shape and button size makes my muscle memory switch for it cleanly.
- Comment on 7,818 titles on Steam disclose generative AI usage, or 7% of Steam's total library of 114,126 games, up from ~1,000 titles in April 2024 1 month ago:
Open-source training texts intended for pairing with your intended style of output have been around for far longer than OpenAI has been grifting data from the entire Internet and collected book works. It came across like that’s what they’re using, not some shit off HuffingFarce that was built off of AO3 and Harry Potter.
- Comment on [deleted] 1 month ago:
This is the reality of what Sapir-Whorf was guessing at. The way it’s defined is incorrect, IIRC, but I think the real heart of it stemmed from this kind of distinction.
The fact that people think it’s normal and don’t realize it’s not, especially once they get older and become simply unwilling to think otherwise…yeah.