uuldika
@uuldika@lemmy.ml
- Comment on Everything is a problem 1 day ago:
No boot sequences
(being annoyingly pedantic) technically there is a boot sequence: the Game Boy logo. on the DMG there’s a little blob of code from 0x0000 to 0x00ff that clears some memory, sets up the screen, reads the logo from the cartridge header and scrolls it. the loader only jumps to the game if that logo is byte-identical to the boot ROM’s own copy (the idea being that unlicensed games could be sued for trademark infringement.)
on the GBC the loader is a little beefier but mostly the same.
t. made a horribly broken FPGA core for the GBC that got just far enough to load the Tetris intro
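for the curious, the logo check amounts to a 48-byte comparison. here’s a minimal sketch in Python rather than SM83 assembly (the `rom` argument and the function name are just illustrative; the header offsets and logo bytes are the well-known ones):

```python
# the 48-byte Nintendo logo bitmap the DMG boot ROM holds internally
NINTENDO_LOGO = bytes([
    0xCE, 0xED, 0x66, 0x66, 0xCC, 0x0D, 0x00, 0x0B,
    0x03, 0x73, 0x00, 0x83, 0x00, 0x0C, 0x00, 0x0D,
    0x00, 0x08, 0x11, 0x1F, 0x88, 0x89, 0x00, 0x0E,
    0xDC, 0xCC, 0x6E, 0xE6, 0xDD, 0xDD, 0xD9, 0x99,
    0xBB, 0xBB, 0x67, 0x63, 0x6E, 0x0E, 0xEC, 0xCC,
    0xDD, 0xDC, 0x99, 0x9F, 0xBB, 0xB9, 0x33, 0x3E,
])

def logo_check_passes(rom: bytes) -> bool:
    # the cartridge header stores its copy of the logo at 0x0104..0x0133;
    # the real boot ROM locks up (never jumps to 0x0100) on a mismatch
    return rom[0x0104:0x0134] == NINTENDO_LOGO
```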
- Comment on Everything is a problem 1 day ago:
everything is still a problem, it’s just a self-inflicted problem! 😁
- Comment on We'd like to welcome our newest Student to Hogwarts, Hun-Gary Mc'Spud. 2 weeks ago:
“Sir” is in her first name, and her last name is also a boy name.
- Comment on We'd like to welcome our newest Student to Hogwarts, Hun-Gary Mc'Spud. 2 weeks ago:
she actually did that for the “trans” character in the Hogwarts game.
Sirona Ryan. I shit you not.
- Comment on We'd like to welcome our newest Student to Hogwarts, Hun-Gary Mc'Spud. 2 weeks ago:
Graham Cracker.
- Comment on Maybe someday 😌 2 weeks ago:
fuck Putin, slava Ukraini!
- Comment on Maybe someday 😌 2 weeks ago:
I definitely didn’t know how heated people were towards .ml when I joined. istg I’m not a tankie.
- Comment on Who turned on friendly fire? 3 weeks ago:
ngl kinda glad Trump banned trans people from the military. I can’t get drafted! 🥲
- Comment on Winner, winner, chicken dinner 3 weeks ago:
save it up and you can spin it into yarn. great for making mittens, hats, scarves and baby sweaters.
- Comment on KM3-230213A 4 weeks ago:
the neutrino’s energy was estimated at 220 PeV. goodness!
- Comment on Finally paid off my Costco hotdog 🙏 4 weeks ago:
the amount of consumer goodwill that statement bought easily pays for any losses from the hotdogs. companies seem hellbent on torching their brand reputation for short-term gains these days.
- Comment on just p-hack it. 4 weeks ago:
if they existed they’d be killer for RL. RL is insanely unstable when the distribution shifts as the policy starts exploring different parts of the state space. you’d think there’d be some clean approach to learning P(Xs|Ys) that can handle continuous shift of the Ys distribution in the training data, but there doesn’t seem to be. just replay buffers and other kludges.
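the replay-buffer kludge in question is basically this (a minimal sketch; the class and transition layout are illustrative, not from any particular RL library): store past transitions and sample uniformly, so gradient updates aren’t drawn only from wherever the policy is currently wandering.

```python
import random
from collections import deque

class ReplayBuffer:
    """Minimal uniform replay buffer sketch."""

    def __init__(self, capacity: int):
        # deque with maxlen silently drops the oldest transitions,
        # so the buffer tracks a sliding window of the data distribution
        self.buffer = deque(maxlen=capacity)

    def push(self, transition):
        # transition is e.g. (state, action, reward, next_state)
        self.buffer.append(transition)

    def sample(self, batch_size: int):
        # uniform sampling decorrelates the batch from the
        # policy's current trajectory
        return random.sample(self.buffer, batch_size)

buf = ReplayBuffer(capacity=1000)
for step in range(50):
    buf.push((step, 0, 1.0, step + 1))  # dummy transitions
batch = buf.sample(8)
```

it helps, but it’s exactly the kind of kludge I mean: it smears the shifting distribution over a window rather than actually modeling the shift.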
- Comment on So um, america just started another war in the middle east. We're going to need a shit ton more memes to distract americans from the nightmare they are enduring. Thanks in advance... 4 weeks ago:
I have no clue what to do. I’m American. I’m trans. I voted for Harris. I live in a blue state. I’ve gone to protests. I’ve talked to state senators about legislation. I’ve canvassed. I’ve talked to Trump supporters and tried to get them to change their minds.
it all feels totally useless. nothing I’ve done has had any effect. it’s like being chained to an anvil and thrown overboard. I’m swimming but I’m still drowning.
- Comment on A cuppa Jill 4 weeks ago:
I’ll just have a cup of rain, thanks.
- Comment on A Completely Natural Conversation in the NYC Reddit 4 weeks ago:
I actually don’t. :) it’s the first thing I disable on my phone. I only capitalize “I” specifically, and proper nouns.
again though, my issue is one with producing the Unicode character of the specific em-dash glyph. maybe some phones automatically convert dashes to em-dashes based on context but I personally wouldn’t know how it’s done.
- Comment on A Completely Natural Conversation in the NYC Reddit 4 weeks ago:
no no, you misunderstand. I’m not talking about the verbal flourish of dashes for interjection - I use those all the time, in this very sentence - I’m specifically talking about producing the specific Unicode character (—) instead of just using the normal ASCII dash (-). the only way I can make the actual em-dash character is by long-pressing. if I type -- or ---, it’s just a sequence of normal dash glyphs for me.
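for the record, these really are three distinct Unicode codepoints, which is the whole point - a quick sketch:

```python
# the three dash characters in question, as Unicode codepoints
hyphen = "-"        # U+002D HYPHEN-MINUS, the plain ASCII dash
en_dash = "\u2013"  # U+2013 EN DASH
em_dash = "\u2014"  # U+2014 EM DASH

for name, ch in [("hyphen", hyphen), ("en dash", en_dash), ("em dash", em_dash)]:
    print(f"{name}: {ch!r} is U+{ord(ch):04X}")

# typing three hyphens is NOT the same string as one em dash
assert hyphen * 3 != em_dash
```

some editors and word processors do auto-substitute `--` into — as you type, which is presumably how most people end up producing it without knowing.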
- Comment on A Completely Natural Conversation in the NYC Reddit 4 weeks ago:
I buy all my groceries from Omega Mart, like a normal person.
- Comment on A Completely Natural Conversation in the NYC Reddit 4 weeks ago:
I use dashes all the time, but em-dashes? I don’t even know how to type those. I guess I could long-press the dash on my phone and select it, but… why?
- Comment on wtf i love capitalism now 5 weeks ago:
love is DLC.
- Comment on [deleted] 5 weeks ago:
I thought the taste bud distribution thing ended up being kinda bad science.
- Comment on Top 10 Most Realistic Photoshops... 5 weeks ago:
it could be a real cat sitting inside a miniature model Italian alley. that was where my thoughts went.
- Comment on Alternatively 5 weeks ago:
She’s a transphobe, disappointingly:
- Comment on Alternatively 5 weeks ago:
I, for one, enjoy not living in a Puritan theocracy.
- Comment on Alternatively 5 weeks ago:
it was an art piece, not a serious product idea. they weren’t pitching it to Trojan, all it purported to do was make a “statement.”
it’s like an OmegaMart product, basically.
- Comment on Anon is the Sandwich Man 1 month ago:
if only all the greentexts were as wholesome as this.
- Comment on What grass starvation does to the perma-online 1 month ago:
I’ve noticed a lot of infantile and absurdly maximalist takes on Lemmy lately. it’s kind of souring me on the project.
- Comment on Looking for the perfect 5 year anniversary gift? 1 month ago:
this really is a model/engine issue though. the Google Search model is unusably weak because it’s designed to run trillions of times per day in milliseconds. even so, repetition this egregious usually means something went mathematically wrong somewhere, like in the SolidGoldMagikarp incident.
think of it this way: language models are trained to find the most likely completion of text. answers like “you should eat 6-8 spiders per day for a healthy diet” are (superficially) likely - there’s a lot of text on the Internet with that pattern. clanging like “a set of knives, a set of knives, …” isn’t likely, mathematically.
last year there was an incident where ChatGPT went haywire. small numerical errors in the computations would snowball, so after a few coherent sentences the model would start sundowning - clanging and rambling and responding with word salad. the problem in that case was bad CUDA kernels. I assume this is something similar, either from bad code or a consequence of whatever evaluation shortcuts they’re taking.
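to make the “unlikely, mathematically” point concrete: one well-known way models fall into clanging is a degenerate decoding loop, where the argmax continuation cycles back on itself and greedy decoding can never escape. a toy sketch (the bigram table is completely made up, not any real model):

```python
# toy bigram "model": maps each token to its single most-likely successor.
# once the chain enters a cycle, greedy (argmax) decoding repeats forever.
bigram_argmax = {
    "a": "set",
    "set": "of",
    "of": "knives,",
    "knives,": "a",  # cycle: most likely continuation loops back to "a"
}

def greedy_decode(start: str, steps: int) -> str:
    out = [start]
    for _ in range(steps):
        out.append(bigram_argmax[out[-1]])
    return " ".join(out)

print(greedy_decode("a", 11))
# → "a set of knives, a set of knives, a set of knives,"
```

a healthy model assigns such loops low probability over long spans, which is why output like this is a red flag that something upstream (kernels, quantization, serving shortcuts) corrupted the numbers.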
- Comment on The audacity 1 month ago:
you’d think a public healthcare system would figure out that treating conditions early is actually cheaper.
- Comment on Game files are verified, House 1 month ago:
that’s disappointing. I didn’t know about that.
- Comment on Game files are verified, House 1 month ago:
even the experienced ones! do you ever watch Linus Tech Tips? it’s so fun watching them rip up their water-cooling loops after realizing they put them together wrong, or trying and failing at different EFI hacks to get weird prototype cards to work, or figuring out why their mystery hardware isn’t booting.