CeeBee_Eh
@CeeBee_Eh@lemmy.world
- Comment on So begins the great smart bulb saga! 1 week ago:
Home Assistant is the only/best option
- Comment on Linux hits exactly 2% user share on the October 2024 Steam Survey 1 week ago:
Dunno. I don’t live there.
- Comment on Linux hits exactly 2% user share on the October 2024 Steam Survey 2 weeks ago:
Over 15% market share in India
- Comment on Linux hits exactly 2% user share on the October 2024 Steam Survey 2 weeks ago:
~35 million concurrent active users.
- Comment on Gearbox founder says Epic Games Store hopes were “misplaced or overly optimistic” 2 months ago:
Bad argument.
It would hold water if their solution were proprietary and closed source. But it isn’t, and anyone else, literally anyone, can take Proton and use it in their own project, even for profit.
Even if they closed shop tomorrow, or even just gave up work on Proton itself, we’d all still reap the benefits at no cost to us.
- Comment on Gearbox founder says Epic Games Store hopes were “misplaced or overly optimistic” 2 months ago:
Epic has exclusivity on release
Wait, really? It’s officially off my list now. Screw those guys.
- Comment on Gearbox founder says Epic Games Store hopes were “misplaced or overly optimistic” 2 months ago:
Find me another company that supports open source and Linux the way Valve does… I’ll wait
- Comment on Gearbox founder says Epic Games Store hopes were “misplaced or overly optimistic” 2 months ago:
No digital game store is worth your loyalty.
When that store is run by a company that contributes massively to open source, and that works harder and puts more money into enabling alternate gaming platforms than all other companies combined, then yes, they have my loyalty.
- Comment on Gearbox founder says Epic Games Store hopes were “misplaced or overly optimistic” 2 months ago:
I would love to see reasonable competition to steam which would give consumers and developers better options
No one’s going to compete with and outdo Steam with Linux support.
- Comment on What fresh fuckery is this? 2 months ago:
+1 for Grayjay
- Comment on What fresh fuckery is this? 2 months ago:
Install Grayjay
- Comment on What fresh fuckery is this? 2 months ago:
They’re already talking about breaking up Google/Alphabet
- Comment on the final boss after you clear Donald Knuth 3 months ago:
I heard a first earther recently say it as: pe-tha-gore-ian
- Comment on Awnings: a simple cooling tech we apparently forgot about 3 months ago:
I just got awnings installed two months ago on the windows that get sun for most of the day. It dropped the temps in those rooms by almost 8 degrees Celsius on hot days. The AC even runs less during the day now.
They’re simple retractable awnings that a local guy installed for me. I used to hate the idea of awnings, but once the thought clicked that sunlight passing through the glass gets trapped inside as IR heat, blocking it before it reaches the window suddenly seemed brilliant.
- Comment on Can't argue with that logic 4 months ago:
They don’t
- Comment on answer = sum(n) / len(n) 4 months ago:
The difference with people is that our brains are continuously learning, while LLMs are static-state models after being trained. To take your example about brute-forcing more data: we’ve been doing that since the second we were born. Every moment of every second we’ve had sound, light, taste, noises, feelings, etc., bombarding us nonstop. And our brains have astonishing storage capacity. AND our neurons function as both memory and processor (a holy grail in computing).
Sure, we have a ton of advantages on the hardware/wetware side of things. Okay, and technically on the data side also, but the idea that we learn from fewer examples isn’t exactly right. Even a five-year-old child has “trained” far longer than probably all major LLMs in use right now combined.
- Comment on answer = sum(n) / len(n) 4 months ago:
But they are. There’s no feedback loop or continuous training happening. Once an instance or conversation is done, all that context is gone; the context is never integrated into the model as it happens. Our brains, by contrast, work more or less that way: every stimulus, every thought, every sensation, every idea is added to the brain’s model as it happens.
- Comment on answer = sum(n) / len(n) 4 months ago:
The big difference between people and LLMs is that an LLM is static. It goes through a learning (training) phase as a singular event, and going forward it’s locked into that state with no additional learning.
A person is constantly learning. Every moment of every second we have a ton of input feeding into our brains as well as a feedback loop within the mind itself. This creates an incredibly unique system that has never yet been replicated by computers. It makes our brains a dynamic engine as opposed to the static and locked state of an LLM.
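The static-vs-dynamic contrast above can be sketched in a few lines of code. This is a toy illustration, not how any real LLM works: `FrozenModel` stands in for a trained-once model whose weights never change at inference time, while `OnlineLearner` (a made-up name for this sketch) updates its single weight on every new example, loosely like continuous learning.

```python
class FrozenModel:
    """Trained once; weights are fixed afterwards (like a deployed LLM)."""

    def __init__(self, weight):
        self.weight = weight  # set during the one-time training phase

    def predict(self, x):
        return self.weight * x  # inference never modifies self.weight


class OnlineLearner:
    """Updates its weight after every observation (loosely, like a brain)."""

    def __init__(self, weight, lr=0.1):
        self.weight = weight
        self.lr = lr

    def predict(self, x):
        return self.weight * x

    def observe(self, x, target):
        # One gradient step on squared error: learning never stops.
        error = self.predict(x) - target
        self.weight -= self.lr * error * x


frozen = FrozenModel(weight=1.0)
online = OnlineLearner(weight=1.0)

# A stream of examples where the true relationship is y = 2x.
for x, y in [(1.0, 2.0), (2.0, 4.0), (3.0, 6.0)] * 20:
    frozen.predict(x)    # frozen: weight stays 1.0 no matter what it sees
    online.observe(x, y)  # online: weight drifts toward 2.0 over time

print(frozen.weight)             # still 1.0
print(round(online.weight, 2))   # close to 2.0
```

The point of the sketch is only the asymmetry: the frozen model can be queried forever without ever getting better, while the online learner folds every interaction back into itself, which is the feedback loop the comments above say LLMs lack.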