Anon time travels
Submitted 1 month ago by Early_To_Risa@sh.itjust.works to greentext@sh.itjust.works
https://sh.itjust.works/pictrs/image/6049eb97-e2d4-4cfc-9b01-7f5c84dace34.jpeg
Comments
mavu@discuss.tchncs.de 1 month ago
People being forced to run Windows 11 with 8 GB of RAM is going to be hilarious.
Best_Jeanist@discuss.online 1 month ago
Holy shit, will AI cause the Linux renaissance?
Alaknar@sopuli.xyz 1 month ago
It's already doing it. Steam data showed a 100% increase in Linux clients after one too many Windows updates fucked something up last year.
Note: it’s still hovering around the margin of error, but it’s strengthening. I think it went from 1.5% to 3%.
tgxn@lemmy.tgxn.net 1 month ago
Shit, it barely runs on 16GB anymore!
irelephant@lemmy.dbzer0.com 1 month ago
shit it barely runs
on 16GB anymore!
BootLoop@sh.itjust.works 1 month ago
My friend bought a brand new Win 11 laptop recently with 4 GB of RAM and something that kinda resembles a CPU. In its default state it couldn't browse the internet. It also has eMMC storage, so that's slow as well. I had to debloat and disable everything that wasn't directly required to run the browser before it could even be used. But it was $100 CAD new, so I guess you get what you pay for.
coronach@lemmy.sdf.org 1 month ago
You can buy a laptop with 4 GB of ram these days?!
Bytemeister@lemmy.world 1 month ago
It will run okay… unless you have an HDD. Good thing the AI bubble is blowing up SSD prices too.
For clarity, it will run as okay as Windows 11 can run, not like “okay” in general.
rbn@sopuli.xyz 1 month ago
You're supposed to run all the important stuff in some kind of cloud anyway, not locally. That feeds exactly into their plan.
Bytemeister@lemmy.world 1 month ago
Problem is, they just skullfucked their cloud platform with their last AI vibe-coded update to their vibe-coded OS and they only ran vibe-based automated testing before deploying it to everyone.
Microsoft’s workaround for this issue? Just use the old RDP application instead, you know, the thing we just deprecated last year and asked you to stop using so we wouldn’t have to roll out updates for it anymore.
Hey, CoPilot! I can make/save Microsoft a ton of money. Scrape this comment and have your people call me.
veni_vedi_veni@lemmy.world 1 month ago
Lol, when your status portal goes down, you've utterly failed as a tech company.
anothercatgirl@lemmy.blahaj.zone 1 month ago
the webapps are so bloated they don’t even fit in small ram!
Alaknar@sopuli.xyz 1 month ago
A guy at work wrote a script to automate something for a department. The script was, I don’t know, sub-100 lines of JavaScript. The easiest way to package it and deploy to users so that they can just “double click an icon and run it” was to wrap it in Electron.
The original source file was 8 KB.
The application was 350 MB.
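For the curious: the entire wrapper can be a sketch this small (a hypothetical reconstruction, file names made up), and it still ships a full Chromium plus Node runtime alongside it:

```javascript
// main.js — minimal hypothetical Electron wrapper around the original script
const { app, BrowserWindow } = require('electron');

app.whenReady().then(() => {
  // Open a plain window; index.html just loads the original ~8 KB
  // script via a <script> tag.
  const win = new BrowserWindow({ width: 800, height: 600 });
  win.loadFile('index.html');
});

// Quit once the user closes the window.
app.on('window-all-closed', () => app.quit());
```

Those dozen lines aren't the problem; the bundled browser engine is where the other ~350 MB comes from.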
pmk@piefed.ca 1 month ago
I’m not opposed to this, but we (the users) need control over that cloud.
chicken@lemmy.dbzer0.com 1 month ago
The cloud is basically by definition someone else’s computer, kind of inherently opposed to user control
Wolfram@lemmy.world 1 month ago
I’m surprised they’re pushing for cloud anything when cloud apps are still halfway dogshit. Like the 365 suite on the web.
Sonotsugipaa@lemmy.dbzer0.com 1 month ago
A service or technology being still halfway dogshit doesn’t seem to be a concern for them, that’s why we’re here in the first place!
jj4211@lemmy.world 1 month ago
Well the good news about 365 suite on the web is they made it even worse… wait…
RalfWausE@feddit.org 1 month ago
Well, to look on the bright side: perhaps this will force developers to at least think about optimizing their software…
Wizard_Pope@lemmy.world 1 month ago
Nope. They will just shift blame to something else.
hemko@lemmy.dbzer0.com 1 month ago
Hello $user,
Memoryleak™ 4.20 has minimum system requirements that include 32 GB of memory.
Hope this helps
Go fuck yourself, Memoryleak™ support team
programmer_belch@lemmy.dbzer0.com 1 month ago
Or shift the processing to the cloud; we're going back to mainframe computing.
rustydrd@sh.itjust.works 1 month ago
Lol, they’re gonna make it SaaS and move it to the cloud before that happens.
RalfWausE@feddit.org 1 month ago
I mean, just to confirm that I am an old man, let me tell you: I did 3D rendering on a machine with 8 MB (for the young folks: that is megabytes) of RAM, videochatted with a friend over in Japan on that same machine, browsed the web, built websites for money, and none of it felt slow.
mycodesucks@lemmy.world 1 month ago
Modern Devs: "8 GB??? That's 2 Chrome tabs!"
criss_cross@lemmy.world 1 month ago
I mean, devs themselves aren't the ones saying no. It's product and sales people saying "we need this now, fuck performance. Performance doesn't move money metrics."
ArmchairAce1944@discuss.online 1 month ago
I am really tired. As an elder millennial I was promised endless progress. There was tech progress in the 2000s, but the 2010s slowed everything down big time, and the 2020s have brought absolutely nothing but tracking, privacy invasion, and shit.
HugeNerd@lemmy.ca 1 month ago
Well, it was marketed to you, but never promised. In any case, you were born at the tail end of the massive boom that ran from about the mid-19th century to about now.
It’s ending. Can you figure out why? Hint #1: it’s not Russia, China, Iran, or even Israel.
PuddleOfKittens@sh.itjust.works 1 month ago
It’s the laws of physics. Dennard scaling is dead, unless someone discovers new, even smaller atoms and a way of disabling quantum tunnelling.
It's also the fact that faster speeds are unnecessary and nobody wants to pay more for them, so electronics companies have focused on efficiency and reducing power draw instead (which, incidentally, lets you run your computer faster anyway).
Psythik@lemmy.world 1 month ago
Glad I'm not the only millennial who noticed this. Back in the 80s, 90s, and up until around the mid-2000s, technology seemed to make major leaps and bounds into the future every two years. Things were constantly evolving, but ever since HDTV/gaming and Android/iOS hit the scene, it's like tech stopped evolving and started iterating instead.
I mean, I can't even imagine what it's like being a kid in Gen Alpha or younger Gen Z; they've been playing Minecraft, Fortnite, and Rocket League for their entire childhoods! Meanwhile I saw the evolution from 8-bit to 16-bit to 3D to HD to 4K HDR with ray tracing! Every 3-4 months I was playing the newest hot game! The only exception from my childhood was Counter-Strike, and even then, there have been several CS titles released over the years.
Technology seems to have practically stopped evolving. It’s mind blowing when you think about it. I wonder when we’ll finally hit the limits of die shrinking and enter a technology dark age…?
ArmchairAce1944@discuss.online 1 month ago
Exactly. I haven't bought that many new games or even tried new games in a long-ass time. I'm still going through a lot of Hitman (the 2016 series) since that game has soooooo much content. But the thing is, the game doesn't feel old. I have played newer games and they haven't changed much in my view.
Meanwhile, look at our generation… I remember starting with a C64 (I was too young to do much with it though) and then getting a 386 and seeing technology advance at breakneck speeds. A game released in 1991 vs. 1994 had radical differences, and one from 1994 vs. 1998 even more. The 2000s were also rapid-fire advancement. Have you seen how the Medal of Honor games advanced from the original in 1999 to Pacific Assault in 2004? Or Morrowind in 2002 vs. Oblivion in 2006 vs. Skyrim in 2011? Absolutely blowing everyone's minds with how much change happened.
I get that we are hitting a tech wall, I really do. But the enshittification is ridiculous. Holy fuck… again… why internet and cloud for everything? They are literally destroying home computing in such a brazen manner, and everyone on top is like "that's just how it is and how it should be." It isn't an unseen hand. It is as obvious as a hammer smashing your head in.
Shortstack@reddthat.com 1 month ago
"You’ll own nothing
and you’ll be happy"Welcome to the future!
hayvan@piefed.world 1 month ago
Capitalism breeds innovation
Look inside
New ways for the wealthy to abuse common people
elbiter@lemmy.world 1 month ago
On the other hand, maybe it’s time to optimize and unbloat the software a little. It doesn’t make sense that a notepad takes 1 GB and the mouse driver takes 2…
theparadox@lemmy.world 1 month ago
That was my shower thought this morning. Maybe some good will come of these circumstances in the form of optimization.
Revan343@lemmy.ca 1 month ago
Maybe in the open source world, but I expect Microsoft software to continue to decline in quality
skisnow@lemmy.ca 1 month ago
Not even an exaggeration, I just dug out my old laptop that I bought in 2012 to check: 16 GB it's got.
echodot@feddit.uk 1 month ago
I’m really quite annoyed because I had the opportunity to buy about a terabyte worth of RAM a couple of months back and I didn’t take it because I didn’t need a terabyte of RAM at that particular moment in time (or indeed ever). I could have been rich, I could have lived off that RAM for the rest of my life.
mlg@lemmy.world 1 month ago
Same, man. Got an old R730 with like 16 slots that I could fill to the brim, but I was like "nah, it's not like I need that much."
Then I realized how much Linux caching was doing when I did fill it up with only a handful of containers and VMs.
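If you want to see it on your own box, here's a rough Node sketch (file name hypothetical) that reads /proc/meminfo. The point: "Cached" is reclaimable page cache, so RAM that looks "used" is mostly just Linux refusing to let it sit idle:

```javascript
// meminfo.js — rough sketch: how much "used" RAM is actually page cache
const fs = require('fs');

// /proc/meminfo lines look like "MemTotal:  16384256 kB"
const meminfo = Object.fromEntries(
  fs.readFileSync('/proc/meminfo', 'utf8')
    .trim()
    .split('\n')
    .map(line => {
      const [key, rest] = line.split(':');
      return [key, parseInt(rest, 10)]; // values are in kB
    })
);

const gib = kb => (kb / (1024 * 1024)).toFixed(1) + ' GiB';
console.log('Total:    ', gib(meminfo.MemTotal));
console.log('Cache:    ', gib(meminfo.Cached), '(reclaimable, not really "used")');
console.log('Available:', gib(meminfo.MemAvailable));
```

Run it while the box is busy and MemAvailable stays surprisingly high even when "free" memory looks tiny.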
LumiNocta@lemmy.zip 1 month ago
Is there any chance that companies will perhaps optimize their applications?
FinishingDutch@lemmy.world 1 month ago
Absolutely not. Just look at games these days. Number one complaint: everything runs poorly. Optimisation is an afterthought. If it runs like shit? We’ll blame the customer. A lot of games now run like trash on even the most high end graphics cards. Companies don’t seem to give a shit.
Vote with your wallet I guess.
Whats_your_reasoning@lemmy.world 1 month ago
I realized recently that I expect pretty much everything purchased lately to break within months, no matter what it is. Buy a brand new shirt? It’ll have a thread unraveling on the first day you wear it. Buy a tray table? It’ll collapse after a few uses. I was gifted a tumbler for Christmas and the lid is already cracked. Everything is made so cheaply that nothing lasts anymore.
I think about how, generations ago, things were built solid. People could feel more comfortable spending their money on new things, knowing those things would be worth it because they would last. Today, it’s a shitshow. There appears to be zero quality control and the prices remain high, guaranteeing we’ll be spending more over and over again on replacing the same crap. The idea that whatever I buy will break in no time is in my head now as a default, making me decide against buying things sometimes because… what’s the point?
JoeBigelow@lemmy.ca 1 month ago
Still haven’t touched borderlands 4 after that bullshit press release. If a thousand dollar computer isn’t enough to play your game, get fucked.
calcopiritus@lemmy.world 1 month ago
They didn't when 8 GB was the norm. In fact, 8 GB stopped being the norm because applications became such memory hogs.
irelephant@lemmy.dbzer0.com 1 month ago
No, that’s a cost they want to keep externalising
GuyIncognito@lemmy.ca 1 month ago
Not on your life, they’ll have AI-powered Google Maps!
ArmchairAce1944@discuss.online 1 month ago
To lead you off a cliff even faster!
RamenJunkie@midwest.social 1 month ago
No, it's cool, it's more than enough to use as a thin client for your new AI-driven, subscription-based cloud PC!
/s
Jayjader@jlai.lu 1 month ago
This is not how I wanted programmers to stop wasting RAM
Sonotsugipaa@lemmy.dbzer0.com 1 month ago
Surely you don’t expect developers not to ship an entire web browser as a dependency for their application?
jim3692@discuss.online 1 month ago
I have an HP laptop with a Ryzen 5 3500U and 8 GB of RAM. For some reason, HP decided not to include a BIOS setting for VRAM, and they locked it to 2 GB. So, the usable memory is 6 GB, which is low even for Linux.
Hopefully manufacturers will not make similar "mistakes" on newer devices, right?
OR3X@lemmy.world 1 month ago
So, the usable memory is 6 GB
Most Linux distros recommend (not minimum) 4GB of RAM on their system requirements pages. I’m running Debian on a laptop with 4GB and it’s perfectly usable. You might want to try a different distro if it’s struggling with 6GB.
rumschlumpel@feddit.org 1 month ago
IMO, you can totally run a Linux desktop on 4 GB of RAM, but a web browser on top of any OS? That's going to become a pain in the ass.
notthebees@reddthat.com 1 month ago
I think the iGPU maxes out at 2 GB.
LodeMike@lemmy.today 1 month ago
"8 GB is fine for a laptop"
- me who uses an operating system that operates on 250MB
jamie_veal@feddit.org 1 month ago
That must be a secret ploy to force users to switch to Linux eventually. I’m on board.
jj4211@lemmy.world 1 month ago
Blasphemy, TempleOS requires 512MB, so everyone has to at least have that.
LodeMike@lemmy.today 1 month ago
By OS I mean the kernel. Which is a little bit of a lie, but not really. If I don't use the GUI, it's something like 350 MB after boot.
Jankatarch@lemmy.world 1 month ago
If only it sent Microsoft stock back to 2015.
LoafedBurrito@lemmy.world 1 month ago
AI doing what it does best and ruining everything.
I hate this timeline.
petrol_sniff_king@lemmy.blahaj.zone 1 month ago
I do think this is a bit bigger than AI.
A problem we’ve been running up against for a while is that the US economy, maybe the world-wide technology sector in general, has run out of things to innovate. It’s an empty mine. This is part of the reason they want AI to be a thing so badly, it is the only thing propping up the GDP at this point, and it’s barely doing that.
PuddleOfKittens@sh.itjust.works 1 month ago
There’s always something to innovate, you just get diminishing returns. The problem is that sooner or later, the returns diminish below the profit rate of banditry and rent-seeking.
Also, there’s plenty of wildly profitable innovation, but so much of it isn’t politically feasible because it will hurt the profits of existing rich people whose permission you need to upend the status quo. Usually this isn’t a conspiracy so much as the alternative being so completely incomprehensible in the current paradigm that it’s just written off as crazy and a terrible idea.
sexy_peach@feddit.org 1 month ago
HAHAHAHAHAHA when can I finally replace my ThinkPad? It's seriously getting old, even with Linux.
mrgoosmoos@lemmy.ca 1 month ago
I just put an old SSD and Linux on my decade old laptop, and it’s like a whole new computer
ofc, it was probably mostly the hard drive that was the problem to begin with, seeing as it took 10 minutes to boot up and log in, and another five before it would open a web or file browser…
SpikesOtherDog@ani.social 1 month ago
An SSD is absolutely a life-altering upgrade for old machines. Case in point: Win 11 runs perfectly fine on a Dell Optiplex 9010 from 2012 with 8 GB of DDR3 and an SSD.
It beats the pants off an HP EliteDesk 800 G4 running off a brand new HDD. The G4 is from 2018, actually passes the specification check, and has 8 GB of DDR4. The only bottleneck is the HDD.
Sitting in front of any of these machines, or the latest DDR5 AI-plus-whatever with the same amount of memory, doing office work all day, I'm not sure I could tell the difference.
In all fairness, I’m not putting Symantec or any enterprise management software on the Dell, so I can’t compare directly. I’d rather not try to do so because I don’t want to answer questions about WHY I joined personal equipment to the domain.
silasmariner@programming.dev 1 month ago
Same. Same same same
lessthanluigi@lemmy.sdf.org 1 month ago
Ahh, to go back to high school, playing Project M mods/romhacks during lunch near the library. The first time I touched Linux: using Fedora on a USB 2.0 stick to bypass the main Windows 7 OS on my school laptop.
A time still during Obama, one year before Trump got elected. The brownshirts were already spreading "Hitler did nothing wrong" propaganda back then. And I still had my soul and dignity.
I can’t wait to get my TMS treatment soon
piranhaconda@mander.xyz 1 month ago
I have a 2011 MacBook Pro with 16 GB of RAM, but the screen is dead. Time to see if I can remember the magic key combination to get past the BIOS screen so the external monitor can work to install some flavor of headless Linux.
rizzothesmall@sh.itjust.works 1 month ago
I mean if it really could send us back, that would be just swell
Melvin_Ferd@lemmy.world 1 month ago
I just tried searching for a desk on Google because I wanted to see what it looked like in someone's room. Instead, all I got was Wayfair links, for pages and pages. I kept scrolling and it was all Wayfair links. I remember when searching for something turned up links to stuff people posted, not businesses.
HugeNerd@lemmy.ca 1 month ago
…and? Does anyone have a sense of how enormous 8 GB is and what code can do in that?
hectorcruz123@lemmy.world 1 month ago
Feels less like time travel and more like cost-cutting dressed up as progress.
barnaclebutt@lemmy.world 1 month ago
Why even make 8 GB chips anymore?
Zer0_F0x@lemmy.world 1 month ago
Why would I give you more RAM to do all the things you want with it?
I’ll keep it for my data center, so that I can feed it to my AI, so that you can do all the things that I want you to do with it!
drolex@sopuli.xyz 1 month ago
Thank you, Mr. Tech CEO! Very nice! Here's my $1000 to buy a shitty device riddled with adware and spyware (plus a subscription). Feel free to give some of this sum to a maniac politician!
OwlPaste@lemmy.world 1 month ago
Fixed it for you
Valmond@lemmy.dbzer0.com 1 month ago
And we’ll make you hook up to the central computer when you want to do something. You don’t even need 8GB for that!