Nvidia Announces RTX 50 Blackwell Series Graphics Cards: RTX 5090 ($1999), RTX 5080 ($999), RTX 5070 Ti ($749), RTX 5070 ($549)
Submitted 1 month ago by simple@lemm.ee to games@lemmy.world
https://www.theverge.com/2025/1/6/24337396/nvidia-rtx-5080-5090-5070-ti-5070-price-release-date
Comments
KamikazeRusher@lemm.ee 1 month ago
Maybe I’m stuck in the last decade, but these prices seem insane. I know we’ve yet to see what a 5050 (lol) or 5060 would be capable of, or at what price point. But launching at $549 for your cheapest card means a significant portion of the consumer base won’t be able to afford any of these.
Stovetop@lemmy.world 1 month ago
Sadly I think this is the new normal. You could buy a decent GPU, or you could buy an entire game console. Unless you have some other reason to need a strong PC, it just doesn’t seem worth the investment.
At least Intel are trying to keep their prices low.
GoodEye8@lemm.ee 1 month ago
Actually AMD has said they’re ditching their high-end options and will also focus on budget and midrange cards. AMD has also promised better raytracing performance (compared to their older cards), so I don’t think it will be the new norm if AMD also prices their cards competitively with Intel. The high-end cards will be overpriced, since the target audience apparently doesn’t care that they’re paying a shitton of money. But the budget and midrange segments might slip away from Nvidia and get cheaper, especially if the upscaler crutch breaks and devs have to start doing actual optimization for their games.
MDCCCLV@lemmy.ca 1 month ago
As always, buying a used previous gen flagship is the best value.
simple@lemm.ee 1 month ago
They’ll sell out anyways due to lack of good competition. Intel is getting there but still has driver issues, and AMD hasn’t announced their GPU prices yet, but their entire strategy is following Nvidia and lowering the price by 10% or something.
TonyOstrich@lemmy.world 1 month ago
Weird completely unrelated question. Do you have any idea why you write “Anyway” as “Anyways”?
It’s not just you, it’s a lot of people, but unlike most grammar/word modifications it doesn’t really make sense to me. Most of the time the modification shortens the word in some way rather than lengthening it. I could be wrong, but I don’t remember people writing or saying “anyway” with an added “s” in any way but ironically 10-15 years ago, and I’m curious where it may be coming from.
sturmblast@lemmy.world 1 month ago
AMD is the competition.
tburkhol@lemmy.world 1 month ago
So much of nvidia’s revenue is now datacenters, I wonder if they even care about consumer sales. Like their consumer level cards are more of an advertising afterthought than actual products.
MDCCCLV@lemmy.ca 1 month ago
You have to keep inflation in mind: $550 now would be about $450 in 2019 dollars.
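That conversion is just deflating by the cumulative CPI ratio; a back-of-envelope sketch (the ~22% cumulative US inflation figure for 2019-2024 is my assumption, not from the thread):

```python
# Rough inflation adjustment: deflate a current price into 2019 dollars.
# Assumes ~22% cumulative US CPI inflation from 2019 to late 2024 (approximate).
CUMULATIVE_INFLATION = 1.22

def to_2019_dollars(price_today: float) -> float:
    """Deflate a nominal price back to 2019 purchasing power."""
    return price_today / CUMULATIVE_INFLATION

print(round(to_2019_dollars(549)))   # the $549 RTX 5070 is roughly $450 in 2019 dollars
print(round(to_2019_dollars(1999)))  # the $1999 RTX 5090 is still ~$1639 in 2019 dollars
```

Even deflated, the 5090 has no historical analogue at the old flagship price points, which matches the "all but the 5090" observation below.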
KamikazeRusher@lemm.ee 1 month ago
Yeah, I keep forgetting how much time has passed.
Bought my first GPU, an R9 Fury X, for MSRP when it launched. The R9 300 series and GTX 900 series seemed fairly priced then (aside from the Titan X). Bought another for Crossfire and mining, holding on until I upgraded to a 7800 XT.
Comparing prices, all but the 5090 are within $150 of each other when accounting for inflation. The 5090 is stupid expensive. A $150 increase in price over a 10-year period probably isn’t that bad.
I’m still gonna complain about it and embrace my inner “old man yells at prices” though.
Strider@lemmy.world 1 month ago
Don’t forget to mention the huge wattage.
More performance for me is more identical fps at the same amount of power.
geneva_convenience@lemmy.ml 1 month ago
By rendering only 25% of the frames we made DLSS4 100% faster than DLSS3. Which only renders 50% of the frames! - NVIDIA unironically
ZeroHora@lemmy.ml 1 month ago
You’re living in the past. Rendering 100% of the frames is called Brute Force Rendering; that’s for losers.
For only 2k trump coins our new graphics cards can run Cyberpunk 2077, a game from 4 years ago, at 30 fps with RTX ON, but you see, with DLSS and all the other crapmagic we can run it at 280 FPS!!! Everything is blurry and ugly as fuck, but look at the numbers!!!
Blackmist@feddit.uk 1 month ago
They’re not even pretending to be affordable any more.
merthyr1831@lemmy.ml 1 month ago
Nvidia is just doing what every monopoly does, and AMD is just playing along like they did with Intel on CPUs. They’ll keep competing on price-to-performance for a few years, then drop something that puts them back on top (or at least near it).
werefreeatlast@lemmy.world 1 month ago
Buy 800 of those or buy a house. Pick.
Masamune@lemmy.world 1 month ago
I can’t do either of those. I chose Option C, a hospital visit.
VindictiveJudge@lemmy.world 1 month ago
Unfortunately, that’s the anti-scalper countermeasure. Crippling their crypto-mining potential didn’t impact scalping very much, so they increased the price with the RTX 40 series. The RTX 40s were much easier to find than the RTX 30s were, so here we are for the RTX 50s. They’re already on the edge of what people will pay, so they’re less attractive to scalpers. We’ll probably see an initial wave of scalped 5090s for $3500-$4000, then it will drop off after a few months and the market will mostly have un-scalped ones with fancy coolers for $2200-$2500 from Zotac, MSI, Gigabyte, etc.
b34k@lemmy.world 1 month ago
The switch from proof of work to proof of stake in ETH right before the 40 series launch was the primary driver of the increased availability.
nova_ad_vitum@lemmy.ca 1 month ago
The existence of scalpers means demand exceeds supply. Pricing the cards this high is a countermeasure against scalpers… in that Nvidia wants to capture the margin the scalpers would have made.
MDCCCLV@lemmy.ca 1 month ago
Not really a countermeasure, but the scalping certainly proved that there are a lot of people willing to buy their stuff at high prices.
inclementimmigrant@lemmy.world 1 month ago
This is absolutely 3dfx level of screwing over consumers and all about just faking frames to get their “performance”.
Breve@pawb.social 1 month ago
They aren’t making graphics cards anymore, they’re making AI processors that happen to do graphics using AI.
Knock_Knock_Lemmy_In@lemmy.world 1 month ago
What if I’m buying a graphics card to run Flux or an LLM locally. Aren’t these cards good for those use cases?
Fluffy_Ruffs@lemmy.world 1 month ago
Welcome to the future
daddy32@lemmy.world 1 month ago
Except you cannot use them for AI commercially, or at least not in a data center setting.
TastyWheat@lemmy.world 1 month ago
“T-BUFFER! MOTION BLUR! External power supplies! Wait, why isn’t anyone buying this?”
Zarxrax@lemmy.world 1 month ago
LOL, their demo shows Cyberpunk running at a mere 27fps on the 5090 with DLSS off. Is that supposed to sell me on this product?
Poopfeast420@discuss.tchncs.de 1 month ago
The 4090 gets like sub 20fps without DLSS and stuff. Seems like a good improvement.
Blackmist@feddit.uk 1 month ago
Their whole gaming business model now is encouraging devs to stick in features that have no hope of rendering quickly, in order to sell this new frame-generation rubbish.
sturmblast@lemmy.world 1 month ago
I’ll just keep buying AMD thanks.
deur@feddit.nl 1 month ago
Okay losers, time for you to spend obscene amounts to do your part in funding the terrible shit company nvidia.
bitjunkie@lemmy.world 1 month ago
I’m sure these will be great options in 5 years when the dust finally settles on the scalper market and they’re about to roll out RTX 6xxx.
frezik@midwest.social 1 month ago
Scalpers were basically non existent in the 4xxx series. They’re not some boogieman that always raises prices. They work under certain market conditions, and there’s no particular reason to think this generation will be much different than the last.
Critical_Thinker@lemm.ee 1 month ago
The 4090 basically never went for MSRP until Q4 2024… and now it’s OOS everywhere.
bitjunkie@lemmy.world 1 month ago
Scalpers were basically non existent in the 4xxx series.
Bull fucking shit. I was trying to buy a 4090 for like a year. Couldn’t find anything even approaching retail. Most were $2.3k+.
KingThrillgore@lemmy.ml 1 month ago
Two problems, they are big ones:
- The hardware is expensive for a marginal improvement
- The games coming out that best leverage the features like Ray tracing are also expensive and not good
frezik@midwest.social 1 month ago
Nvidia claims the 5070 will give 4090 performance. That’s a huge generation uplift if it’s true. Of course, we’ll have to wait for independent benchmarks to confirm that.
The best ray tracing games I’ve seen are applying it to older games, like Quake II or Minecraft.
lazynooblet@lazysoci.al 1 month ago
I expect they can tell us it achieves that because, under the hood, DLSS 4 inflates the numbers when enabled.
But is that a fair comparison?
caut_R@lemmy.world 1 month ago
My last new graphics card was a 1080, I‘ve bought second hand since then and will keep doing that cause these prices are…
Don_alForno@feddit.org 1 month ago
I’m still using that 1080Ti and currently see no reason to upgrade.
DoucheBagMcSwag@lemmy.dbzer0.com 1 month ago
FF VII Rebirth will require an RTX 20 series card, but I think that’s down to lack of optimization.
Critical_Thinker@lemm.ee 1 month ago
The Far Cry benchmark is the most telling. Looks like it’s around a 15% uplift based on that.
Subverb@lemmy.world 1 month ago
About two months ago I upgraded from 3090 to 4090. On my 1440p I basically couldn’t tell. I play mostly MMOs and ARPGs.
darthsid@lemmy.world 1 month ago
Shouldn’t have upgraded then….
Critical_Thinker@lemm.ee 1 month ago
Those genres aren’t really known for having brutal performance requirements. You have to play the bleeding edge stuff that adds prototype graphics postprocessing in their ultra or optional settings.
When you compare non RT performance the frame delta is tiny. When you compare RT it’s a lot bigger. I think most of the RT implementations are very flawed today and that it’s largely snake oil so far, but some people are obsessed.
I will say you can probably undervolt / underclock / power throttle that 4090 and get great frames per watt.
vane@lemmy.world 1 month ago
Nvidia Core i5.
vane@lemmy.world 1 month ago
More Shenanigans. Moooore !
TheFeatureCreature@lemmy.world 1 month ago
No thanks; I’m good. Still feeling the sting over buying my 4080 Super last spring. Also it’s doing me just fine for my work and for games.
makingStuffForFun@lemmy.ml 1 month ago
I’m waiting for that new Intel gear.
TheHobbyist@lemmy.zip 1 month ago
The performance-improvement claims are a bit shady, as they compare the old frame-generation technique, which creates only one extra frame for every rendered frame, with the next-gen FG, which can generate up to 3.
All the Nvidia performance plots I’ve seen mention this only at the bottom, supposedly making the comparison very favorable to the 5000 series GPUs.
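The asymmetry is easy to see with a little arithmetic; a minimal sketch (the generated-frames-per-rendered-frame counts come from the comment above, while the 28 fps figure is a made-up illustration):

```python
def displayed_fps(rendered_fps: float, generated_per_rendered: int) -> float:
    """Displayed frame rate when frame generation inserts extra frames.

    Each rendered frame is followed by `generated_per_rendered` interpolated
    frames, so the displayed rate is a simple multiple of the rendered rate.
    """
    return rendered_fps * (1 + generated_per_rendered)

# Hypothetical 28 fps natively rendered on both generations of hardware:
old_fg = displayed_fps(28, 1)  # older FG: 1 generated frame -> 56 fps displayed
new_fg = displayed_fps(28, 3)  # next-gen FG: up to 3 generated -> 112 fps displayed
print(old_fg, new_fg)
```

The 2x gap in the marketing charts can come entirely from the extra generated frames, with zero change in how fast the GPU actually renders.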
Tetsuo@jlai.lu 1 month ago
Thanks for the heads up.
I really don’t like that new Frame interpolation tech and think it’s almost only useful to marketers but not for actual gaming.
At least I wouldn’t touch it with any competitive game.
Hopefully we will get third-party benchmarks soon, without the bullshit perf numbers from Nvidia.
Poopfeast420@discuss.tchncs.de 1 month ago
On the slide with the performance graphs, Far Cry and A Plague Tale should be more representative if you want to ignore FG. That’s still only two games, with first-party benchmarks, so wait for third-party numbers anyway.