scrubbles@poptalk.scrubbles.tech 1 year ago
Runs great on my 5000 series AMD CPU and 3000 series Nvidia GPU, those came out 2 years ago now, and that’s averaging about 50fps on a 4k monitor.
If that isn’t optimized, idk what is. Yes, I had high end stuff from 2 years ago, but now it’s solid middle range.
People are so damn entitled. There used to be a time in PC gaming where if you were more than a year out of date you’d have to scale it down to windowed 640x480. If you want “ultra” settings you need an “ultra” PC, which means swapping out parts every few years. Otherwise be content with High settings at 1080p, a perfectly valid option.
You’re missing the point.
There are a lot of games that look much better AND run much better.
It’s not about how often you upgrade.
I mean, yeah, but by what metric? There are a thousand things that can affect performance, not just what we see. We know Starfield has a massive drive footprint, so most of that is probably high-end textures, shaders, etc. Then the world sizes themselves are large. I don’t know, how do you directly compare two games that look alike? Red Dead 2 still looks amazing, but at 5 years old it’s already starting to show its age; it also had a fixed map size, so it got away with a few things. Every game is going to have differences.
My ultimate point is that you can’t expect to get ultra settings on a brand new game unless you’re actively keeping up on hardware. There are no rules saying you have to play at 4K ultra settings, and people getting upset about that are nuts to me. It’s a brand new game; my original comment was me saying that I’m surprised it runs as well as it does on last-generation hardware.
I played Borderlands 1 on my old ATI card back in 2009 in windowed mode, at 800x600, on Low settings. My card was a few years old and that’s the best I could do, but I loved it. The expectation that a brand new game has to work flawlessly on older hardware is a new phenomenon to me, it’s definitely not how we got started in PC gaming.
People are entitled because they don’t want to spend thousands of dollars on components only for them to be outdated within a fraction of the lifecycle of a console?
How about all the people that have the minimum or recommended specs and still can’t run the game without constant stuttering? I meet the recommended specs and I’m playing on low everything with upscaling turned on and my game turns into a laggy mess and runs at 15fps if I have the gall to use the pause menu in a populated area. I shouldn’t have to save and reload the game just to get it to run smoothly.
Bethesda either lied about the minimum/recommended requirements or they lied about optimization. Let’s not forget about their history of janky PC releases, dating back to Oblivion, which was 6 games and 17 versions of Skyrim ago.
and no one is saying they have to, that’s my point that keeps getting overlooked. If someone wants to play sick 4k 120fps that’s awesome, but you’re going to pay a premium for that. If people are upset because they can’t play ultra settings on hardware that came out 5 years ago, to me that’s snobby behavior. The choice is either pay up for top of the line hardware, or be happy with medium settings and maybe you go back in a few years and play it on ultra.
If the game doesn’t play at all on lower hardware (like Cyberpunk did on release), then that is not fair and needs to be addressed. The game plain did not work for lower end hardware, and that’s not fair at all, it wasn’t about how well it played, it’s that it didn’t play.
4k 120fps would be great
But the 4090 only averages 75fps at 4K on the high preset. The 7900 XTX averages 74fps.
You can go skim the Gamers Nexus review of the 7700xt, it has a portion dedicated to Starfield in it.
Idk what to tell you mate, I’m on a 3080 at 1440p and I’m averaging 60fps. My settings are all ultra except for a couple, with FSR on at 75% resolution scale. To me, that’s optimized; I don’t even expect 60fps in an RPG. In Cyberpunk I’ve never had higher than 50.
Consoles don’t even last their whole lifetime anymore; both machines required Pro models to keep up with performance last gen, and rumours have it Sony are gearing up for one this gen too.
I have an AMD 3800X and an RTX2070 and I am barely seeing 30fps on the lowest settings at 1080p and 1440p.
DOOM Eternal runs just fine at 144fps on High and looks miles better.
It’s just not optimised.
Doom eternal also came out 3.5 years ago now, and your card is nearly 5 years old. That’s the performance I would expect from a card that is that old playing a brand new game that was meant to be a stretch.
I’m sorry, but this is how PC gaming works. Brand new cards are really only awesome for about a year, then good for a few years after that, then you start getting some new releases that make you think it’s about time. I’ve had the 3000 series, the 1000 series, before that I was an ATI guy with some sapphire, and before that the ATI 5000 series. It’s just how it goes in PC gaming, this is nothing new
Curious if you can name one thing Starfield is doing that wasn’t possible in a game from 2017.
I mean, there isn’t one thing you can point to and say “ah ha, that’s causing all the lag”; things just take up more space, more compute power, and more memory as games grow. As hardware capabilities grow, software will find a way to utilize them. But if you want a few things:
I don’t know, what do you want? Like a list of everything that’s happened since then? Entire engines have come and gone in that time. Engines we used back then are on at least a new version now, Starfield’s included. I mean, I don’t understand what you’re asking, because to me it comes off as “Yeah well Unreal 5 has the same settings as 4 to me, so it’s basically the same”.
Textures are larger. 4K was just getting rolling in 2017 (pre-RDR2, after all), and to accommodate 4K, textures had to be scaled up in both width and height, so that’s 4x the memory and 4x the space on the drive.
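The 4x scaling claim above is easy to check with quick arithmetic, assuming uncompressed RGBA textures (a simplification; real engines use block compression and mipmaps, which shrink the absolute numbers but keep the same ratio between resolution tiers):

```python
# Rough uncompressed texture memory: width * height * bytes per pixel.
# Doubling each dimension (2K -> 4K) quadruples both memory and disk footprint.

def texture_bytes(width: int, height: int, bytes_per_pixel: int = 4) -> int:
    """Uncompressed size of one texture in bytes (RGBA8 by default)."""
    return width * height * bytes_per_pixel

mem_2k = texture_bytes(2048, 2048)   # 16,777,216 bytes (~16.8 MB)
mem_4k = texture_bytes(4096, 4096)   # 67,108,864 bytes (~67.1 MB)
print(mem_4k / mem_2k)               # 4.0
```

The same 4x factor applies per mip level, which is why texture quality settings exist at all: dropping one resolution tier quarters the memory cost of every texture in the scene.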
Texture resolution has not considerably affected performance since the 90s.
Things like anti-aliasing, for example: they’re always something like 8x, but that’s 8x the resolution, and base resolutions themselves have only gone up over time.
Wtf are you talking about, nobody uses SSAA these days.
If you’re going to try and argue this point at least understand what’s going on.
The game is not doing anything that other games haven’t achieved in a more performant way.
Texture resolution has not considerably affected performance since the 90s.
If this were true there wouldn’t be low resolution textures at lower settings, high resolutions take up exponentially more space, memory, and time to compute. I’m definitely not going to be re-learning what I know about games from Edgelord here.
Texture resolution has not considerably affected performance since the 90s.
lol, try to play a game with 4K textures in 4K on an NVIDIA graphics card with not enough VRAM and you’ll see how it affects your performance 😅
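The VRAM point can be illustrated with a toy budget calculation. All numbers below are made up for illustration, not measured from any real game; the mechanism is that once resident textures exceed VRAM, the overflow must be streamed from system RAM over PCIe, which is where the hitching comes from:

```python
# Toy VRAM budget: how much texture data spills past the card's memory.
# Hypothetical numbers throughout; real allocators also hold framebuffers,
# geometry, and driver overhead, so the real budget is even tighter.

GIB = 1024 ** 3

def overflow_gib(num_4k_textures: int, vram_gib: float) -> float:
    """GiB of uncompressed 4K RGBA textures that don't fit in VRAM."""
    bytes_per_texture = 4096 * 4096 * 4
    total = num_4k_textures * bytes_per_texture
    return max(0.0, total / GIB - vram_gib)

# 200 uncompressed 4K textures is exactly 12.5 GiB:
print(overflow_gib(200, 16))  # 0.0 -- fits on a 16 GiB card
print(overflow_gib(200, 8))   # 4.5 -- spills 4.5 GiB on an 8 GiB card
```

The spilled gigabytes are what turn a settings tier that “should” run fine into a stutter-fest on a lower-VRAM card, even when the GPU core itself is fast enough.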
I wouldn’t say that Starfield is optimized as hell, but I think it runs reasonably and many people will fall flat on their asses in the next months because they will realize that their beloved “high end rig” is mostly dated as fuck.
To run games on newer engines (like UE5) with acceptable framerates and details you need a combination of modern components and not just a “beefy” gpu…
So yeah get used to low framerates if you still have components from like 4 years ago
Changing graphics settings in this game barely affects performance anyway.
That sounds like you are CPU bound…
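That diagnosis follows a common rule of thumb: if lowering graphics settings doesn’t help and the GPU sits well below full utilization while frame times stay long, the bottleneck is likely the CPU. A minimal sketch of that heuristic, where the 90% thresholds are an assumption, not any standard:

```python
# Crude bottleneck heuristic from utilization samples (e.g. read off an
# overlay like MSI Afterburner). Thresholds are rules of thumb, not gospel.

def likely_bottleneck(gpu_util_pct: float, cpu_util_pct: float) -> str:
    """Classify a frame-rate limit from average GPU/CPU utilization."""
    if gpu_util_pct >= 90:
        return "gpu-bound"      # GPU saturated: settings/resolution matter
    if cpu_util_pct >= 90:
        return "cpu-bound"      # GPU starved: lowering settings won't help
    return "unclear (could be engine limits, I/O, or a frame cap)"

print(likely_bottleneck(98, 60))  # gpu-bound
print(likely_bottleneck(55, 95))  # cpu-bound
```

This also matches the symptom above: a CPU-bound game shows almost no gain from dropping resolution or quality, because the GPU was never the thing holding the frame back.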
I’m running it on a Ryzen 1600 AF and a 1070. NOT Ti. 1440 at 66% resolution. Mix of mostly low some medium. 100% GPU and 45% CPU usage. 30 fps solid in cities. I won’t complain at all. I’m just happy it runs at all solidly under minimum spec.
This is a great way to view it, and I think you’re getting excellent performance for that card. Kudos to you for getting it running!
Why do people use “entitled” like it’s a bad thing? Why wouldn’t consumers feel entitled, as opposed to spending money as though it’s an act of charity? Pretty weird how the mindset of gamers has shifted over the years in a way where the fact that they are consumers has been forgotten.
I say entitled because gamers should just be happy, be happy with the hardware you have even if it can’t put out 4k, turn off the FPS counter, play the game. If you’re enjoying it, who cares if it occasionally dips down to 55? The entitlement comes from expecting game makers to produce games that run flawlessly at ultra settings on hardware that’s several years old. If you want that luxury, you have to spend a shitload of money on the top of the line gear, otherwise just be happy with your rig.
Products are just products designed to get money out of people. I don’t have an appreciation for them like it’s some sports team. It comes down simply to whether something is worth spending money on or not. Being entitled is a good thing, since it encourages less consumerist behavior; a lot of people could use less frivolous spending in their lives.
You can try to spin it as a negative, but I find this hail-corporate approach to consumerism very odd.
I’m actually agreeing with you, people should be happy to play the games on their older hardware even if it can’t pull down the ultra specs. We don’t need to always be buying the latest generation of GPUs, it’s okay to play on medium specs. We don’t have to have the top of the line latest card/processor/drive, we can enjoy ours for years, even if it means newer games don’t play on ultra. If you have the funds to buy new ones every generation, more power to you, but I buy my cards to last 8-10 years. The flipside is just expect that the games won’t run on ultra.
I'm running it on a Ryzen 5 2600 and an RX 570, and it seems to run relatively well other than CTD every hour or so.
Runs great on my 5000 series AMD CPU and 3000 series Nvidia GPU
Just specifying the series doesn’t really say much. Based on that and the release year you could be running a 5600X and RTX3060 or you could be running a 5950X and RTX3090. There’s something like a ~2.5x performance gap between those.
PC gamers enjoyed a bit of a respite from constantly needing to upgrade during the PS4/Xbone era. Those machines were fairly low end even at launch and with them being the primary development formats for most games, it was easy to optimize PC ports even on old hardware.
Then the new consoles came out that were a genuine jump in tech again as consoles used to be, and now PCs need to be upgraded to keep up and people that got used to the last decade on PC are upset they can’t rock hardware for multiple years anymore.
I’m happy with my games at 1080 and I’m going to be sad when they start requiring higher resolutions.
I have a PC with 5800X, 3080 Ti, and 64 GB DDR4-3600. I play at 1440p with 80% render scale, Medium-High settings (mostly Medium) and it’s barely above 60 FPS. It runs like shit.
Why does it need to go above 60fps? It’s not a twitch FPS where every bit of latency counts. It’s an RPG and 60 is perfectly smooth.
60 FPS is quite smooth and playable but far from perfectly smooth. There’s still noticeable juddering on continuous camera motion.
I’m curious, I have a 3080 as well and I’m getting ultra across the board and I average 60fps, maybe a setting or two is at high, also 1440p. Installed on an SSD, right?
Exactly my point. I want 90 FPS at least and lowering the settings didn’t help at all.
Oh, well then I’d readjust expectations. Doom and fast-paced shooters usually go up that high because they have quick, twitchy combat, but RPGs focus on fidelity over framerate. Hell, Skyrim at launch only offered 30fps, and I mentioned I never got Cyberpunk above 45. 60 in an RPG is really a good time; don’t let the number on the screen dictate your experience. Comparing a fast shooter and an RPG like this is apples and oranges.
I’m honestly shocked a game like this can run at 60fps. <45 and I start to get annoyed in RPGs. I’d expect if you wanted framerates that high you may be needing to window it at 1080 and lowering the settings further.
ocassionallyaduck@lemmy.world 1 year ago
I mean, this was also before video cards cost as much as some used cars or more than a month’s rent for some people.
scrubbles@poptalk.scrubbles.tech 1 year ago
I’m not saying it’s not an expensive hobby, it is. PC gaming on ultra is an incredibly expensive hobby. But that’s the price of the hobby. Saying that a game isn’t optimized because it doesn’t run ultra settings on hardware that came out 4+ years ago is nothing new, and to me it’s a weird thing to demand. If you want ultra, you pay for ultra prices. If you don’t want to/can’t, that’s 100% acceptable, but then just be content to play on High settings, maybe 1080p.
If PC gaming is too expensive in general that’s why consoles exist. You get a pretty great experience on a piece of hardware that’s only a few hundred dollars.
ocassionallyaduck@lemmy.world 1 year ago
PC gaming didn’t used to be THIS expensive.
You could build an entire machine for the cost of a 4090.
scrubbles@poptalk.scrubbles.tech 1 year ago
The 4090 is definitely nuts, but with inflation the 4080 is right about on par. As usual, team red is very close in performance for a much lower cost. You don’t have to constantly run the highest of the high end to get those sweet graphics; it’s about personal taste. Personally, paying 40% more for a 10% jump in graphics isn’t for me, but every 2-3 generations I step back and reanalyze. Tbh it’s usually a game like Starfield that makes me think about getting a new one. Runs great for now though; I probably have at least 1, hopefully 2, more generations before I upgrade again.
NuPNuA@lemm.ee 1 year ago
I don’t know if you noticed, but everything became more expensive in the last year. Food, housing, etc, it’s called inflation and PC parts aren’t immune.
phillaholic@lemm.ee 1 year ago
yea idk if used cars or rent are good comparisons.
gamermanh@lemmy.dbzer0.com 1 year ago
4090 MSRP: $1,599
Rent for a 3 bedroom in a nearby town: $1,495/month
JJROKCZ@lemmy.world 1 year ago
For only 300 more I have a mortgage on a 2,000 sq ft home in a large American city…
I have a 6900xt because I got a promotion recently and wanted to treat myself, finally getting off the r9-300 series, but it wasn’t 1600; I think I paid 1100.
phillaholic@lemm.ee 1 year ago
1,500 gets you a closet with a window around me. Prices are fucked.