I mean, there isn’t one thing you can point to and say “ah ha, that’s causing all the lag”; things just take up more space, more compute, more memory as games grow. As hardware capabilities grow, software will find a way to utilize them. But if you want a few things:
- Textures are larger. 4K was just getting rolling in 2017 (pre-RDR2, after all), so to accommodate 4K, textures had to be scaled up (and remember that’s width and height, so that’s 4x the memory and 4x the space on the drive)
- Engines have generally grown more high-fidelity: more particles, more fog, raytracing (not in Starfield, but it’s younger than 2017), etc. All of these higher-fidelity features require more compute power. Take anti-aliasing, for example: it’s always something like 8x, but that’s 8x the resolution, and resolutions themselves have only gone up over time.
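The texture-scaling math in the list above can be sketched quickly. A minimal illustration, assuming uncompressed RGBA8 textures (real engines use compressed formats like BCn plus mipmaps, but the quadratic scaling is the same):

```python
# Doubling a texture's width AND height quadruples its pixel count,
# hence ~4x the memory and disk footprint.
# Illustrative only: assumes uncompressed RGBA at 4 bytes per pixel.

def texture_bytes(width, height, bytes_per_pixel=4):
    """Uncompressed texture size in bytes."""
    return width * height * bytes_per_pixel

MB = 1024 * 1024
print(f"2K (2048x2048): {texture_bytes(2048, 2048) / MB:.0f} MB")  # 16 MB
print(f"4K (4096x4096): {texture_bytes(4096, 4096) / MB:.0f} MB")  # 64 MB
```

Going from a 2048x2048 to a 4096x4096 texture is exactly a 4x jump, which is where the “4x the memory and 4x the space on drive” figure comes from.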
I don’t know, what do you want? A list of everything that’s happened since then? Entire engines have come and gone in that time. Engines we used back then are at least a version newer now, Starfield’s included. I mean, I don’t understand what you’re asking, because to me it comes off as “Yeah well, Unreal 5 has the same settings as 4 to me, so it’s basically the same.”
Edgelord_Of_Tomorrow@lemmy.world 1 year ago
Texture resolution has not considerably affected performance since the 90s.
Wtf are you talking about, nobody uses SSAA these days.
If you’re going to try and argue this point at least understand what’s going on.
The game is not doing anything that other games haven’t achieved in a more performant way.
scrubbles@poptalk.scrubbles.tech 1 year ago
If this were true there wouldn’t be low-resolution textures at lower settings; higher resolutions take up exponentially more space, memory, and compute time. I’m definitely not going to be re-learning what I know about games from Edgelord here.
Edgelord_Of_Tomorrow@lemmy.world 1 year ago
You’re being disingenuous mate. On a machine with adequate VRAM there is zero performance difference.
avater@lemmy.world 1 year ago
ohhhh so it DOES affect performance after all 😂
avater@lemmy.world 1 year ago
lol. try to play a game with 4K textures in 4K on an NVIDIA graphics card without enough VRAM and you’ll see how it affects your performance 😅
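The VRAM point can be made concrete with a toy budget check. A rough sketch (all numbers are illustrative, not measured from any real game): once the texture working set plus framebuffers exceeds VRAM, the driver spills resources to system RAM over PCIe and frame times suffer.

```python
# Toy VRAM-budget check. When the total working set exceeds card memory,
# textures get paged over PCIe, which is what tanks performance.
# other_usage_mb stands in for framebuffers, geometry, etc. (assumed value).

def fits_in_vram(texture_sizes_mb, vram_mb, other_usage_mb=2048):
    """True if all textures plus other GPU allocations fit in VRAM."""
    return sum(texture_sizes_mb) + other_usage_mb <= vram_mb

# A few hundred uncompressed 4K RGBA textures at ~64 MB each
textures = [64] * 500
print(fits_in_vram(textures, vram_mb=12288))  # 12 GB card: False
print(fits_in_vram(textures, vram_mb=49152))  # 48 GB card: True
```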
I wouldn’t say that Starfield is optimized as hell, but I think it runs reasonably, and many people will fall flat on their asses in the coming months when they realize their beloved “high end rig” is mostly dated as fuck.
To run games on newer engines (like UE5) with acceptable framerates and details you need a combination of modern components and not just a “beefy” gpu…
So yeah get used to low framerates if you still have components from like 4 years ago
That sounds like you are CPU bound…
Edgelord_Of_Tomorrow@lemmy.world 1 year ago
If a 5950 is CPU bound then the game is badly optimised.
avater@lemmy.world 1 year ago
I don’t know and I don’t care what’s wrong with your system, but the AMD driver tells me I’m averaging 87fps at high details on a 5800X and a Radeon 6900, a system that’s now two years old, and I think that’s just fine for 1440p.
So yeah, the game is not unoptimized. Sure, it could use a few patches and performance will get better (remember, it’s a fucking Bethesda game, for Christ’s sake…), but for many people the truth will be upgrading their rig or playing on Xbox.
regbin_@lemmy.world 1 year ago
The game might be much more CPU bound on Nvidia cards, probably due to shitty Nvidia drivers.
I have a 5800X paired with a 3080 Ti and I can’t get my frame rate to go any higher than 60s in cities.
avater@lemmy.world 1 year ago
sorry to hear that, no problems here with an AMD card, but I’ve been team AMD all my life so I have no experience with Nvidia cards and their drivers