naqahdah
@naqahdah@my.lserver.dev
- Comment on Open source community figures out problems with performance in Starfield 1 year ago:
I’m inclined to believe this, and this likely isn’t even the whole extent of it. I’ve been playing on a Series X, but decided to check it out on my ROG Ally. On Low, at 720p with FSR2 on, I’d get 25-30fps somewhere like New Atlantis. I downloaded a tweaked .ini for the Ultra preset, and now not only does the game look much better, but the city is up closer to 40fps, with most other areas at 45-60+. Makes me wonder what they thought was worth the massive performance cost of the default settings, given there’s no real visual improvement to show for it.
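For anyone wondering what a “tweaked .ini” means in practice, here’s a minimal sketch, assuming the usual Bethesda setup where a StarfieldCustom.ini in Documents\My Games\Starfield overrides the game’s generated settings. The key names below are placeholders of my own, not the actual engine variables the community files set:

```ini
; StarfieldCustom.ini, placed in Documents\My Games\Starfield, where it
; overrides the settings the game generates (standard Bethesda ini layering).
; NOTE: these key names are illustrative placeholders, not real variables.
; The community tweak files set actual [Display] values, trading the most
; expensive defaults (shadows, volumetrics, render scale) for frame rate.
[Display]
iShadowQualityExample=2        ; hypothetical: step shadows down from Ultra
iVolumetricQualityExample=1    ; hypothetical: cheaper volumetric lighting
fRenderScaleExample=0.75       ; hypothetical: lower internal render scale, FSR2 upscales the rest
```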
Another odd thing: if I’m playing Cyberpunk or something similar, this thing sits in the 90%+ CPU and GPU utilization range, with temps above 90°C. Starfield? GPU is at like 99%, CPU sits around 30%, and the temp stays at or below 70°C, which basically never happens in any other “AAA” game. I could buy Todd’s comments if the frame rate were bad with the hardware genuinely maxed out, but a chip that’s nowhere near its usual utilization and thermals on a handheld with an APU points to something less simple.
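If you want to log that utilization gap yourself, here’s a rough Windows-only sketch (my own illustration, nothing from the article): CPU usage from psutil, GPU usage summed from the built-in “GPU Engine” performance counters, which is roughly what Task Manager reads, so it also works on AMD APUs like the Ally’s:

```python
# Rough Windows-only sketch: log CPU vs. GPU-3D utilization once per second.
# CPU comes from psutil; GPU comes from the "GPU Engine" performance counters
# (roughly what Task Manager reads), queried via the built-in typeperf tool.
# Counter paths assume an English-language Windows install.
import csv
import io
import subprocess

import psutil  # third-party: pip install psutil

GPU_3D_COUNTER = r"\GPU Engine(*engtype_3D)\Utilization Percentage"


def gpu_3d_utilization() -> float:
    """Sum 3D-engine utilization across all GPU engine instances."""
    out = subprocess.run(
        ["typeperf", GPU_3D_COUNTER, "-sc", "1"],
        capture_output=True, text=True, check=True,
    ).stdout
    # typeperf emits CSV: a header row, then one row per sample
    # (timestamp first, then one column per engine instance).
    rows = [r for r in csv.reader(io.StringIO(out)) if len(r) > 1]
    total = 0.0
    for field in rows[-1][1:]:  # last sample, timestamp column dropped
        try:
            total += float(field)
        except ValueError:
            pass  # blank column for an engine with no data this sample
    return total


if __name__ == "__main__":
    while True:
        cpu = psutil.cpu_percent(interval=1.0)  # averaged over the interval
        print(f"CPU {cpu:5.1f}%  |  GPU(3D) {gpu_3d_utilization():5.1f}%")
```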
I’m hoping the work from Hans finds its way to all platforms (in one way or another), because I’d love to use the Series X, but 30fps with weird HDR on a 120Hz OLED TV actually makes me a little nauseous after playing for a while, which isn’t something I usually have a problem with.