Comment on Open source community figures out problems with performance in Starfield
notepass@feddit.de 1 year ago
The problem is so severe, in fact, that the aforementioned translation layer had to be updated specifically to handle Starfield as an exception to the usual handling of the issue.
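Translation layers like DXVK and VKD3D-Proton do support per-executable overrides in a plain config file, which is the usual mechanism for this kind of game-specific exception. A hypothetical sketch of what such an override looks like (the section name and chosen option are illustrative only, not the actual Starfield fix):

```ini
# dxvk.conf — options inside a [SomeGame.exe] section apply only to that executable.
# Hypothetical example; not the real Starfield workaround.
[Starfield.exe]
dxgi.customVendorId = 10de   # spoof the GPU vendor ID for titles with buggy vendor checks
```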
Viking_Hippie@lemmy.world 1 year ago
“I had to fix your shit in my shit because your shit was so fucked that it fucked my shit”
NocturnalMorning@lemmy.world 1 year ago
They released on two different platforms. PCs have so much variation in hardware that it’s not surprising there are issues.
AFaithfulNihilist@lemmy.world 1 year ago
It’s poorly optimized code, and the comments from the top brass have been “lol your PC sux” when they can’t even get it running right on their own hardware.
It’s not the variation in PCs that’s the issue; it’s a design and quality control issue. DirectX and Vulkan are the bread and butter of PC gaming. Microsoft developed DirectX to establish a common graphics framework for Windows, and a Microsoft game studio still fucked up working with it.
uis@lemmy.world 1 year ago
common graphics framework for Windows
They could have picked Khronos’ APIs, but they think they’re smarter than everyone else, including GPU developers.
Hadriscus@lemm.ee 1 year ago
This is just classic corpo shit: developing their own proprietary stuff when no one asked for it. Apple does it with Metal too. Then it falls on developers to write abstraction layers.
Hadriscus@lemm.ee 1 year ago
As far as I know, that’s what graphics drivers do, like, all the time. Every major title is handled specifically. I’m not a developer; I heard this from engine developers.
Blackmist@feddit.uk 1 year ago
This is how games and drivers have been for decades.
There are huge teams at AMD and Nvidia whose job it is to fix shit game code in the drivers. That’s why (a) the drivers are massive and (b) you need new drivers all the time if you play new games.
I read an excellent post a while ago here, by Promit.
www.gamedev.net/forums/topic/…/5215019/
It’s interesting to see that in the 8 years since he wrote it, the SLI/Crossfire “solution” has simply been to abandon the feature entirely, and that we still seem to be stuck in the same position with DX12. Your average game dev still has little idea how to get the best performance from the hardware, and hardware vendors are still patching things under the hood so they don’t look bad on benchmarks.
frododouchebaggins@lemmy.world 1 year ago
Yes they do. We know they do because current-gen consoles are frequently providing better fidelity and better stability than PC games. Not because PCs have inferior hardware, but because optimization is actually incredibly hard when your customer base is all running different hardware AND different drivers. So even when the hardware is “the same”, it’s not.
This has been true forever. It just took 30 years for high performance computing to be affordable enough to put in consoles. 30 years was a long time for PC gamers to feel superior. Now they enjoy humble pie and make comments like this on the internet to explain why things are so “bad”.
PC games are still great. Don’t let this bother you more than it should.
stonedemoman@lemmy.world 1 year ago
To attribute this most recent failure to an overabundance of hardware variety is a joke. The issue persists on all Nvidia and Intel cards. Why? Because it’s an oversight pertaining to the one thing they all share in common: their interaction with DirectX.
Let me repeat myself for the people in the back. The number of items they had to account for in this failure is one. One API.
emax_gomax@lemmy.world 1 year ago
This sounds more like hardware manufacturers haven’t provided a good enough abstraction layer across their devices, or they have (Vulkan) but everyone is stuck on bad APIs that don’t properly map to the hardware abstractions. Or, even more likely, the publishers cheaped out and pushed something to release before it was ready, like they have forever.
Shadywack@lemmy.world 1 year ago
It’s also a lack of specialized talent. There’s lots of great “talent” at game devs and even middleware devs. There’s just not much great talent that deals with renderers and API development. The vast majority of devs just lean on the middleware developer to push out the renderer codebase. In a situation like Bethesda running their own studio engine, they just don’t have the right people for it. This plagued the ’90s, when people were trying to code for Glide, OGL, DX5, 6, 7, 8, and 9. Many studios folded because they couldn’t get their tech to work with hardware acceleration.
Redredme@lemmy.world 1 year ago
PC gaming is and forever will be way better than games on consoles.
Why?
I’ve 3 letters for you.
R G B
( ͡° ͜ʖ ͡°)
Tbf, PC gaming was always a fight for performance. I never felt superior back in the day, fighting with QEMM, IRQs for the Sound Blaster, or Glide; it’s always been a shitshow. It was a super shitshow in the nineties, a bit better in the 2000s, and nowadays it’s become a tad better again.
But somehow I enjoyed that shitshow. Still do.
Redditiscancer789@lemmy.world 1 year ago
Lol
mattreb@feddit.it 1 year ago
I’ll give a different perspective on what you said: DX12 basically moved half of the complexity that would normally be managed by the driver onto the game/engine dev, who already has too much stuff to do: making the game. The idea is that “the game dev knows best how to optimize for its specific usage”, but in reality game devs have no time to deal with hardware complexity, and this is the result.