Long gone are the days when devs would need to write their own tools and even engines to get a game running. Some (like Naughty Dog) would even hack the hardware to work around its limitations.
Re-using engines has been around for basically as long as game development has existed. This idea of some mythical age when game development was more "pure" is a fantasy. What has changed is that expectations for AAA titles have grown to the point where it's extremely difficult to roll your own engine unless you're committed to many, many years of work.
Not to mention, rolling your own certainly doesn't guarantee that the engine performs well. Look at Starfield or Baldur's Gate 3: both have noticeable performance issues, and both are built on in-house engines by their respective studios.
stardreamer@lemmy.blahaj.zone 1 year ago
The problem is that hardware has come a long way and is now much harder to understand.
Back in the old days you had consoles with custom MIPS processors, usually augmented with special vector ops, and that was it. No out-of-order execution, no DMA management, no GPU offloading, etc.
These days, you have all of that on x86, plus branch predictors, complex cache hierarchies, various on-chip interconnects, etc. It's gotten so bad that most CS undergrad degrees only teach a simplified subset of actual computer architecture. How many people actually write optimized inline assembly these days? You need to be a crazy hacker to pull off what game devs in the '80s and '90s used to do. And crazy hackers aren't in the game industry anymore; they get paid way better working on high-performance simulation software, networking, or embedded programming.
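For anyone who hasn't seen that kind of hand-tuning: here's a minimal sketch (purely illustrative, the function names are made up, and it's not from any real engine) contrasting the plain C most devs write today with '90s-style manual vectorization via SSE intrinsics. Part of why the skill became niche is that modern compilers usually auto-vectorize the plain loop just fine.

```c
#include <immintrin.h>  /* SSE intrinsics */
#include <stddef.h>

/* Plain C: write the obvious loop and trust the compiler to vectorize it. */
void scale_plain(float *dst, const float *src, float k, size_t n) {
    for (size_t i = 0; i < n; ++i)
        dst[i] = src[i] * k;
}

/* Hand-vectorized with SSE intrinsics, the spiritual successor to the
   hand-written assembly of the '80s/'90s. Assumes n is a multiple of 4
   to keep the sketch short; a real version would handle the remainder. */
void scale_sse(float *dst, const float *src, float k, size_t n) {
    __m128 kv = _mm_set1_ps(k);                     /* broadcast k to 4 lanes */
    for (size_t i = 0; i < n; i += 4) {
        __m128 v = _mm_loadu_ps(src + i);           /* load 4 floats */
        _mm_storeu_ps(dst + i, _mm_mul_ps(v, kv));  /* multiply, store 4 */
    }
}
```

And that's the easy, portable version; the old-school console equivalent meant knowing the exact pipeline and memory timings of one specific chip.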