Nvidia's DLSS 5 Is a Slap in the Face to the Art of Video Game Design
Submitted 5 hours ago by inclementimmigrant@lemmy.world to games@lemmy.world
https://www.ign.com/articles/nvidias-dlss-5-is-a-slap-in-the-face-to-the-art-of-video-game-design
Comments
pennomi@lemmy.world 5 hours ago
I don’t see how this will stay consistent enough for art directors to sign off on it. It’s effectively just a hallucination based on your current video game frame.
orlyowl@piefed.ca 2 hours ago
warmaster@lemmy.world 4 hours ago
Nvidia DLSS (Deep Learning Super Sloppificator)
knightly@pawb.social 3 hours ago
Double Latency Slow Sloppification.
NONE_dc@lemmy.world 4 hours ago
Personally, I don’t know much about this technology. That said, I’ve heard that the original purpose of DLSS was to improve gaming performance, give you more FPS, and so on.
In that sense, many—myself included—are wondering: how is this slop generator going to improve game performance? How is giving Grace from RE9 a totally different face with makeup on going to improve my gaming experience?
SalamenceFury@piefed.social 4 hours ago
It makes “I fixed this ugly FEMALE character” chuds happy and that’s what matters to them.
inclementimmigrant@lemmy.world 4 hours ago
All of this upscaling, when it was introduced over a decade ago, was meant to give older cards a longer lease on life. Now it’s morphed into the mandatory way to get a stable framerate, since developers can just rely on DLSS (and to a lesser extent FSR) to reach an acceptable framerate instead of optimizing.
zewm@lemmy.world 4 hours ago
Saves payroll on artists which means more profits. #winning
snooggums@piefed.world 3 hours ago
The artists are still necessary for the slop generator to have something to slop all over.
affenlehrer@feddit.org 4 hours ago
I think the idea is that you could use low resolution / detail models that take up less RAM and are faster to process and DLSS hallucinates a high res image.
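To put rough numbers on that idea (my own back-of-envelope sketch with assumed resolutions, not Nvidia’s figures): shading work scales roughly with the number of pixels actually rendered, which is where the claimed speedups come from.

```python
# Back-of-envelope sketch: pixel-count savings from rendering low-res
# and upscaling to the output resolution. Resolutions are assumptions;
# real DLSS scaling factors vary by quality mode.

def pixel_count(width, height):
    return width * height

native = pixel_count(3840, 2160)     # 4K output resolution
internal = pixel_count(1920, 1080)   # assumed internal render resolution

# Upper bound on the speedup from upscaling alone, ignoring the
# fixed cost of running the upscaler itself:
savings = native / internal
print(f"Renders {savings:.0f}x fewer pixels per frame")
```

That 4x pixel saving is only an upper bound; the upscaling pass itself costs GPU time, so the real gain is smaller.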
BillyClark@piefed.social 4 hours ago
I find it interesting that the AI parts of the video have very little video in them. They have the original game moving along, and then they show the AI version and mostly keep it as a still. I suspect that they did this so that you can’t do a side-by-side comparison and see that the AI version doesn’t actually play as well as the original version.
Also, I’ve got to wonder about how it must feel to be an artist who worked on one of these games, and watch the thing you carefully hand-tuned to match the artistic vision of the game design be replaced by the mindless addition of wrinkles.
wraithcoop@programming.dev 2 hours ago
I think Jensen said during a presentation a long time ago that a long-term goal was to have graphics be entirely generated. This seems like a big step towards that.
yggstyle@lemmy.world 1 hour ago
Nvidia realized they hit a wall on rendering… So they went full on into something that you can’t properly benchmark - especially against competitors AND prior generations.
Nvidia went full snake oil… And really would like people not to notice.
Ephera@lemmy.ml 3 hours ago
Never cared for realism in games to begin with, so don’t particularly care to comment on how it looks, but I’ve been thinking that I genuinely find it creepy.
Not just Uncanny Valley material, but also having these faces stare at you, always fully lit, it just gives me the creeps, kind of like a panopticon situation.
I don’t fucking know if that’s my own trauma playing into that, where for the longest time, people looking at me generally meant they were about to bully me. But either way, I’m about to head to bed and genuinely feel like there’s a 20% chance I’ll have a mild nightmare from that shit.
This whole AI craze has been a wild ride of all kinds of nightmare fuel, from depictions with missing/additional limbs to the weirdest warping of objects+limbs in those fucking generated videos. And the worst part is that some folks seem to just not see it or not want to see it, so they keep using the nightmare generators.
ToiletFlushShowerScream@piefed.world 3 hours ago
Aren’t video games art? Why does Nvidia think it knows better than the artists? And maybe I’m the only one, but DLSS has never made any of my games look better, ever. It’s always looked worse, and I now permanently turn that shit off and regret paying a 20 percent higher price for something I don’t use. I don’t usually agree with IGN articles, but this one hits it on the head.
Sanguine@lemmy.dbzer0.com 4 hours ago
Easy to avoid nvidia products these days anyway. Overpriced, marginal performance over AMD, and tied to AI slop globally. Never been so easy to vote with your wallet.
Shadowcrawler@discuss.tchncs.de 4 hours ago
That would only work if they still gave a shit about gamers or the consumer market in general. They’ve become a full AI datacenter company; that’s where they make the money. I don’t think they’ll even be producing consumer graphics cards in five years.
Dariusmiles2123@sh.itjust.works 4 hours ago
To be honest it looks great and at first I thought about how impressive it is.
Then you think about how it modifies what was created, and how it could break consistency, with a character looking one way in one part of the game and different later on.
I don’t know what to think about technologies like FSR and DLSS…
I don’t think I mind the fake frames as it could give you more performance on something like a Steam Deck, but the rest is tricky…
Player2@sopuli.xyz 2 hours ago
The problem is that these ‘features’ give you the fake appearance of performance but at the cost of actual performance. And they aren’t even good in the first place. Upscaling is like smearing vaseline all over your screen and saying “see how good it looks” when it’s just a blurry mess, especially with temporal elements. Frame generation gives artificial smoothness but doesn’t help input latency which is the part of frame rate that actually matters.
The kicker is that it costs real frames to generate fake ones. The game ends up looking and feeling worse.
snooggums@piefed.world 3 hours ago
I dislike upscaling; it always looks off to me and frequently makes the edges of things weird, kind of like aliasing but different. After upgrading from a GTX 3060, where I only used it for a few games that were hectic enough that I didn’t notice it as much, to a 9070 XT, I turned off upscaling globally and run everything at native resolution. Games look so much better to me without DLSS or FSR, with fewer weird artifacts causing distractions.
I’ll take 20 fewer fps to not constantly feel like something is off, even if I can’t point out exactly what it is.
foodandart@lemmy.zip 4 hours ago
The magic here is that in most game settings you always have the option to turn it off.
KneeTitts@lemmy.world 4 hours ago
Everything tech bros touch, dies
SalamenceFury@piefed.social 4 hours ago
And there’s still people who think this is a good thing and an evolution in graphics. You can guess what their political alignments are pretty quickly.
webghost0101@sopuli.xyz 4 hours ago
The upside of enshittification in newer games and crap like this is that I’m less interested in playing the new stuff and enjoy replaying older games all the more.
Which also run at peak performance.
tyrant@lemmy.world 4 hours ago
It’s crazy to watch these developers march everyone’s jobs, including their own, off a cliff.
inclementimmigrant@lemmy.world 4 hours ago
I don’t think developers and artists have a choice when they’re being told by the publishers to do it.
tyrant@lemmy.world 2 hours ago
I get it, but there’s always a choice
leave_it_blank@lemmy.world 4 hours ago
If this could be applied to games like Return to Castle Wolfenstein, System Shock 2, Soldier of Fortune, Medal of Honor, Tron 2.0, Call of Duty and all other titles from that era, I would not mind in the slightest.
A man can dream…
inclementimmigrant@lemmy.world 4 hours ago
Well the last time that was done we ended up with the GTA trilogy mess with AI upscaling/remastering.
leave_it_blank@lemmy.world 3 hours ago
I did not mean that crap of course!
But if technology existed that would enhance graphics of old games to look like stuff from 2016 or so that would be pretty neat. I would definitely use it!
But that won’t happen I guess…
HerrVorragend@lemmy.world 4 hours ago
I am not completely against this technology if it can be used to ‘remaster’ older games.
In the current climate of rising hardware prices, it just seems unnecessary and tone-deaf to introduce such tech that requires not one, but TWO top of the line GPUs to run.
Yes, Nvidia is hugely into AI and machine learning, but they might be known as Slopvidia soon if they keep forcing this stuff on gamers.
knightly@pawb.social 3 hours ago
It can’t. Remastering games is an entirely different process that requires artistic direction. Piping a game’s video output through an AI filter just takes the original game and smears a bunch of slop over it.
glitzer_gadze@feddit.org 3 hours ago
This article speaks to my heart
krisevol@lemmus.org 4 hours ago
Nvidia filters have been a thing for years. This isn’t any different.
raicon@lemmy.world 5 hours ago
Finally, the slopware generation
1984@lemmy.today 4 hours ago
Being young today must be fun. You just missed out on all the good stuff in the ’80s, ’90s, and 2000s to experience this mess we have now.