AdrianTheFrog
@AdrianTheFrog@lemmy.world
- Comment on we are creators 1 day ago:
I feel like the pictures over-exaggerate the difference a bit. The Wright Flyer was literally made by two people in their spare time, while the space program was around 4% of all federal spending and had almost half a million people working on it in some capacity.
- Comment on Perpetual motion eludes us again. 5 days ago:
This works in Kerbal Space Program
- Comment on Anon is a game dev 5 days ago:
The advantage of making your own engine is that you can specialize for your specific gameplay.
- Comment on Anon turns on raytracing 5 days ago:
For the resolution of the texture to need to be doubled along each axis, you could either have a monitor with twice the resolution, or you could be half the distance away. Most of the time games will let you get closer to objects than their texture resolution looks good for. So 4k textures still give an improvement even on a 1080p monitor.
Texture resolution is chosen on a case-by-case basis: objects the player will usually be very close to get higher resolution textures, while ones that are impossible to approach can use lower resolution.
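(Rough sketch of that relationship, written as my own Python illustration rather than any engine’s actual code: the mip level the GPU ends up sampling grows with distance and shrinks with screen resolution, so halving the distance or doubling the resolution demands twice the texture detail along each axis.)
```python
import math

def chosen_mip_level(texture_size, screen_height, distance, base_distance=1.0):
    # Very rough model of mip selection: which texture resolution actually
    # gets sampled for a surface that fills the screen at `base_distance`.
    # Halving the distance, or doubling the screen resolution, drops the
    # mip level by one, i.e. calls for twice the texels along each axis.
    texels_per_pixel = (texture_size / screen_height) * (distance / base_distance)
    return max(0.0, math.log2(texels_per_pixel))

# A 4K (4096 px) texture on a 1080p screen still has headroom up close:
print(chosen_mip_level(4096, 1080, distance=1.0))    # ~1.9: top mips barely touched
print(chosen_mip_level(4096, 1080, distance=0.25))   # 0.0: full 4K detail gets used
```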
The only costs to including higher resolution textures are artist time and (usually) disk space. Artist time is outsourced to countries with cheap labor (Philippines, Indonesia, etc) and apparently no-one cares about disk space.
- Comment on Thanks to the "you need to buy a new PC for running W11" bullshit, scammers are selling ewaste at full price to inexperienced people 6 days ago:
I got the one on the top (minus storage and ram) from a local university surplus store for $30 a few years ago. Lenovo brand but same form factor.
- Comment on Anon turns on raytracing 2 weeks ago:
It says on that page that SHaRC requires raytracing-capable hardware. I guess they could be modifying it to use their own software raytracing implementation. In any case it’s the exact same math for either hardware or software raytracing; hardware is just a bit faster. Unless you do what Lumen did and use a voxel scene for software raytracing.
- Comment on Anon turns on raytracing 2 weeks ago:
Yeah, that’s just rasterized shadow mapping. It’s very common: a lot of old games use it, as do basically all modern ones. It’s used in pretty much any non-raytraced game with dynamic shadows (I think the only other way is to directly project the geometry, which only a few very old games did, and which can only cast shadows onto single flat surfaces).
The idea is that you render the depth of the scene from the perspective of the light source. Then, for each pixel on the screen, to check whether it’s in shadow, you find its position on that depth texture. If it’s further from the light than whatever the light saw in that direction, it’s in shadow; otherwise it isn’t. This is filtered to make it smoother. The downsides: it can’t support shadows of variable width (i.e. basically every real shadow) without extra hacks that don’t work in all cases; sharp shadows require rendering that depth map at a very high resolution; rendering a whole depth map is expensive and includes pixels that are never seen; and it doesn’t scale down well to low resolutions (like if you wanted 100 very distant shadow-casting lights), etc.
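A minimal sketch of that depth-comparison test, written in Python for readability (the function and parameter names here are mine, not any engine’s API):
```python
import numpy as np

def in_shadow(world_pos, light_view_proj, shadow_depth_map, bias=1e-3):
    # Classic shadow-map test for a single point.
    # world_pos        -- 3D position of the pixel being shaded
    # light_view_proj  -- 4x4 matrix taking world space to the light's clip space
    # shadow_depth_map -- 2D array of depths rendered from the light's point of view

    # Project the point into the light's clip space.
    p = light_view_proj @ np.append(world_pos, 1.0)
    p = p[:3] / p[3]                     # perspective divide -> NDC in [-1, 1]

    # Map NDC x/y to texel coordinates in the depth map.
    h, w = shadow_depth_map.shape
    u = int(np.clip((p[0] * 0.5 + 0.5) * (w - 1), 0, w - 1))
    v = int(np.clip((p[1] * 0.5 + 0.5) * (h - 1), 0, h - 1))

    # Depth of this point as seen from the light, vs. the closest thing the
    # light actually saw in that direction. Farther away => something blocks it.
    this_depth = p[2] * 0.5 + 0.5
    closest_depth = shadow_depth_map[v, u]
    return this_depth - bias > closest_depth
```
Real implementations run this per pixel on the GPU and average several neighboring taps (percentage-closer filtering) to get the smoothing mentioned above.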
Raytraced shadows are actually very elegant, since they operate on every screen pixel (so quality naturally increases as you get closer to any area of interest in the shadow) and they naturally support varying shadow widths, at the cost of noise and maybe some more rays. They still scale expensively with many light sources, but some modified stochastic methods look very good and allow far more shadow-casting lights than would ever have been possible with pure raster.
You don’t notice the lack of shadow casting lights much in games because the artists had to put in a lot of effort and modifications to make sure you wouldn’t.
- Comment on Anon turns on raytracing 2 weeks ago:
I heard the Source 2 editor has (relatively offline, think Blender viewport style) ray tracing as an option, even though no games on it support any sort of real-time RT. Just so artists can estimate what the light bake will look like without actually having to wait for it.
So what people are talking about there is lightmaps: essentially a whole other texture, on top of everything else, that holds diffuse lighting information. It’s ‘baked’ in a lengthy ray tracing process that can take seconds to hours to days, depending on how fast the baking system is and how hard the level is to light. This puts the raytraced lighting information directly into a texture so it can be read in fractions of a millisecond like any other texture. It’s great for performance, but it can’t be quickly previewed, can’t show the influence of moving objects, and technically can’t be applied to any surface whose roughness isn’t full (so it covers most diffuse objects but basically no metallic ones; those usually use light probes and bent normals, and sometimes take lightmap information, although that isn’t technically correct and can produce weird results in some cases).
The usual solution for lighting dynamic objects in a scene with lightmaps is a grid of pre-baked light probes. These give lighting to dynamic objects but don’t receive it from them.
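A rough sketch of what that probe-grid lookup tends to look like: blend the eight probes surrounding the object. This is my own simplified illustration (one RGB irradiance value per probe instead of spherical harmonics), not the Source 2 implementation:
```python
import numpy as np

def sample_probe_grid(probe_grid, grid_origin, cell_size, position):
    # probe_grid  -- array of shape (nx, ny, nz, 3): one baked RGB value per probe
    # grid_origin -- world-space position of probe [0, 0, 0]
    # cell_size   -- spacing between probes along each axis
    # position    -- world-space position of the dynamic object to light

    # Position in "probe space", clamped so the 2x2x2 neighborhood stays in bounds.
    p = (np.asarray(position) - grid_origin) / cell_size
    p = np.clip(p, 0, np.array(probe_grid.shape[:3]) - 1.001)

    i0 = np.floor(p).astype(int)   # lower-corner probe index
    f = p - i0                     # fractional position inside the cell

    # Trilinear blend of the eight surrounding probes.
    result = np.zeros(3)
    for dx in (0, 1):
        for dy in (0, 1):
            for dz in (0, 1):
                w = ((f[0] if dx else 1 - f[0]) *
                     (f[1] if dy else 1 - f[1]) *
                     (f[2] if dz else 1 - f[2]))
                result += w * probe_grid[i0[0] + dx, i0[1] + dy, i0[2] + dz]
    return result
```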
- Comment on Anon turns on raytracing 2 weeks ago:
Still, even if any thread looks like it’s always at 60%, if a load appears and disappears very quickly and gets averaged out on the graph (as it could in an unoptimised or unusual situation) it could still be a factor. I think the only real way to know is to benchmark. You could try underclocking your CPU and see if the performance gets worse, if you really want to know.
- Comment on Anon turns on raytracing 2 weeks ago:
Really? Ambient occlusion used to be the first thing I would turn on. Anyway, 4K textures barely add any cost to the GPU: they don’t use any extra compute, just VRAM, and VRAM is very cheap ($3.36/GB of GDDR6). The only reason consumer cards are limited in VRAM is to prevent them from being used for professional and AI applications. If they had a comparable ratio of VRAM to compute, they would be an insanely better value than workstation cards, and manufacturers don’t want to draw sales away from that very profitable market.
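Back-of-the-envelope numbers, using that $3.36/GB figure and assuming a typical BC7-compressed 4K texture (my own arithmetic, not vendor data):
```python
# Approximate VRAM footprint and cost of one 4K texture.
# Assumptions: 4096x4096 texels, BC7 compression (16 bytes per 4x4 block,
# i.e. 1 byte per texel), full mip chain adding ~33%, GDDR6 at $3.36/GB.
texels = 4096 * 4096
bytes_bc7 = texels * 1
with_mips = bytes_bc7 * 4 // 3
gb = with_mips / 1e9
print(f"{with_mips / 2**20:.0f} MiB per texture, about ${gb * 3.36:.2f} of GDDR6")
# -> roughly 21 MiB, on the order of 7-8 cents' worth of memory per 4K texture
```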
- Comment on Anon turns on raytracing 2 weeks ago:
I haven’t personally played a game that uses more than one dynamic reflection probe at a time. They are pretty expensive, especially if you want them to look high resolution and want the shading in them to look accurate.
- Comment on Anon turns on raytracing 2 weeks ago:
That’s true, but after a few frames RT (especially with nvidia’s ray reconstruction) will usually converge to ‘visually indistinguishable from reference’ while light probes and such will really never converge. I think that’s a pretty significant difference.
- Comment on Anon turns on raytracing 2 weeks ago:
RT was three generations ago, and I don’t think they really vary the number of rays much per environment (and RT itself is an O(log n) problem)
- Comment on Anon turns on raytracing 2 weeks ago:
There are cases where screen space can resolve a scene perfectly. Rare cases. That also happen to break down if the user can interact with the scene in any way.
- Comment on Anon turns on raytracing 2 weeks ago:
Of course, no renderer is really good enough unless it considers wave effects. If my game can’t dynamically simulate the effect of a diffraction grating, it may as well be useless.
(/s if you really need it)
- Comment on Anon turns on raytracing 2 weeks ago:
Unless you consider wireframe graphics. Idk when triangle rasterization first started being used, but it’s conceptually more similar to wireframe graphics than to ray tracing. Also, I don’t really know what you mean by ‘fake it with alpha’.
- Comment on Anon turns on raytracing 2 weeks ago:
I haven’t played The Finals myself, but as of the pre-release version (when I watched a video about it) lighting didn’t update at all without raytracing enabled. It is pretty hard to get any sort of dynamic lighting without raytracing, if not impossible, depending on how you define raytracing. But basically, if they have a dynamic lighting feature that works without ‘raytracing’, they have to create a whole other GI system using world-space probes and maybe even dynamically voxelizing the entire scene. Neither of those is easy on performance, though usually not as bad as normal hardware RT and ReSTIR, and neither is good at reflections or fine detail, which is why games that want to look better than that usually switch to doing it the normal way.
- Comment on Anon turns on raytracing 2 weeks ago:
I feel like if you have the level of a 3070 or above at 1080p, pathtracing, even with the upscaling you need, can be an option. At least based on my experience with Portal RTX.
Personally I have a 3060, but (in the one other game I actually have played on it with raytracing support) I still turned on raytraced shadows in Halo Infinite because I couldn’t really notice a difference in responsiveness. There definitely was one (I have a 144hz monitor) but I just couldn’t notice it.
- Comment on Anon turns on raytracing 2 weeks ago:
Optimization is usually possible, but it is easier said than done. Often sacrifices have to be made, but maybe it is still a better value per frame time. Sometimes there’s more that can be done, sometimes it really is just that hard to light and render that scene.
It’s hard to make any sweeping statements, but I will say that none of that potential optimization is going to happen without actually hiring graphics devs. Which costs money. And you know what corporations like to do when anything they don’t consider important costs money. So that’s probably a factor a lot of the time.
- Comment on Anon turns on raytracing 2 weeks ago:
I disagree; I think a lot of raytraced shaders successfully make the game look better while still leaning into the stylized look. I also think it’s unfair to say the game looks bad originally. It doesn’t look realistic, but it has a consistent and compelling visual style.
Look at the Minecraft update trailers for example. They go in that direction even further, by simplifying all of the textures. Yet even with the perfect offline path tracing, it doesn’t look bad.
- Comment on Anon turns on raytracing 2 weeks ago:
I always loved the graphics of Portal 2 but didn’t really see the appeal of those from Portal 1. I think the “with RTX” version was more on the Portal 2 side, so I was fine with it.
- Comment on Anon turns on raytracing 2 weeks ago:
Look at Tiny Glade; it’s a great example of what raytracing can bring to a stylized game. (They did use their own raytracing pipeline, different from the usual; in their own words, ReSTIR was overkill for what their game needed.) Or like 95% of animated films, including Arcane but excluding Stray.
- Comment on Anon turns on raytracing 2 weeks ago:
It’s not just a time limitation either, though; it also opens up a lot of room for artistic direction and game design.
I don’t think you could possibly make something like Control’s shiny black blocks world look decent without raytraced reflections.
Also, anything with significantly large dynamic geometry usually needs either something like half of the level file size duplicated for every possible state, or some form of raytracing, to work at all. (There are also things like voxel cone tracing that do their own optimized tracing, but they don’t really work in 100% of situations and come with their own visual downsides.)
- Comment on Anon turns on raytracing 2 weeks ago:
I would expect that to be a normal rasterized shadow map unless you can find any sources explicitly saying otherwise, because even 1 ray per pixel in complex triangulated geometry wasn’t really practical in real time until probably at least 2018.
- Comment on A Completely Natural Conversation in the NYC Reddit 2 weeks ago:
I think the average American is making just about enough to get by, and probably would have a hard time, though not an impossible one, affording an extra ~$2k a year. (I can’t find a solid figure for the average household living wage in the US, but from what I’ve seen it’s pretty close to the average household income)
It is a bit weird to define above average wage as rich though. But there is really no definitive class border so I think it’s slightly useless to argue about. You can also define above average as rich while still directing your hate towards the .1%.
Also I don’t really detect any hate there?
- Comment on A Completely Natural Conversation in the NYC Reddit 2 weeks ago:
I feel like generally a good way to summarize it is that em dashes can be used basically anywhere there would be a pause in natural conversation. You pause to include some content, you switch topics, etc. It’s fairly intuitive.
- Comment on Why do low framerates *feel* so much worse on modern video games? 4 weeks ago:
Game design is a big part of this too. Particularly first person or other fine camera control feels very bad when mouse movement is lagging.
I agree with what the other commenters are saying too, if it feels awful at 45 fps your 0.1% low frame rate is probably like 10 fps
- Comment on Every news result on duckduckgo links to MSN 5 weeks ago:
In the eastern US, first time using DDG on this device, it shows normal links (searched “Ukraine drone attack”)
- Comment on Every news result on duckduckgo links to MSN 5 weeks ago:
It is probably the most polished map. I would rather use something open source, but it makes sense why they switched.
- Comment on Every news result on duckduckgo links to MSN 5 weeks ago:
Would be nice if it had a built in calculator and unit converter, that’s a ddg feature I use a lot