Someone mentioned Neural Radiance Caching to me recently, which Nvidia’s been working on for a while. They presented it at an event in 2023 (disclaimer: the full talk is account-gated and I haven’t watched it, but a 6-minute “teaser” is available on YouTube).
Having only skimmed some material about it, I don’t really understand how it works, but it sounds like it could be one of several ways to attack this specific problem?
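For what it’s worth, the rough idea I pieced together from skimming: train a tiny MLP online, as frames render, to map a shading point (position + direction) to outgoing radiance, so the path tracer can terminate paths early with a cheap cache lookup instead of tracing more bounces. Here’s a toy numpy sketch of that loop; this is just my guess at the shape of it, not NVIDIA’s actual implementation, and the fake path tracer and all of the names here are made up:

```python
# Toy sketch of an NRC-style online radiance cache (my rough mental model,
# NOT NVIDIA's code). A small MLP is queried instead of tracing more bounces,
# and a few longer "training" paths refresh it every frame.
import numpy as np

rng = np.random.default_rng(0)

# Tiny 2-layer MLP: 5 inputs (x, y, z, theta, phi) -> 3 outputs (RGB radiance)
W1 = rng.normal(0, 0.3, (5, 32)); b1 = np.zeros(32)
W2 = rng.normal(0, 0.3, (32, 3)); b2 = np.zeros(3)

def cache_query(inp):
    """Evaluate the cache: one cheap MLP inference instead of more ray bounces."""
    h = np.maximum(inp @ W1 + b1, 0.0)   # ReLU hidden layer
    return h @ W2 + b2, h

def fake_path_trace(inp):
    """Stand-in for a real path tracer's radiance estimate (made-up ground truth)."""
    return np.stack([np.sin(inp[:, 0]), np.cos(inp[:, 1]), inp[:, 2] ** 2], axis=1)

lr = 1e-2
for step in range(2000):
    # Each "frame", a small batch of paths is traced further than usual
    # and the results are used to keep the cache fresh online.
    batch = rng.uniform(-1, 1, (64, 5))
    target = fake_path_trace(batch)
    pred, h = cache_query(batch)
    err = pred - target                   # gradient of the L2 loss
    # Manual backprop through the two layers
    gW2 = h.T @ err / len(batch); gb2 = err.mean(axis=0)
    dh = (err @ W2.T) * (h > 0)
    gW1 = batch.T @ dh / len(batch); gb1 = dh.mean(axis=0)
    W2 -= lr * gW2; b2 -= lr * gb2
    W1 -= lr * gW1; b1 -= lr * gb1

test = rng.uniform(-1, 1, (4, 5))
print("cache:", cache_query(test)[0][0], "reference:", fake_path_trace(test)[0])
```

The real thing apparently runs as fused CUDA kernels and retrains every frame, but the query-then-refresh structure is the part I think matters.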
The Nvidia site at least mentions it’s used in RTX Global Illumination, and I heard rumors about Cyberpunk getting it, but I’m unsure whether it has shipped in anything current. I think it came up in a graphics review of some game.
Yeah, I’m also confused about its current status in released games. It looks like a significant enough feature that I’d naively assume the devs would boast about it if it were in a shipped game, so I guess it’s not there yet?