Can someone explain how AI can generate a frame faster than the conventional method?
It's image processing with statistics rather than traditional rendering; it's a completely separate process. NVIDIA GPUs (and the upcoming AMD ones) also have hardware built into the chip specifically for this.
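Roughly, the trick is reuse: instead of running the full render pipeline (geometry, shading, draw calls) again, the hardware warps pixels from an already-rendered frame along motion vectors and lets a trained network clean up the result. Here's a toy NumPy sketch of just the warping idea, assuming you already have a rendered frame and per-pixel motion vectors (the `warp_midpoint` function and its inputs are hypothetical illustrations, not NVIDIA's actual DLSS pipeline, which uses learned models and dedicated optical-flow hardware):

```python
import numpy as np

def warp_midpoint(frame, motion, t=0.5):
    """Synthesize an in-between frame by moving each pixel a
    fraction t along its motion vector. This is only the
    conceptual skeleton of interpolation-style frame generation,
    not a real implementation.
    """
    h, w, _ = frame.shape
    ys, xs = np.mgrid[0:h, 0:w]
    # Gather each output pixel from where it "came from" in the
    # source frame, per the engine-supplied motion vectors.
    src_x = np.clip((xs - t * motion[..., 0]).astype(int), 0, w - 1)
    src_y = np.clip((ys - t * motion[..., 1]).astype(int), 0, h - 1)
    return frame[src_y, src_x]

# Generating a frame this way is a single gather over existing
# pixels: O(pixels), with no geometry, shading, or draw calls,
# which is why it can be cheaper than rendering conventionally.
frame = np.random.rand(1080, 1920, 3).astype(np.float32)
motion = np.random.uniform(-4, 4, (1080, 1920, 2)).astype(np.float32)
mid = warp_midpoint(frame, motion)
```

Games already produce motion vectors for TAA, so the extra cost is mostly the warp and cleanup pass, which is what the dedicated silicon accelerates.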
(that’s part of the grift)
Which part? I mean even if it isn’t generating the frames well, it’s still doing the work. So that capability is there. What’s the grift?
That it’s reliable. The key thing they’re selling is that devs don’t need to optimize their engines as much, a pitch obscured under a lot of other value-adds.
I’d go further than this and say part of our problem is that optimizing code isn’t a focus anymore. Apps that merely interface with web APIs sometimes weigh in at over 90 MB. That’s embarrassing.
The idea that an AI can step in as savior for poor coding practices is really a bandage stuck over the root cause.