• @computergeek125
    9 • 1 year ago

    Call me old-fashioned, but I’d rather have high native quality available for when it’s relevant. If I’m watching gameplay footage (as one example), I want to judge the actual render quality.

    With more and more video games already using frame generation and upscaling within the engine, at what point does the data loss become too much? Depending on upscaling again during playback means that your viewing experience might depend on which vendor’s hardware you have - for example, an Nvidia desktop may upscale differently from an Intel laptop with no dGPU, which may differ again from an Android phone running at 15% battery.

    That would become even more of a problem if you’re evaluating how different upscaling technologies look in a given video game, perhaps with an intent to buy different hardware. I check in on how different hardware encoders keep up with each other using a similar method (see the sketch below): encode the same source with each one and compare the output quality. That’s a problem native high-resolution video doesn’t have.
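
    (As a rough illustration of that kind of check, here’s a minimal sketch in Python that drives ffmpeg to encode the same clip with several encoders and score each result with VMAF. It assumes an ffmpeg build with NVENC, Quick Sync, and libvmaf support; the file names and bitrate are placeholders, not a benchmark recipe.)

    ```python
    import subprocess

    # Minimal sketch, not a benchmark harness. Assumes an ffmpeg build with
    # NVENC, Quick Sync, and libvmaf support; file names are placeholders.
    SOURCE = "gameplay_master.mkv"  # high-quality capture used as the reference

    ENCODERS = {
        "nvenc": "h264_nvenc",  # NVIDIA hardware encoder
        "qsv":   "h264_qsv",    # Intel Quick Sync hardware encoder
        "x264":  "libx264",     # software encoder as a baseline
    }

    for name, codec in ENCODERS.items():
        encoded = f"encoded_{name}.mp4"

        # Encode the same source at the same target bitrate with each encoder.
        subprocess.run(
            ["ffmpeg", "-y", "-i", SOURCE, "-c:v", codec, "-b:v", "8M", encoded],
            check=True,
        )

        # Score the encode against the reference with VMAF (0-100, higher is
        # better). libvmaf takes the distorted stream first, the reference second.
        subprocess.run(
            ["ffmpeg", "-i", encoded, "-i", SOURCE,
             "-lavfi", "libvmaf", "-f", "null", "-"],
            check=True,
        )
    ```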

    I recognize this is one example, and that there’s content where quality isn’t paramount and frame gen and upscaling are fine - but I’m not ready to throw out an entire sector of media for this kind of gain on some of it. Not to mention that not everyone will have access to the kind of hardware required to cleanly upscale, and adding upscaling to everything (for everyone who isn’t using their PS5/Xbox/PC as a set-top media player) is just going to drive up the cost of already very expensive consumer electronics and add yet another point of failure to a TV that didn’t need to be smart to begin with.

    • bufalo1973
      2 • 1 year ago

      Quality depends on the content. If the video is just someone talking, 4K is overkill. And not every gameplay recording has to be kept forever - only the good ones. Even those can be downscaled after some time if nobody watches them.
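
      (As a concrete illustration of that last point, here’s a minimal sketch that re-encodes a rarely watched 4K upload down to 1080p. It assumes ffmpeg is installed; the file names and CRF value are placeholders, not a recommended archival policy.)

      ```python
      import subprocess

      # Minimal sketch: shrink a rarely watched 4K upload to 1080p.
      # File names are placeholders; assumes ffmpeg is on the PATH.
      subprocess.run(
          ["ffmpeg", "-i", "upload_4k.mp4",
           "-vf", "scale=-2:1080",           # keep aspect ratio, force even width
           "-c:v", "libx264", "-crf", "23",  # reasonable quality/size trade-off
           "-c:a", "copy",                   # leave the audio stream untouched
           "downscaled_1080p.mp4"],
          check=True,
      )
      ```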