I have never used an HDR display before so I’m not sure how it’s supposed to look.

I have been playing Spider-Man both with and without HDR, and unless I'm staring right into the sun there is literally no difference. I've always heard people talk about HDR as something incredible, but I'm honestly disappointed.

I also played Tetris Effect: Connected, and HDR seemed to just make all the menus darker; the rest looked the same.

Have I done something wrong or is this how it is supposed to be?

  • CralderOP
    5 · 1 year ago

    Thanks for the input. I got confused when people said Tetris Effect looked “sooo much better” with HDR and I wasn’t seeing any difference at all.

    • @[email protected]
      7 · 1 year ago

      HDR, from what I loosely understand, is related to the color gamut (the range of reds, greens, and blues) the display can produce. The sRGB gamut used on most displays today matches the BT.709 standard. HDR uses the newer DCI-P3 standard, which covers a wider range of colors.

      But that’s why games and systems that don’t support those extra colors won’t give you any extra “oomph” on an HDR display: the content is only coded to use the capabilities of an SDR display.

      I recommend this article for further reading: https://tomshardware.com/news/what-is-hdr-monitor,36585.html

      • @[email protected]
        4 · 1 year ago

        HDR actually uses the BT.2020 color gamut. Films mastered in HDR typically use DCI-P3 because that’s the standard for theaters, but it’s a smaller color gamut than BT.2020, which is what even HDR10 (the most common form of HDR, with the lowest specs) supports.

        • @[email protected]
          1 · 1 year ago

          The article linked above says that modern HDR hardware can’t actually reach BT.2020, though that’s the ultimate goal.

          Has that changed?

          • @[email protected]
            4 · 1 year ago

            No, it can’t. Most hardware is targeting DCI-P3 (though some goes beyond it) because that’s what films are targeting in the mastering process, but HDR10 and the other HDR formats (HDR10+, Dolby Vision, etc.) all use the BT.2020 spec on the software side of things.

            In other words, the software is ahead of the hardware for now.
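To put rough numbers on the gamuts discussed in this thread, here's a quick sketch. The chromaticity coordinates below are the published CIE 1931 xy primaries for each standard; note that triangle area in xy space is only a crude proxy for "how many colors" a gamut covers, but it does show the BT.709 < DCI-P3 < BT.2020 ordering the commenters describe:

```python
# Compare the CIE 1931 xy footprints of the three gamuts from the thread.

def triangle_area(p1, p2, p3):
    """Shoelace formula for the area of a triangle given (x, y) vertices."""
    (x1, y1), (x2, y2), (x3, y3) = p1, p2, p3
    return abs(x1 * (y2 - y3) + x2 * (y3 - y1) + x3 * (y1 - y2)) / 2

# CIE 1931 xy chromaticities of each standard's red, green, blue primaries.
GAMUTS = {
    "BT.709 (sRGB)": [(0.640, 0.330), (0.300, 0.600), (0.150, 0.060)],
    "DCI-P3":        [(0.680, 0.320), (0.265, 0.690), (0.150, 0.060)],
    "BT.2020":       [(0.708, 0.292), (0.170, 0.797), (0.131, 0.046)],
}

areas = {name: triangle_area(*prims) for name, prims in GAMUTS.items()}
base = areas["BT.709 (sRGB)"]
for name, area in areas.items():
    print(f"{name:14s} area = {area:.4f} ({area / base:.2f}x BT.709)")
```

Running this, the BT.2020 triangle comes out to nearly twice the area of BT.709's, with DCI-P3 in between — which matches the "software is ahead of the hardware" point: content can be encoded in BT.2020, even though most panels only reach somewhere around DCI-P3.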