High Dynamic Range (HDR) refers to scenes rendered with brighter highlights, greater shadow detail and a wider range of color, for a better-looking image. For gaming, HDR means more than just a prettier picture: the better you can see what’s lurking in the bright and dark areas, the more likely you are to avoid danger and spot clues.
It used to require that games explicitly support HDR as well, but the introduction of Auto HDR in the Xbox Series X/S, and a forthcoming version in Windows, changes that: The operating systems can automatically expand the brightness and color ranges of non-HDR games. It’s not the same as having a game that was rendered to use the expanded ranges, but it can give the image a bump that makes it look better than it otherwise would.
What is HDR and why do I want it?
To deliver its magic, HDR combines an extended brightness range (well beyond the 256 levels displayable by a typical monitor), more colors than are covered by the least-common-denominator sRGB gamut, profiles that optimally map the color and brightness ranges of content to the capabilities of the display, a decoder in the monitor that understands the mapping, and all the related technologies that tie the pieces together, not the least of which is the operating system.
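To put those level counts in perspective, here’s a quick back-of-the-envelope calculation (in Python, purely illustrative) showing how per-channel bit depth translates into tonal levels and total displayable colors:

```python
# Tonal levels per channel and total RGB colors by bit depth.
# 8-bit panels give the familiar 256 levels; the 10-bit pipelines
# used for HDR give 1,024 levels and "over a billion" colors.
for bits in (8, 10):
    levels = 2 ** bits           # levels per color channel
    total = levels ** 3          # combinations across R, G and B
    print(f"{bits}-bit: {levels} levels/channel, {total:,} colors")
# 8-bit: 256 levels/channel, 16,777,216 colors
# 10-bit: 1024 levels/channel, 1,073,741,824 colors
```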
For a lot of games, HDR doesn’t matter, because they don’t have many areas of high brightness or deep shadow, or don’t take advantage of the bigger tonal range in any meaningful way. But for games that support it, you’ll probably get better visuals in AAA games, more scares in horror games, fewer ambushes out of the shadows in FPS games and so on.
The real question isn’t whether or not you want it. The question is how much are you willing to pay for it — not just for a display with “HDR” in its specs, but for a monitor that will deliver the image quality that we associate with HDR.
Will an HDR gaming monitor work with the Xbox Series X/S and PS5?
Yup! There’s even a publicly available set of best practices for HDR game development and monitor design, developed by Sony, Microsoft and a host of other relevant companies under the umbrella of the HDR Gaming Interest Group (HGIG), for their consoles and Windows. But HGIG isn’t a standards body, nor does it certify products, so you still need to pay attention to specs.
What do I look for in an HDR gaming monitor?
The term “HDR” has become pretty diluted thanks to marketers stretching the definition to encompass displays in the most popular price range (less than $400). So to a certain extent you have to pay attention to multiple specs to figure out whether a monitor is capable of a real HDR experience.
The VESA display industry association created DisplayHDR, a set of standards and criteria for conveying quality levels of HDR in consumer monitors, which is pretty reliable as one method of crossing some choices off your list. (DisplayHDR 400 is laughable for HDR, but if you’re just looking for a bright SDR monitor it’s a good bet.)
But many manufacturers have taken to referring to their monitors as, for example, “HDR 600,” which confuses things. It’s never clear whether they’re using it as shorthand for the equivalent DisplayHDR level and simply don’t want to pay for the logo certification program, or whether they’re using it as misleading shorthand for the ability to hit the peak brightness of a particular tier. It’s possible for them to run through the certification tests themselves for internal verification. (You can, too, with the DisplayHDR Test utility available in the Microsoft Store.)
That’s why it’s important to understand the important — and not-so-important — HDR-related specs.
From a spec standpoint, HDR10 support means little to nothing. Adherence to the HDR10 standard is the most basic level a monitor has to hit (and the cheapest to include) in order to call itself “HDR.” It simply means the monitor can support the algorithms needed by an operating system to display HDR content correctly: brightness mapping and the 10-bit calculations that mapping needs (via the SMPTE ST.2084 EOTF, the transfer function that replaces traditional gamma), understanding how to work with the compressed color sampling in video, and the capability of handling and mapping colors notated within the Rec 2020 color space.
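To illustrate what that ST.2084 mapping actually does, here’s a minimal sketch of the PQ EOTF, which converts a normalized 10-bit signal into an absolute luminance in nits. The constants come from the published standard; a real display pipeline would add tone mapping for panels that can’t reach the format’s 10,000-nit ceiling.

```python
# Minimal sketch of the SMPTE ST.2084 (PQ) EOTF: maps a normalized
# signal value (0.0-1.0) to absolute luminance in nits.
# Constants are taken directly from the ST.2084 specification.
M1 = 2610 / 16384        # 0.1593017578125
M2 = 2523 / 4096 * 128   # 78.84375
C1 = 3424 / 4096         # 0.8359375
C2 = 2413 / 4096 * 32    # 18.8515625
C3 = 2392 / 4096 * 32    # 18.6875

def pq_eotf(signal: float) -> float:
    """Return display luminance in nits for a normalized PQ signal."""
    e = signal ** (1 / M2)
    numerator = max(e - C1, 0.0)
    denominator = C2 - C3 * e
    return (numerator / denominator) ** (1 / M1) * 10000.0

# Full-scale signal hits the 10,000-nit PQ ceiling; a 600-nit monitor
# must tone-map everything above its own peak brightness.
print(round(pq_eotf(1.0)))  # 10000
print(round(pq_eotf(0.0)))  # 0
```

Note how steep the curve is: a half-scale signal maps to only about 92 nits, which is why PQ can devote so much of its 10-bit precision to shadows and midtones.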
Color and brightness
Brightness is a measure of how much light the screen can emit, usually expressed in nits (candelas per square meter). Most desktop monitors run 250 to 350 nits in SDR (standard dynamic range), but HDR monitors also specify a peak brightness, which they can hit for short periods in HDR mode. Screens that support HDR should start at 400 nits peak, at the very least, and currently run as high as 1,600. (Laptop screens are different, because they need to be viewable in many types of lighting, such as direct sunlight, and therefore benefit from higher brightness levels even without HDR support.)
For gaming and monitors in general, the color space you’re most interested in is P3, which comes in two slightly different flavors: DCI-P3 and D65 P3. In practice, they differ only by their white points; DCI is slightly warmer (6300K instead of 6500K) and was conceived for editing film. However, I frequently see DCI-P3 listed in specs where they really mean D65. That’s fine, because D65 P3, which was spearheaded by Apple for its own displays, is the one we care about for gaming monitors. And their gamuts are identical, so unless I’m specifically differentiating between the two, I refer to it simply as P3.
You’ll also commonly see gamuts listed as a percentage of Adobe RGB, which is fine as well. Adobe RGB and P3 overlap significantly; Adobe RGB is shifted a bit toward the green/cyan end of the spectrum, because printers use cyan ink, while P3 extends further out on the green/yellows. And that, in a nutshell, is why when specs say “over a billion colors” it’s meaningless. Which billion matters.
Any monitor you consider for decent HDR viewing should definitely cover much more than 100% sRGB, a space developed by HP and Microsoft in 1996 to provide least-common-denominator color matching in Windows, and which is roughly equivalent to the color space of the Rec 709 SDR video standard.
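To make “which billion matters” concrete, you can compare the sizes of the gamut triangles formed by each standard’s published CIE 1931 xy primaries. This is only a rough sketch: xy area isn’t perceptually uniform, so treat the ratio as indicative rather than exact.

```python
# Compare gamut sizes using each standard's published CIE 1931 xy
# chromaticity coordinates for its red, green and blue primaries.
# Triangle area via the shoelace formula. Caveat: xy space is not
# perceptually uniform, so these ratios are only rough indications.
def gamut_area(primaries):
    (x1, y1), (x2, y2), (x3, y3) = primaries
    return abs(x1 * (y2 - y3) + x2 * (y3 - y1) + x3 * (y1 - y2)) / 2

SRGB = [(0.640, 0.330), (0.300, 0.600), (0.150, 0.060)]  # red, green, blue
P3   = [(0.680, 0.320), (0.265, 0.690), (0.150, 0.060)]

srgb, p3 = gamut_area(SRGB), gamut_area(P3)
print(f"P3 covers {p3 / srgb:.0%} of the area of sRGB")  # about 136%
```

So a monitor that covers 95% of P3 reaches noticeably deeper reds and greens than one that merely covers 100% of sRGB, even though both can claim huge color counts.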
Based on my experience, a decent HDR monitor should be able to hit a peak brightness of between 600 and 1,000 nits and cover at least 95% of the P3 or Adobe RGB color gamut. (When Windows looks awful in HDR mode, it’s the result of limited brightness, an sRGB-only color gamut, poorly designed aspects of the operating system and the math it uses to map SDR content into HDR.)
All screen technologies except OLED shine a light through various layers of color filters and liquid crystal to produce an image. Panels with backlights may display some artifacts, notably the appearance of light around the edges of a dark screen, known as backlight bleed (although the backlight is often actually edge lighting). The newest backlight technology, Mini LED, lets a monitor use local dimming like a TV to produce high brightness with less bleed and fewer halos where bright areas sit next to dark ones; the brighter the display, the more noticeable the unwanted brightness. Mini LED is used by the latest crop of HDR displays with brightness of 1,000 nits or more.
And as with TVs, the more local-dimming zones, the better.
As brightness rises, so does price, which is why 400-nit displays are so appealing to both buyers and sellers. Tossing in gaming needs like a high refresh rate can boost the price even more.
Source from www.cnet.com