Comment on Am I doing HDR wrong?
Telorand@reddthat.com 11 months ago
HDR, from what I loosely understand, is related to the color gamut (the reds, greens, and blues) the display can produce. The sRGB coverage used on most displays today is based on the BT.709 standard. HDR is the newer DCI-P3 standard, and it covers a wider range of colors.
But that’s why games and systems that don’t support those extra colors won’t give you any extra “oomph” on an HDR display (because the game is only coded to utilize the capabilities of an SDR display).
I recommend this article for further reading: tomshardware.com/…/what-is-hdr-monitor,36585.html
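To put a rough number on “wider range of colors”: here’s a quick back-of-the-envelope sketch of my own (not from the article), comparing the BT.709 and DCI-P3 gamut triangles by their area in the CIE 1931 xy chromaticity diagram, using the published primary coordinates:

```python
# Crude gamut-size comparison in the CIE 1931 xy chromaticity diagram.
# The (x, y) primaries come straight from the BT.709 and DCI-P3 specs.
BT709  = [(0.640, 0.330), (0.300, 0.600), (0.150, 0.060)]  # R, G, B
DCI_P3 = [(0.680, 0.320), (0.265, 0.690), (0.150, 0.060)]  # R, G, B

def xy_area(primaries):
    """Area of the R/G/B triangle via the shoelace formula."""
    (x1, y1), (x2, y2), (x3, y3) = primaries
    return abs(x1 * (y2 - y3) + x2 * (y3 - y1) + x3 * (y1 - y2)) / 2

print(f"BT.709 area:  {xy_area(BT709):.4f}")
print(f"DCI-P3 area:  {xy_area(DCI_P3):.4f}")
print(f"DCI-P3 / BT.709: {xy_area(DCI_P3) / xy_area(BT709):.2f}x")
```

That works out to roughly 1.35x the BT.709 area, and those are the extra colors an SDR-only game simply never asks the display for. (Triangle area in xy is a crude metric, but it’s enough to see the size difference.)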
entropicdrift@lemmy.sdf.org 11 months ago
HDR is actually the BT.2020 color gamut. Films mastered in HDR typically use DCI-P3 because that’s the standard for theaters, but it’s a smaller color gamut than BT.2020, which is what even HDR10 (the most common form of HDR with the lowest specs) supports.
Telorand@reddthat.com 11 months ago
The article I cited says that modern HDR hardware can’t actually reach BT.2020, though that’s the ultimate goal.
Has that changed?
entropicdrift@lemmy.sdf.org 11 months ago
No, it can’t. Most hardware targets DCI-P3 (though some goes beyond it) because that’s what films target in the mastering process, but HDR10 and the other HDR formats (HDR10+, Dolby Vision, etc.) all use the BT.2020 spec on the software side of things.
In other words, the software is ahead of the hardware for now.
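To put a rough number on that gap (a back-of-the-envelope sketch of my own, using the published CIE xy primaries): an HDR10 signal can address the whole BT.2020 triangle, but a DCI-P3-class panel only fills part of it.

```python
# Rough illustration of "software ahead of hardware":
# compare the area a P3-class panel can show against the BT.2020
# container that HDR10 signals can describe (CIE 1931 xy diagram).
DCI_P3 = [(0.680, 0.320), (0.265, 0.690), (0.150, 0.060)]  # R, G, B
BT2020 = [(0.708, 0.292), (0.170, 0.797), (0.131, 0.046)]  # R, G, B

def xy_area(primaries):
    """Area of the R/G/B triangle via the shoelace formula."""
    (x1, y1), (x2, y2), (x3, y3) = primaries
    return abs(x1 * (y2 - y3) + x2 * (y3 - y1) + x3 * (y1 - y2)) / 2

ratio = xy_area(DCI_P3) / xy_area(BT2020)
print(f"DCI-P3 area as a share of BT.2020: {ratio:.0%}")
```

By that crude measure a P3-class display only fills about 72% of the BT.2020 container, so the spec leaves plenty of headroom for panels to grow into.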