To kick off our new weekly blog here on mysterybox.us, we’ve decided to publish five posts back-to-back on the subject of HDR video. This is Part 2: HDR Video Reference Hardware.
In HDR Video Part 1 we explored what HDR video is, and what makes it different from traditional video. Here in Part 2, we’re going to look at what hardware is needed for proper HDR grading (as of October 2016), and how to partially emulate HDR to get a feel for grading in the standard before investing in a full HDR setup.
New Standard, New Hardware
Alright, first, the bad news. Professional grade reference displays for HDR are expensive, and there are only two: the Sony BVM-X300 and the Dolby PRM-4220. Both cover 100% of the DCI-P3 color space, but the BVM-X300 operates in and covers most of BT.2020, has a 4K resolution and a peak brightness of 1000 nits, and uses OLED panels for more detail through the darks. The PRM-4220 is an excellent display, but it’s only 2K in resolution with a 600 nit peak, though its 12 bit panel makes it a better DCI reference.
At this time, I can’t find any DCI projectors advertising HDR capabilities, though I think that a bright enough laser projector with the right LUT could emulate one in a small environment - essentially using the LUT trick I’m going to describe below, with a projector that’s 10x brighter than it should be for the reference environment. That doesn’t mean they don’t exist; it just means you’ll need to talk to the manufacturer directly. I haven’t tested this, though, so don’t quote me on it.
There’s at least one reference display that claims to be HDR compatible but really isn’t - the Canon DP-V2410. Frankly, the display is gorgeous and comparable to the Sony for color rendering and detail level, but its max brightness is only 300 nits, and its HDR mode downscales the SMPTE ST.2084 0.0001 - 1000 nit range into the 0.01 - 300 nit range. This leaves the overall image rather lackluster and less impactful, though you could use it to grade in a pinch, since its response curve is right. But I wouldn’t, primarily because of MaxFALL, which I’ll cover extensively in Parts 4 and 5.
At Mystery Box we decided to go with the Sony BVM-X300 for our HDR grading. I can’t praise the look of this display enough, though I do have my gripes (I mean, who doesn’t?), but I’ll save that review for another time.
HDR Video on Consumer Displays
The most affordable option for grading in HDR is to use an HDR television. The Vizio Reference series has nice color with a 300 nit peak (in HDR mode), while LG’s 2016 OLED HDR displays have phenomenal color, with max brightness approaching 1000 nits.
The catch is, of course, that there is still more variation in the color of the display than in a reference display, so unless you know for certain that you’re going to be exhibiting on that specific display, be cautious when using them to grade. They also lack SDI inputs, but that’s solvable.
DaVinci Resolve Studio version 12.5+ has an option to inject flags for HDR and BT.2020 into the HDMI output of your DeckLink or UltraStudio hardware. To grade in HDR using a consumer HDR television with HDMI input, simply hook up the display over HDMI, toggle the option in your DaVinci settings and the display will automatically switch into HDR mode:
If you’re not using DaVinci Resolve Studio 12.5+, or if for whatever reason you have to route SDI out, you can inject the right metadata into the HDMI stream once you’ve converted from SDI to HDMI. What you’ll need is an Integral by HD Fury. This box, which is annoyingly only configurable under Windows, will add the right metadata into the HDMI connection between host and device, allowing you to flag any HDMI stream as BT.2020 and HDR.
BE CAREFUL if you’re using the Integral, though. It can be tempting to use the HDMI output of your computer to just patch your desktop into HDR. This is a bad idea. Any interface lines will also be translated into HDR, which will limit the display’s overall brightness (because you can’t switch your desktop into HDR mode), and any static elements risk burn-in. Most HDR displays use OLED panels, and OLEDs are susceptible to burn-in!
If you are already using SDI outputs for your I/O, and want to switch to the BVM-X300 or the PRM-4220, you shouldn’t NEED to upgrade your I/O card or box to drive HDR - 10-bit 4:2:2 works for grading HDR. You might want to upgrade, though, if you want higher output resolutions (4K vs 2K/1080), higher frame rates at those resolutions (50/60p), or RGB/4:4:4 instead of 4:2:2 chroma subsampling.
Everything else should work with your existing color correction hardware.
Emulating HDR Video
Okay, so if you’re not ready to spring for the new reference hardware, but want to emulate HDR just to get a feel for how it works, here’s a trick you can do using a standard broadcast display, or a display like the HP Dreamcolor Z27x (which I used when doing my first tests) to partially emulate HDR.
Use a reference display with native BT.2020 color support, if you can. If you’re using Rec 709 but still want to get a feel for grading in BT.2020, there’s a fix for that using LUTs, but it’s not elegant. You can get a feel for the HDR curve in a Rec 709 color space, but you won’t get a feel for how the primaries behave slightly differently, or how saturation works in BT.2020.**
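If you’re curious just how much bigger BT.2020 is than Rec 709, the primary chromaticities published in the two ITU-R recommendations let you put a rough number on it. Here’s a quick sketch (the area ratio in CIE xy is only a crude proxy for perceptual gamut size, but it makes the point):

```python
# Chromaticity coordinates (CIE 1931 xy) of the R, G, B primaries,
# as published in ITU-R BT.709 and BT.2020.
REC709 = [(0.640, 0.330), (0.300, 0.600), (0.150, 0.060)]
BT2020 = [(0.708, 0.292), (0.170, 0.797), (0.131, 0.046)]

def gamut_area(primaries):
    """Area of the triangle the three primaries span (shoelace formula)."""
    (x1, y1), (x2, y2), (x3, y3) = primaries
    return abs(x1 * (y2 - y3) + x2 * (y3 - y1) + x3 * (y1 - y2)) / 2

ratio = gamut_area(BT2020) / gamut_area(REC709)
print(f"BT.2020 spans roughly {ratio:.1f}x the xy area of Rec 709")
```

That works out to just under 2x the area, which is why a Rec 709 panel simply can’t show you how saturated BT.2020 material can get.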
In addition, if possible, try to use a reference display with a 10 bit panel. There’s no cheat for this one: you either have it or you don’t. 8 bits will give you an idea of what you’re looking at, but it won’t be as clear as it could be. In many cases it won’t make a difference; you’ll just lose the ability to see certain fine details.
Now, calibrate the display and your environment to emulate HDR. Turn the maximum brightness to full (on the Dreamcolor Z27x, it peaks at 250 nits; your display may be different). Turn off all ambient lighting (as pitch black as possible). Then turn down the brightness of the host interface display to the lowest setting at which it’s still usable. Do the same for any video scopes or other external monitoring hardware hooked up to the output signal.
This should make your reference display the brightest thing in the room, by a factor of 2 to 4x. This is important. While the display will still lack ‘oomph’, at the very least it’ll dominate your perception of brightness. That’s key to creating the illusion of the HDR effect; without it, your screen will just look muted and dull.
At this point, what we’ve done by adjusting the ambient and display brightness is emulate the greater brightness range of HDR without using a display that pushes into the HDR range. Next, we need to adjust the display’s response so that it matches the HDR curve we want to emulate. Essentially, we need to replace the display’s native gamma curve with either the PQ or HLG curve.
This is actually pretty easy to do in DaVinci Resolve Studio - DaVinci has a set of 3D LUTs you can attach to your output that will automatically do it for you. You’ll find them written as “HDR <value in nits> nits to Gamma <target display gamma>” (ex. HDR 1000 nits to Gamma 2.4) for the SMPTE 2084 / PQ curve, and “HLG to Gamma <target display gamma>” (ex. HLG to Gamma 2.2) for the Hybrid Log Gamma curve.
What these LUTs do, essentially, is add a 1/gamma (ex. 1/2.4) contrast curve to the output signal, combined with the selected HDR contrast curve, i.e., the one you want to see. The gamma reciprocal adjustment combines with the display’s native or selected gamma to linearize the overall signal, as the two curves cancel each other out. The only contrast curve you’re left with, then, is the HDR contrast curve you’ve added to the signal - the HDR curve translated into your display’s native or adapted luminance range.**
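To make that math concrete, here’s a minimal numeric sketch of what one of these LUTs computes, assuming a gamma 2.4 display and a 1000 nit PQ signal. The constants come from the published SMPTE ST 2084 spec; the function names are mine, not DaVinci’s:

```python
# SMPTE ST 2084 (PQ) constants, from the published spec.
M1, M2 = 0.1593017578125, 78.84375
C1, C2, C3 = 0.8359375, 18.8515625, 18.6875

def pq_eotf(v):
    """PQ-coded signal (0-1) -> absolute luminance in nits."""
    vp = v ** (1 / M2)
    num = max(vp - C1, 0.0)
    den = C2 - C3 * vp
    return 10000.0 * (num / den) ** (1 / M1)

def pq_to_gamma(v, peak_nits=1000.0, display_gamma=2.4):
    """Re-encode a PQ-coded value for a gamma display, in the spirit of
    the 'HDR 1000 nits to Gamma 2.4' LUT: decode PQ to linear light,
    scale the 0..peak_nits range into 0..1, then apply the reciprocal
    gamma so the display's own gamma cancels it out downstream."""
    linear = min(pq_eotf(v) / peak_nits, 1.0)
    return linear ** (1 / display_gamma)
```

When the display then applies its native gamma 2.4, the reciprocal curve cancels and what’s left on screen is the PQ response, squeezed into the display’s own luminance range.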
Using one of these LUTs on your monitor or output will allow your display to respond as if it were operating natively with the HDR curve, though you'll notice that your display is only showing the first 100 nits of the HDR curve. We'll fix that next.
The final step is to calibrate your display’s brightness and contrast. I add a timeline node and scale the gain and gamma adjustments to bring the full HDR range back into the display's native signal range. As for adjusting the contrast, though, there’s not much I can say about how to do that, other than to use a reference image or two graded in the target space to calibrate the display until it ‘looks right’. Here are a couple that I graded in SMPTE 2084 that you can use for calibration:
All of this LUT business and brightness scaling, by the way, is exactly what the Canon DP-V2410 does - it just does it internally with a mode switch instead of requiring manual setup. Don’t get me wrong: in every other respect, the DP-V2410 is an amazing display, but in HDR mode it’s equivalent to this setup for HDR emulation, rather than true HDR performance.*
Emulated HDR vs. True HDR
So how does an emulated HDR display compare to a true HDR reference display? Well, ‘poorly’ is the short answer. It’s not terrible, but emulated HDR lacks the power of true HDR - the ability to grade with the lights on and see how your footage holds up through the full punch of the whites. With an 8 bit panel you’re also going to see stepping while grading in emulated HDR, because most of the region you’ll be working in ends up compressed into 50 or so code values.
This compression in the darks means you won’t get a feel for just how deep SMPTE 2084 can go while still preserving details - you can grade whole shots with full detail in the darks and a few hundred levels of contrast that land between codes 4 and 14 (full range) on a standard 8 bit display (especially an LED- or CFL-backlit LCD).
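To see where numbers like that come from, you can run dark luminance values through the ST 2084 inverse EOTF and quantize to full-range 8 bit. This is just a sketch - it assumes the PQ code values reach the panel without further scaling - but it shows how tightly the darks pile up:

```python
# SMPTE ST 2084 (PQ) constants, from the published spec.
M1, M2 = 0.1593017578125, 78.84375
C1, C2, C3 = 0.8359375, 18.8515625, 18.6875

def pq_inverse_eotf(nits):
    """Absolute luminance in nits -> PQ-coded signal (0-1)."""
    y = (nits / 10000.0) ** M1
    return ((C1 + C2 * y) / (1 + C3 * y)) ** M2

def code8(nits):
    """Full-range 8 bit code value for a given luminance."""
    return round(pq_inverse_eotf(nits) * 255)

# Everything from ~0.005 to ~0.075 nits - a deep-shadow range PQ can
# still resolve - collapses into roughly a dozen 8 bit codes.
print(code8(0.005), code8(0.075))
```

A 10 bit panel gives you four times as many codes across that same stretch, which is why the deep shadow detail survives there and steps on an 8 bit one.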
You’ll also be tempted in this mode to grade ‘hot’ - that is, to push more into the brights of the image - since you don’t have the limits on frame average light levels that all true HDR displays impose. That’s not necessarily a problem, but you’ll run into trouble if you try to use the footage elsewhere. You also miss the strong psychological response that the actual dark and light levels of a true HDR range give you.
So why emulate, then? Well, right now, HDR reference hardware is expensive. And if you want to practice grading and mastering in HDR without having to invest in the hardware, emulation is a fantastic place to start. You’ll get to see how the mids and highs roll into the whites in SMPTE 2084, and develop tricks to make your grading more natural when you make the switch to a proper HDR display. You may even be able to grade using emulated HDR, so long as you have a proper HDR television for internal QC - assuming your mastering of the HDR file is right, you can check the result on the television and make sure it at the very least looks good there, contrast and curve wise, before sending it out to a client.
Of course, mastering HDR video is a problem in and of itself, but I’m saving it for last, in Part 5 of our series. First, though, we’re going to look at the new terminology introduced with HDR video, because even if you’ve been working with video for decades, most of this is likely to be new.
* The day I went to post this I found Canon had updated their website to include the Canon DP-V2420, which they claim supports full HDR in both the ST.2084 and HLG specifications and is Dolby Vision qualified; I haven't had time to look into these claims.
** If you’re using the LUT on your output to emulate the HDR curve, but only have a Rec. 709 display and want to get a feel for BT.2020, you may consider using a BT.2020 to Rec. 709 LUT and stacking it with the gamma compensating LUT. In DaVinci you can do this by adding one LUT to the output, and a second LUT for the monitor, or by attaching one of the LUTs to a global node for a timeline. As a last resort, you can attach as many LUTs as you want to individual grades. You should be able to do something similar in pretty much all other color grading or mastering software, like Scratch or Nuke.