Enriched pictures with more immersive and vivid colours, extended contrast, deeper shadows and brighter highlights. Sound pretty good? Well, that's what High Dynamic Range can bring.
It’s the biggest thing in our industry at the moment and is soon to revolutionise all parts of production. You really need to know about it in full.
Here, Shift 4’s Technical Director Colin Coomber discusses exactly what defines HDR, how it impacts on production processes and what it means for your equipment hire decisions.
The nature of HDR and the full explanation of it here means things are going to get technical. We've broken it down into sections with handy headers; if you've had your fill, got the gist and want to move on, or it's all just too much, scroll on to the next header.
High Dynamic Range concerns both image capture and image display and, to fully understand it, a grip on the specifics of how cameras and displays work together is required.
In the simplest terms, a digital camera’s dynamic range is the total range from the darkest shadow to the brightest highlight that it can resolve in a single shot. It’s measured in stops; the more stops of light that a camera’s sensor can see, the higher the dynamic range. (Modern cameras capture a dynamic range of 14-16 stops when shooting in Log).
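For the numerically minded, that stops measurement is just a base-2 logarithm of the brightness ratio between the brightest and darkest things the sensor can resolve. Here's a quick Python sketch (the NIT figures are purely illustrative):

```python
import math

def dynamic_range_stops(brightest_nits: float, darkest_nits: float) -> float:
    """Dynamic range in stops: each stop is a doubling of light, so the
    stop count is the base-2 log of the brightness ratio."""
    return math.log2(brightest_nits / darkest_nits)

# A scene spanning 0.005 NITs in the shadows to 1,000 NITs in the highlights:
print(round(dynamic_range_stops(1000, 0.005), 1))  # 17.6 stops
```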
Monitors follow recommendations that set out exactly how they should display colours and brightness. Rec709 (short for ITU-R Recommendation BT.709, the HDTV standard), otherwise known as SDR or Standard Dynamic Range, came into being 20 years ago and recommended that monitors should display 6 stops of linear, uncompressed dynamic range with a peak brightness of 100 NITs. This was based on the lowest performing technology at the time: CRT (cathode ray tube) screens. Since then technology has moved on significantly: monitors can now display a wider colour gamut and greater luminance, and cameras are capable of capturing more too – which is where HDR comes in.
High Dynamic Range, then, is a new way of capturing and displaying images, with contrast, colour and luminance capable of producing an overall highlight brightness level of more than 1,000 NITs. This blows SDR, which by comparison has a maximum brightness of 100 NITs, out of the water. Rather than the outdated Rec709 colour space, HDR, at its maximum, uses Rec2020.
Got it?! Stick with us…
This extended information within an image brings picture quality close to that seen in real life: the concept of HDR is that a captured image has the same level of brightness, colour, clarity and sharpness as perceived by the human eye.
All of this, of course, is a creative choice and greater luminance may not be the desired look, but in providing a broader spectrum HDR delivers more choice. Cinematographers are excited about it because in general it produces crisper, clearer pictures and a more immersive feel for the viewer.
HDR is becoming more commonplace: cameras have been able to capture a high dynamic range for a long time and monitor technology has recently taken great leaps towards HDR too.
There are currently multiple standards in operation for HDR: HDR10, Dolby Vision HDR, HLG, HDR10+ and the newly announced Advanced HDR by Technicolor. You could call it a war between directly competing technologies.
HDR10 is the most widely used: it can be handled by all HDR TVs and has been adopted by numerous manufacturers, so in essence it's the industry standard. It's an open format, meaning it's free to use. It supports up to 4,000 NITs peak brightness with a 1,000 NIT peak brightness target, 10 bit colour depth, and is capable of displaying everything in the Rec2020 colour space.
Dolby Vision HDR was originally developed by Dolby for cinemas but has been adapted for the home. It requires the payment of a license fee. It's a more premium standard that isn't as well adopted but offers substantially better quality. It supports up to 10,000 NITs peak brightness with a current 4,000 NIT peak brightness target, 12 bit colour depth, and is capable of displaying everything in the Rec2020 colour space. All Dolby Vision HDR TVs support both Dolby Vision HDR and HDR10, whereas HDR10 TVs don't support Dolby Vision HDR.
Specifically, Dolby Vision HDR adds dynamic metadata to the core HDR image data. It carries scene-by-scene instructions that Dolby Vision HDR capable monitors can use to make sure the content is displayed as accurately as possible according to their own particular NIT levels. This means the best picture quality will be displayed whatever a monitor’s NIT value, for example a 4,000 NIT picture for a 4,000 NIT monitor, a 2,000 NIT picture for a 2,000 NIT monitor, a 1,000 NIT picture for a 1,000 NIT monitor and everything in between. In doing this, a pleasing HDR image is produced to the creator’s original artistic intention whatever the monitor’s capabilities. Dolby Vision HDR is designed to preserve the information that was originally captured and pass it on.
For HDR10, content creators produce one master but don't add the additional metadata required for Dolby Vision HDR. An HDR TV therefore receives only static metadata, which is relatively basic information about the content being shown relating to the piece in its entirety rather than scene by scene. This means it won't continually optimise images for the screen they're being displayed on. Instead, the information above the screen's NIT capability will be clipped to white and won't be displayed at all. For example, if something mastered to 2,000 NITs is played on a 1,000 NIT monitor, all information from 1,000-2,000 NITs will be clipped to white, or if something mastered to 1,000 NITs is played on a cheaper 400 NIT TV, everything from 400-1,000 NITs will be clipped to white.
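To make that clipping behaviour concrete, here's a deliberately simplified Python sketch. Real TVs may apply their own tone mapping near the top of their range, so treat this as an illustration of static metadata's limits rather than a model of any particular set:

```python
def hdr10_display(pixel_nits: float, monitor_peak_nits: float) -> float:
    """With only static metadata, any brightness above the monitor's
    own peak is simply clipped to white (its maximum output)."""
    return min(pixel_nits, monitor_peak_nits)

# A 2,000 NIT highlight from a 2,000 NIT master on a 1,000 NIT monitor:
print(hdr10_display(2000, 1000))  # 1000: everything from 1,000-2,000 NITs is lost
# The same highlight on a cheaper 400 NIT TV:
print(hdr10_display(2000, 400))   # 400
```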
As Dolby Vision HDR is built on the same core as HDR10, it's relatively straightforward for producers to create Dolby Vision HDR and HDR10 masters together if desired.
Dolby Vision HDR is the current preferred standard for feature films and high end television and only tends to be utilised in these genres due to the costs involved.
HDR10+ has been developed by Samsung and Amazon to mimic the Dolby system. It claims to improve on standard HDR10 by adding a layer of extra scene-by-scene information to help TVs better handle playback.
HLG (or Hybrid Log Gamma) has been developed by the BBC and Japanese broadcaster NHK and is designed to deliver a more convenient solution for broadcast television. It combines standard dynamic range and high dynamic range in just one video signal that can play as SDR on SDR screens or HDR on HLG HDR screens (as different parts of the image’s gamma/log curve are used or ignored), providing options that cover a wide range of viewers. The “one size fits all” signal doesn’t take up extra bandwidth and works within existing broadcast workflows, and therefore could be feasible with live productions. The plan is that HLG ready televisions or HLG firmware upgrades for existing TV sets will be available. However there are question marks over whether HLG will deliver true HDR content – possibly HDR quality will be compromised and pictures won’t be as dynamic. The system is still in development so watch this space.
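The "one signal, two readings" trick comes from HLG's transfer curve, defined in ITU-R BT.2100. The lower half of the signal is a simple square-root (gamma-like) curve that SDR screens read naturally; the upper half is logarithmic and carries the extra highlight range. A Python sketch of the HLG OETF (linear scene light in, signal value out):

```python
import math

# Constants from the ITU-R BT.2100 HLG OETF definition
A, B, C = 0.17883277, 0.28466892, 0.55991073

def hlg_oetf(e: float) -> float:
    """Map linear scene light e (0..1) to an HLG signal value (0..1).
    Below 1/12 the curve is square-root, matching SDR expectations;
    above it, a log curve packs in the HDR highlights."""
    if e <= 1 / 12:
        return math.sqrt(3 * e)
    return A * math.log(12 * e - B) + C

print(round(hlg_oetf(1 / 12), 2))  # 0.5: the SDR-friendly half of the curve ends here
print(round(hlg_oetf(1.0), 2))     # 1.0: full HDR highlight range
```

HLG-aware screens recover the highlights from the log portion, while SDR screens simply display the signal as-is, which is why the same feed works on both.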
Advanced HDR by Technicolor is the newest HDR format and is the result of a collaboration between LG and Technicolor. As with other types of HDR, the content needs to be mastered in that format, played back by a source that can read the Advanced HDR data and then displayed by a compatible screen.
It seems likely that, moving forward, the HDR standards will work side by side as opposed to one winning out over the others. Feature films will remain in Dolby Vision HDR, streaming services and Blu-ray will mainly use HDR10/Dolby Vision HDR, and broadcasters will use HLG for television, with streaming services like Netflix or Amazon Video potentially using HLG for some lower budget productions too.
IS HDR THE SAME AS 4K?
No, they're different things. 4K is about more pixels, whereas HDR is about better pixels. Technically speaking HDR is not dependent on resolution, so in theory you can have an SD HDR image, an HD HDR image and a 4K HDR image – and beyond.
Having said that, the two are linked. Increasingly, consumers are looking for TVs that are capable of providing both 4K and HDR, and manufacturers are hooked on creating them too, hence the vast majority of HDR compatible TVs on the market are also 4K/UHD.
The UltraHD Alliance has said that, in order to reach the HDR Premium Standard, HDR televisions must have a minimum brightness of 1,000 NITs, a black level of at most 0.005 NITs, at least 90% of the DCI-P3 colour space and a resolution of UHD or 4K, which means HDR content will have to be shot in 4K. Broadcasters are challenging this however, suggesting that an HD HDR standard should be set, as 4K can't yet be broadcast with current UK bandwidth but HD HDR can. In addition, some content producers see HDR as being more important than increased resolution when it comes to quality. All this could give rise to more emphasis on HD HDR in the future.
Capturing in 4K will still be a smart decision though, to protect for multiple standards and future-proof.
THE RIGHT CAMERA
In simple terms, Log is the particular way in which brightness is recorded by a camera. Log, or a logarithmic curve, stores a good range of shadow and highlight detail suitable for HDR.
Recording with a bit depth of 10 bits is the minimum for HDR (see the HDR10 standard). If anything more than a basic grade is intended, it's best to record in 12 bit just to give yourself something to play with.
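The reason bit depth matters is simple arithmetic: each extra bit doubles the number of tonal steps available per channel, and HDR's wider brightness range needs more steps to avoid visible banding. A quick sketch:

```python
def code_values(bit_depth: int) -> int:
    """Number of distinct tonal steps per colour channel."""
    return 2 ** bit_depth

print(code_values(8))   # 256 steps, fine for SDR
print(code_values(10))  # 1024 steps, the HDR minimum
print(code_values(12))  # 4096 steps, four times finer for latitude in the grade
```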
4K acquisition is advisable. If you plan to master in HD HDR it will be possible to downscale from 4K, and at least you'll have the option to go 4K if you need to in future.
All this means you need to look at cameras that shoot ProRes 4444 12 bit or RAW, so ARRIs, REDs, Sony’s F55 and F5 or Canon’s C300 Mk II and C700. Depending on the camera, you may need an external recorder.
AN HDR MONITOR
Dynamic range is the total range from the darkest shadow to the brightest highlight in a single shot. This means that when shooting HDR you must be able to discern visually (using a monitor) or measure (with a light meter) brightness throughout the range and at either end.
An HDR monitor of at least 1,000 NITs will allow you to accurately check highlight information in the field. A range of monitors are now available specifically for HDR shooting: Shift 4’s SmallHD 1303 HDR Monitor has 1,500 NITs of brightness and image clarity that’s enough to accurately pull focus outdoors. Shooting outdoors or in a bright environment can make artistic decisions relating to dynamic range hard to judge as the human eye will adjust to brightness.
A LIGHT METER
A light meter is crucial when shooting HDR to accurately calculate the dynamic range of a scene. We recommend you add a spot meter to your shooting kit. It gives a precise measurement of exactly how much light is bouncing off a subject or object and hitting the camera lens. This information then allows you to select your exposure to create the desired brightness. If the difference between the lightest and darkest things in your frame is bigger than your camera can handle, highlights will be clipped and shadows will be noisy.
HDR isn’t easy: it requires an experienced DOP to correctly measure brightness across each shot. These methods are exactly the ones that DOPs have been employing since day one of film. Self-shooters need to take extra care.
The main thing? Exposure. HDR means there's much less room for error. Log curves can generally capture a minimum of 12 usable stops of information, sometimes more. Standard Dynamic Range displays 6 stops of information, which leaves 6 stops of headroom to allow for overexposure. HDR (when mastering for 1,000 NITs) displays 10+ stops, which leaves only 2 stops of headroom and therefore much less room for error. If you choose to master for 2,000 or 4,000 NITs there's little to no room for error at all. Your camera operator needs to be able to light and expose as perfectly as possible to fit your image into the Log curve.
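The arithmetic behind that shrinking safety margin, using the article's own figures, is a straightforward subtraction:

```python
def exposure_headroom(captured_stops: float, displayed_stops: float) -> float:
    """Stops of slack for exposure error: what the Log curve captures
    minus what the mastering standard will actually display."""
    return captured_stops - displayed_stops

print(exposure_headroom(12, 6))   # 6 stops of slack when mastering for SDR
print(exposure_headroom(12, 10))  # 2 stops when mastering HDR for 1,000 NITs
```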
HDR also means it will no longer be possible to “hide” flaws in an image’s highlights and shadows as they won’t be rolled-off (or compressed) nearly as much as they would in SDR.
In addition, extended blacks will require particular attention. HDR monitors display more information in shadows, so there's less tolerance for noisy blacks. Overexposing by a minimum of 1 stop will counteract this but, of course, will eat into the slim room for error described above.
HDR acquisition means huge files: 4444 12 bit or RAW and 4K. Backing up and storing your rushes will require more space and therefore more money. An experienced DIT, or at least data wrangler, will be crucial.
Post production workflows will need to be able to handle the increased file sizes. You’ll need to ensure the suites you’re using have HDR screens for accurate monitoring. To bring out the true beauty of HDR, you’ll need to spend time in a grade with a qualified and experienced colourist. You’ll also need to allow time and money for the creation of multiple masters of your project for differing SDR and HDR standards, as required.
Many post houses are already set up for HDR production. Consult yours to find out the details.
We predict not long. Across the industry, film and programme makers are already getting to grips with HDR. As soon as consumers cotton on to the idea that HDR is vastly superior to SDR, the demand for the format will truly take off.
Netflix and Amazon already provide full HDR content and mobile phones and tablets, which are already super bright, are HDR compatible. YouTube currently allows users to upload HDR video. HDR advertising boards will be far more eye catching than SDR versions; once HDR gains popularity in the advertising world HDR production will be in greater demand.
In our view we're about a year away from HDR content being favoured over SDR, and probably only a couple of years until it's widespread with consumers and known as the norm.
Thursday, November 2nd, 2017