BCi Digital: 4K TV & HDR


When will we get UHD TV and HDR?

The move from SD to HD was a major leap for the TV and broadcast industries. For many consumers, used to good-quality SD programming, the move to HD was somewhat disappointing (at least in the early days), with more hype than true benefit. For others, especially those embracing larger-screen TVs, the hype did indeed translate into a much better viewing experience. On the group happy with their SD TVs, much of the later hype around 3D was all but lost, as indeed it was on many of the HD fans.

We are now faced with more technologies to ‘help’ us upgrade our TVs. Setting aside the curved-screen development and the steady improvements in scan rate, connectivity, brightness, sharpness and screen technology (plasma/LCD/LED/OLED...), the main features on the table today are 4k / UHD and HDR. To assess their future success it’s important to look at the benefit to the TV viewer hand in hand with the broadcaster costs.

Before we do this I would like to clarify the terms 4k and UHD TV. Within the consumer space these are now used interchangeably, but there is a subtle difference. 4k actually comes from the Digital Cinema Initiatives (DCI), the standards body dealing with digital formats for cinema content (see ref 1). It was nothing more than a four-times upgrade in pixel count from the previous 2k standard, achieved by doubling both dimensions, so the horizontal resolution went from 2048 to 4096 pixels (with an 8k standard also in the pipeline). Here 1k literally means 1024, its true definition in the world of computers and bits.

UHD, on the other hand, comes from the consumer world, where it is simply a move from HD to Ultra High Definition. It quadruples the pixel count by doubling both dimensions, taking the horizontal resolution from 1920 to 3840 pixels (3840 x 2160 in full). If you’re wondering why your UHD TV doesn’t seem to decode your 4k content, try transcoding the content to 3840 pixels!

(The water is further muddied by the fact that there is also another UHD TV standard at 7680 x 4320 pixels, but let’s not go there.)
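To make the pixel arithmetic concrete, here is a quick sketch comparing the formats mentioned so far (the DCI vertical resolutions of 1080 and 2160 are standard container figures, added for completeness):

```python
# Pixel counts for the formats discussed above.
formats = {
    "DCI 2k": (2048, 1080),
    "DCI 4k": (4096, 2160),
    "HD":     (1920, 1080),
    "UHD":    (3840, 2160),
    "8k UHD": (7680, 4320),
}

for name, (w, h) in formats.items():
    print(f"{name:6s}: {w} x {h} = {w * h / 1e6:5.1f} Mpixels")
```

Both ‘4k’ flavours quadruple their predecessor’s pixel count; they simply start from different baselines.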

For the rest of this discussion, let’s use 4k when discussing the broadcaster and content-creation areas and UHD for the consumer world, even though the 4k terminology is increasingly creeping into the consumer TV area too.

There is clearly a massive leap in data, and so in bandwidth requirement (or bit rate, as it is more typically called in the broadcast world), when moving from HD to 4k. This means a big investment for TV production studios and broadcasters: the infrastructure needed, from new cameras to new editing equipment, new workflows, content distribution and so on, is a major expense.
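To get a feel for the scale of that leap, here is a rough, illustrative calculation of the uncompressed bit rate; the 50 fps, 10-bit, 4:2:2 parameters are assumptions chosen for the sketch, not figures taken from any particular standard:

```python
def raw_bitrate_gbps(width, height, fps=50, bits_per_sample=10, samples_per_pixel=2):
    """Rough uncompressed bit rate in Gbit/s.

    samples_per_pixel=2 corresponds to 4:2:2 chroma subsampling:
    one luma sample plus (on average) one chroma sample per pixel.
    """
    return width * height * fps * bits_per_sample * samples_per_pixel / 1e9

print(f"HD : {raw_bitrate_gbps(1920, 1080):.1f} Gbit/s")   # ~2.1
print(f"UHD: {raw_bitrate_gbps(3840, 2160):.1f} Gbit/s")   # ~8.3
```

Whatever the exact parameters, 4k is four times the raw data of HD, and every link in the production chain has to carry it.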

With the momentum behind 4k it is clearly happening, but how quickly (or slowly) must to a large extent be dictated by the appetite and perceived benefits of the end user, and hence the willingness of the broadcasters to invest. Before we assess this, there are two further technological advances which play a big role.

1) HEVC. This is High Efficiency Video Coding, also known as H.265. It allows the vast amount of data to be compressed down into much smaller sizes for consumption by our TVs and mobile devices. Just as the move from MPEG-2 to AVC (H.264) enabled much easier HD content delivery (over terrestrial, satellite, Internet and Blu-ray), might the HEVC standard herald, at least to some extent, the possibility of UHD TV following along similar lines, with distribution over the same channels? With compression efficiency of up to twice that of H.264, it could certainly cut the UHD bandwidth burden by up to half (see the sketch after this list). How terrestrial will fare as resolutions increase is very uncertain.

2) HDR. This, to me, is key to the success of UHD TV. HDR is the technological ability for a camera to capture the full detail in a scene and for the TV to let the viewer see it, all in order to more closely mimic the human eye.
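Returning to point 1 for a moment: as a rough illustration of what HEVC’s efficiency gain means for delivery (the 8 Mbit/s H.264 HD figure is an assumed ballpark, not a measurement):

```python
# All figures are assumed ballparks for illustration, not measurements.
hd_h264 = 8                   # a plausible H.264 HD stream, Mbit/s
uhd_h264 = hd_h264 * 4        # 4x the pixels => roughly 4x the bits
uhd_hevc = uhd_h264 / 2       # HEVC at up to twice H.264's efficiency

print(f"UHD over H.264: ~{uhd_h264} Mbit/s")   # 32
print(f"UHD over HEVC : ~{uhd_hevc} Mbit/s")   # 16.0
```

In other words, HEVC brings UHD delivery back into the territory that broadcasters already know how to manage for HD.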

The eye can adapt to a particular area of interest (a fairly small region) very quickly (fast changes of pupil diameter, and slower changes in cone sensitivity). Hence anywhere the eye looks it will see the fine detail (within limits, of course). Traditional cameras (TV and photographic) let in a certain amount of light depending on the overall light content, so bright whites can be washed out and dark areas become murky and dull, both losing detail and contrast as well as colour. See an explanation of what HDR is all about from a photographic perspective in the BCi blog (ref 2). That blog described the roots of HDR as coming from photography, but there is an important difference: an HDR photograph only brings out the highlight and shadow areas; it doesn’t actually change the brightness. With TVs the brightness levels really can change, so this is a true HDR change rather than a perceived one. More on this below.

Technology challenges and the impact on the broadcast chain

a) Production and distribution

It’s my view that broadcasters will want to make sure their technology and workflows can support 4k as well as HDR. It doesn’t make much sense to make a massive investment only to realize a few months down the road that the job is only half done. 4k support is probably the better understood of the two by broadcasters, with the bit rate impact and the introduction of new transcoder technologies being the primary differences. My focus here will therefore be on HDR, since this technology is lagging and its impacts may not be so well understood.

Clearly the camera will need to support HDR content generation. It’s all about stops! As we saw in the previous BCi blog about HDR (ref 2), images need to be captured at different stops and combined later. In the video world the stops are effectively a change in the image sensor’s sensitivity rather than an actual change in aperture or iris size. Two scans then have to be made at the different sensitivities: hence double the frame rate, and so double the data. During the production stage this data needs to be combined to produce the desired good-quality content while avoiding any motion-ghosting effects.

[Figure: An SLR stills camera in HDR mode shows bad motion blur even at a 1/1000 s exposure, likely due to the delay between one photo and the next at different aperture settings.]

There is also an impact on the colour gamut. With a larger dynamic range and 10 or 12 bits per colour channel, more colours are generated that could (and should) be displayed on the TV. The human eye can see many different colours: it has three different cone cells, sensitive to different ranges of frequencies which peak in what we call red, green and blue light. To estimate the number of colours we can take an extremely simplistic view of 8 bits of sensitivity per colour, giving 256 x 256 x 256 = 16.7 million.
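Extending that same simplistic levels-cubed estimate to the 10- and 12-bit depths mentioned above gives a feel for the jump in palette size:

```python
# Colours representable at various bit depths per channel
# (same simplistic levels-cubed estimate as in the text).
for bits in (8, 10, 12):
    levels = 2 ** bits
    print(f"{bits} bits/channel: {levels}^3 = {levels ** 3:,} colours")
# 8 -> 16,777,216 (the 16.7 m above); 10 -> ~1.07 billion; 12 -> ~68.7 billion
```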

[Figure: According to various sources, the human eye can see up to 10 million colours.]

The colours we can actually distinguish between may be far fewer. Colour science is a very complex area of study. What is important for our TV discussion is to somehow define a palette of colours that the eye can see; any display device can then be specified by how well it approximates this, i.e. how much of that full range it is able to reproduce.

To aid this, a gamut is defined whereby the 3-dimensional colour range is represented on a 2-dimensional chart known as a chromaticity diagram. Chromaticity describes colours in terms of their hue and saturation (or purity), importantly not including luminance. (This effectively treats grey as the same as white, just a lower intensity of the same colour.) An example chromaticity colour space is sRGB: colours defined to be within this space can be accurately displayed on a suitably specified monitor. For HDR, the gamut is the set of colours that can be captured by HDR cameras, these being a subset of all the colours the human eye is sensitive to. This new colour regime will therefore need to be supported throughout the broadcast chain.
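As a minimal sketch of how this works in practice, the function below maps an sRGB pixel onto chromaticity coordinates using the standard sRGB-to-XYZ matrix (D65 white point); note how the luminance (Y) is normalised away, exactly as described above:

```python
def srgb_to_xy(r, g, b):
    """Map an sRGB pixel (components in 0..1) to CIE xy chromaticity."""
    # Undo the sRGB transfer curve to get linear light.
    def linearise(c):
        return c / 12.92 if c <= 0.04045 else ((c + 0.055) / 1.055) ** 2.4

    r, g, b = linearise(r), linearise(g), linearise(b)

    # Standard sRGB -> CIE XYZ matrix (D65 white point).
    X = 0.4124 * r + 0.3576 * g + 0.1805 * b
    Y = 0.2126 * r + 0.7152 * g + 0.0722 * b  # Y carries the luminance
    Z = 0.0193 * r + 0.1192 * g + 0.9505 * b

    # Chromaticity: normalise luminance away, keeping hue/saturation only.
    s = X + Y + Z
    return (X / s, Y / s)

print(srgb_to_xy(1.0, 0.0, 0.0))  # sRGB red primary: ~(0.64, 0.33)
print(srgb_to_xy(1.0, 1.0, 1.0))  # white: ~(0.3127, 0.3290), the D65 point
```

The triangle traced out by the three primaries on the chromaticity diagram is exactly the gamut of the display.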

Another important point is the bit rate impact of HDR. Standard UHD TVs with standard dynamic range (SDR) must be supported as well as the new UHD HDR TVs. (Of course HDR could be supported on HD TVs too.) It’s estimated that the bit rate needs to increase by around 25% to support both SDR and HDR, or by only around 10% to support HDR alone.
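Applied to an assumed 16 Mbit/s UHD HEVC stream (an illustrative figure only), those estimates work out as follows:

```python
base_uhd = 16.0            # assumed UHD HEVC bit rate, Mbit/s (illustrative)
print(base_uhd * 1.25)     # SDR + HDR carried together: 20.0 Mbit/s
print(base_uhd * 1.10)     # HDR alone: 17.6 Mbit/s
```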

On top of this, additional metadata must be supported, which may, in the worst case, need to change on a frame-by-frame basis, according to Dolby.

b) Receiving technology

When it comes to receiving the broadcast at the end user, all STBs and TVs must be UHD solutions. With ‘4k’ TVs more or less the norm now, this isn’t necessarily such an issue. Again, though, HDR puts a spanner in the works: the UHD TV and/or STB technology needs to be ‘in tune’ with the HDR content. An HDR TV camera can manage 14 stops, which represents a very large dynamic range, yet this ultimately needs to be displayed on a TV with typically only 7 or 8 stops of dynamic range today. To properly support HDR it makes sense for the TV to be more closely aligned with the camera. This comes down to how many brightness levels there are between black and white, in other words contrast. For the TV to maximize its ability to pass on the high dynamic range, it must have a high maximum brightness.

The SI unit of luminous intensity is the candela. More relevant for describing TVs, however, is luminance: luminous intensity per unit area, measured in candelas per square metre. In the consumer world this is referred to as the TV’s nit rating (1 nit = 1 cd/m²), a measure of the brightness capabilities of the TV. Typical HD TVs may only manage around 400 nits. To fully pass on the benefits of HDR, TVs need much larger nit ratings; 2,000 would be a nice starting point!
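Putting stops and nits together: each stop is a doubling of light, so dynamic range grows as two to the power of the stop count. A minimal sketch, where the 0.05-nit black level is purely an illustrative assumption:

```python
def contrast_ratio(stops):
    """Each stop doubles the light, so the range grows as 2**stops."""
    return 2 ** stops

print(contrast_ratio(14))  # HDR camera: 16384:1
print(contrast_ratio(8))   # typical TV today: 256:1

# Peak brightness needed to show a given range above an assumed
# black level (the 0.05 nit figure is illustrative, not measured).
black_level = 0.05
print(black_level * contrast_ratio(14))  # ~819 nits
```

Even by this crude reckoning, a 400-nit panel falls well short of displaying 14 stops above such a black level, which is why much brighter panels are being pursued.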

The discussion above about the colour gamut clearly also has an impact at the receiver side. With more colours available within an HDR image, the decoding for display must also be updated in the STB, if there is one, and in the TV for final display to the end user. (Note that extensions to HEVC will also need to be supported for both HDR and a wider colour gamut.)

TVs will also need to properly support 4k in terms of copy protection, and have fully compliant (18 Gbps) HDMI 2.0 interfaces.

All in all this amounts to a lot of investment on the delivery and reception sides of the equation. Actually producing the 4k + HDR content is the easy bit!

Standardization

The DVB UHDTV phase 1 specification was ratified by the DVB over a year ago, covering UHD transmissions at up to 60 Hz with 10-bit colour depth, using HEVC compression. For HDR, though, it’s another story.

Currently there are a number of competing approaches to HDR, originating from Dolby, NHK, Philips, Technicolor and the BBC. A US organization known as the UHD Alliance (comprising Samsung, Sharp, Sony, LG, Panasonic, Disney, Twentieth Century Fox, Warner Bros., Dolby, Technicolor, DirecTV and Netflix) is pushing for standardization, as is the ITU. Standardization needn’t be a case of one standard winning over another; it could allow multiple ways of working, with selection via metadata. So I’m sure it will come. I think it has to, to fulfil the reality of 4k and above over the next few years.

Conclusion

Going back to the end user’s perspective, it is my belief that seeing content (especially movies) in HDR will have a profound effect on the viewing public. The difference is immediately obvious (unlike the difference between an HD movie and a UHD one on a 52-inch screen). It instils in the viewer a real sense of being there and a far richer, all-consuming and immersive experience – in my view, far more so than 3D.

The challenge today is to move from movie production to live sports, in part to ensure commercial viability. Hollywood has already shown it can make good HDR movies, but with pay-TV operators’ revenues coming from both quality movies and top sporting events, sports coverage must also be brought into line with HDR; broadcasters are unlikely to want separate workflows and production for HDR and non-HDR content. Arguably, from the user’s perspective HDR is less important in sports, given the more constant lighting conditions, but consider a summer-time football match with half the pitch in bright sunlight and the other half in shade, and the benefits HDR can bring quickly become clear.

If the HDR standards can be finalized and the HEVC patent issues resolved, then we may be lucky enough to have a very bright TV future within the next few years!

HDR, like a benevolent virus, is getting everywhere. In my view it will usher in the next generation of high-quality consumer TVs, with real, perceptible end-user benefits: immersive, highly engaging viewing experiences for which, I’m sure, TV viewers would pay a premium.

To answer the questions at the top:

  • I put HDR standardization a year and a half away, and the roll-out a further two years down the road.
  • Should we wait before upgrading to a UHD or 4k TV? You decide!

Ref 1: BCi blog and white paper on DCP.

Ref 2: BCi blog: What exactly is HDR?

Mark Massel – BCi Digital

If you would like BCi to build you an HDR workflow / evaluation studio, please do get in touch.

Please contact us to arrange an informal meeting to see how we can help you with your needs – contact us
