
*This article was originally published January 20, 2022, on IBC365.*

While bringing UHD and HDR into production is hardly without its challenges, the results can often be fantastic.

Recent years have seen a huge amount of activity around both UHD and HDR in broadcast production. HDR promises to be a genuine game-changer because it is clearly visible to the end consumer – even non-technical viewers can see the difference between HDR and traditional SDR. Live HDR production is used today primarily in sports, but other areas of production are increasingly realising its potential. There is no doubt, however, that incorporating HDR – and UHD – brings considerable challenges.

As a leading provider of broadcast services, NEP is ideally positioned to track the impact of these new technologies. Speaking in a recent episode of the IBC Podcast, NEP UK Head of Engineering and Technology, Broadcast and Media Services Malcolm Cowan remarked on the complexity: “We now have an order of magnitude more workflows to think about in the world of UHD and HDR than we ever did in the past, although when you get it right, it looks fantastic – and when you add immersive audio, it sounds fantastic as well! Unlike the ‘joys’ of 3D in the past, I think it’s here to stay.”

At this point in time, however, HDR and UHD can certainly be complicated. With the majority of viewers still watching in HD and SDR, the SDR version is regarded as sacrosanct by broadcasters. In the past, this resulted in dual-chain working, with effectively parallel productions generating the same feed in HDR and SDR. But as technology has evolved, a single production chain can now produce both versions in a reproducible manner, often using mutually agreed LUTs for upmapping and downmapping.
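The mapping role a LUT plays here can be illustrated in miniature. The sketch below applies a per-channel 1D LUT with linear interpolation – a deliberately simplified stand-in for the 3D LUTs actually used in production. The curve values and function names are purely illustrative, not any broadcaster’s agreed mapping:

```python
# Illustrative only: a per-channel 1D LUT with linear interpolation,
# showing the principle behind LUT-based HDR<->SDR mapping.
# All sample values are normalised to the 0..1 range.

def apply_lut_1d(value, lut):
    """Map one normalised sample through a 1D LUT with linear interpolation."""
    if not 0.0 <= value <= 1.0:
        raise ValueError("sample must be normalised to 0..1")
    pos = value * (len(lut) - 1)        # fractional index into the LUT
    lo = int(pos)
    hi = min(lo + 1, len(lut) - 1)
    frac = pos - lo
    return lut[lo] * (1.0 - frac) + lut[hi] * frac

# A hypothetical 5-point "downmap" curve: it compresses highlights so
# HDR peaks roll off rather than clip on an SDR display.
DOWNMAP_LUT = [0.0, 0.30, 0.55, 0.75, 0.85]

def downmap_pixel(rgb, lut=DOWNMAP_LUT):
    """Apply the same curve to each channel of a normalised RGB pixel."""
    return tuple(apply_lut_1d(c, lut) for c in rgb)
```

A real production LUT is a dense 3D table sampled across all three channels at once, so it can also handle the gamut conversion (for example BT.2020 to BT.709) that a per-channel curve cannot; the interpolation idea, however, is the same.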

All this still raises the question of who is responsible for creating the HD/SDR version from the UHD/HDR version – the production site team (OB van) or the client (network hub). Moving to a single-production/single-contribution methodology requires strict organisation between whoever is shooting and producing the HDR content and getting it back to the network, and how it is then arranged into the different distributions downstream. Ultimately, the goal is to produce one event, in HDR (and often UHD), and then create SDR versions as needed downstream.

The overriding challenge for the production team is that the people in the truck have to be cognisant of how the SDR will look – not just the HDR. Most productions will include a trial conversion to SDR, available to the camera shaders and production team for exactly this purpose, and this trial version should be identical to what will be done downstream. Camera shaders and producers need to look carefully at how the shot will appear when converted to SDR – not just ask ‘does the HDR look beautiful?’.

A rigorous and consistent approach to documentation of these conversions is critical. As Cowan noted: “Whatever the requirements, including whether you have specific LUTs to introduce, you need to make sure that the entire process is documented.” Whereas in the past you might have simply labelled a tape with one or two details, now you are looking at 10 or 11 items of information that have to be included, just for the video. Part of the challenge is that the client itself needs a language to describe what they want, and in that regard it’s clear that LUTs have become a proxy for part of that dialogue.
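To make the documentation point concrete, here is a hypothetical sketch of the kind of per-feed record such paperwork might capture. The field names, types, and count are illustrative assumptions, not any broadcaster’s actual schema:

```python
# Hypothetical example of a per-feed line-up record; every field name
# here is illustrative, chosen to echo the items discussed in the text.
from dataclasses import dataclass, asdict

@dataclass(frozen=True)
class FeedSpec:
    resolution: str            # e.g. "3840x2160"
    frame_rate: str            # e.g. "50p"
    transfer_function: str     # e.g. "HLG", "PQ", "SDR/BT.1886"
    colour_gamut: str          # e.g. "BT.2020", "BT.709"
    bit_depth: int             # e.g. 10
    nominal_peak_nits: int     # e.g. 1000
    graphics_white_pct: float  # placement of graphics white, e.g. 75.0
    upmap_lut: str             # agreed LUT for SDR->HDR sources
    downmap_lut: str           # agreed LUT for the HDR->SDR return
    audio_format: str          # e.g. "5.1.4 immersive"
    notes: str = ""

def describe(spec: FeedSpec) -> str:
    """Render the record as labelled lines for a line-up sheet."""
    return "\n".join(f"{k}: {v}" for k, v in asdict(spec).items())
```

Compared with one or two details scrawled on a tape label, even this toy record carries eleven items – and making it a frozen, typed structure is one way to keep the documentation consistent from truck to hub.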

So a methodical approach is hugely important, but whilst there has to be a lot of advance preparation, you also have to understand that things could still shift on the day. As Cowan remarked: “You might need additional resources or be presented with a new piece of kit that the client is keen to use, but which does not produce the required output for your equipment.”

So broadcast teams have to be able to respond quickly to game-day requirements. And while certain aspects are becoming more standardised, it is not uncommon for broadcasters to provide their own sets of LUTs for specific events in order to ensure consistency – OBS, for instance, created such a set for the last Olympics. Similarly, there can be contrasting needs between individual productions – a warmer look might be sought for one show, or the white level of the graphics might have to be positioned differently for another – and all of those elements have to be accommodated.

Increasingly, versatile and powerful processing platforms are saving time and money. For NEP UK, Cowan confirms that Imagine Communications’ SNP platform is being used extensively to support its UHD and HDR productions: “It allows us to do many different tasks, including upmapping and downmapping, changing colour gamuts, de-interlacing, and so on. With this single box – which we use for a lot of Sky football coverage and other projects – we can achieve what would have required an entire rack to do five years ago.”

As UHD and HDR productions become more mainstream, there will be pressure to further streamline these workflows – improving the efficiencies for broadcast service teams. This will be driven by the fact that the benefits of HDR to the overall look of the production can be seen by the average viewer. UHD can certainly bring its benefits as well, but it’s with HDR that the improvement truly resonates with everyone. Most consumer displays sold today are capable of HDR, and delivering HDR to them improves the quality of the consumer viewing experience.

January 31, 2022 - By Imagine Communications

Steve Reynolds

Steve Reynolds is President of Imagine Communications, a global leader in multiscreen video and ad management solutions that broadcasters, networks, video service providers and enterprises around the world rely on to support their mission-critical operations.

Steve brings 25 years of technology leadership in the video industry to Imagine Communications. He has served as the CTO at Imagine Communications and Harris Broadcast, Senior Vice President of Premises Technology at Comcast, Senior Vice President of Technology at OpenTV, and CTO at Intellocity USA.

Steve earned an MS in Computer Engineering from Widener University and a BS in Computer Science from West Chester University. As Chairman of the AIMS Alliance and a member of SMPTE and SCTE, he has participated in numerous standards-making bodies in the cable and digital video industries. Steve also holds over 40 patents relating to digital video, content security, interactive television and digital devices.