CJ Coverage at NAB and CinemaCon

By James Gardiner | April 6, 2015 1:03 am PDT


NAB and CinemaCon are upon us. I am happy to announce that I will be covering them for CJ as well as the CineTechGeek video blog. This year at NAB I have a theme: "Emerging SMPTE standards and how they will affect cinema production". Then it is on to CinemaCon, where it is all about laser projection and how it may drive the next digital transition.

In this article I want to give you a brief overview of some of these new technologies from a more general standpoint, so the average cinema enthusiast can understand how they will affect them. Head over to the CineTechGeek videos if you are hard-core geeky (the videos will take quite some time to appear after NAB/CinemaCon).

At NAB I will be aiming to interview companies like Adobe, BlackMagic, AJA, Quantel, SGO, GoPro, ARRI, CineCert, RED, Sony, Canon and possibly others. These are camera and post-production-tools companies: the companies that will put these technologies in the hands of filmmakers.

VC5

VC5 is a codec in the same family as ProRes from Apple or DNxHD from Avid. What makes VC5 interesting is that it is roughly equivalent to ProRes/DNxHD, however it is unencumbered/FRAND-based. Why is this important? Many productions finish their masters in one of these codecs. For legal reasons, and to mitigate risk, it is preferable to finish in a codec without any potential future legal restriction. Apple's and Avid's positions on their codecs could change; companies change, get sold, etc.

Secondly, what gives VC5 a lot of promise over ProRes/DNxHD is that it also works on RAW camera data, allowing lower-powered computers to work on RAW video without the horsepower typically needed (at an estimated 10% file-size hit, but disk is cheap these days). RAW workflows are much heavier than typical RGB/YUV workflows, but the results are treated more like traditional film. With HDR coming, VC5 looks like the perfect compromise of quality and performance to answer the needs of High Dynamic Range (HDR) workflows (i.e. Rec.2020).
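To make the storage argument concrete, here is a rough back-of-envelope sketch. The resolution, bit depth, frame rate and compression ratio below are illustrative assumptions of my own, not VC5 specifications, but they show why a modest amount of compression turns RAW from a storage problem into something an ordinary workstation and commodity drives can handle.

    # Back-of-envelope storage estimate for an hour of RAW footage.
    # All figures are illustrative assumptions, not VC5 specifications.

    def raw_gigabytes_per_hour(width, height, bit_depth, fps):
        """Uncompressed single-sensor (Bayer) RAW: one sample per photosite."""
        bytes_per_frame = width * height * bit_depth / 8
        return bytes_per_frame * fps * 3600 / 1e9

    uncompressed = raw_gigabytes_per_hour(4096, 2160, 12, 24)  # ~4K, 12-bit, 24 fps
    compression_ratio = 5  # assumed compression ratio, purely for illustration
    compressed = uncompressed / compression_ratio

    print(f"Uncompressed RAW: ~{uncompressed:,.0f} GB/hour")
    print(f"Compressed RAW (assumed {compression_ratio}:1): ~{compressed:,.0f} GB/hour")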

The benefit to the general consumer is that indie/lower-budget films will be able to take advantage of these new higher-quality workflows more easily, i.e. better pictures, not just for tentpole productions and budgets.

IMF (Interoperable Master Format)

One of the biggest issues with distributing content these days is the plethora of delivery requirements. Typically, when a film or TV series is finished, that "original master" is then converted into hundreds of different masters. To give you an idea, imagine that each country has different TV standards (PAL/NTSC/HD etc.), different titles/languages, subtitles, pan-and-scan versions, different edits for classification, and so on. That one master can end up with hundreds of different derivatives. IMF is designed to help reduce this madness.

IMF is based on the Digital Cinema Package (DCP) used to distribute feature films around the world, enhanced to answer the needs of the more general TV/cable/OTT industries and their requirements.

IMF is complex, but I will try to explain it briefly here. An IMF package is not just assets, i.e. video, audio, subtitles and metadata. The idea of IMF is to deliver a single set of assets, then add metadata that describes how to derive the alternative masters from it. It works like a standardised editing process: instead of taking the original master, editing it down to the deliverable and stamping out a new master, you take the IMF master and add metadata that describes how to create the desired deliverable. This single IMF deliverable can then be sent to everyone, and they simply dial up the "type" of deliverable they want.
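The easiest way to picture this is as assets plus a recipe. The sketch below is a toy model in Python, not the real IMF XML (CPL/OPL) structure, and the asset names are made up, but it shows the principle: one shared pool of track files, and per-market "compositions" that are nothing but metadata pointing into that pool.

    # Toy model of the IMF idea: one pool of assets, many lightweight "recipes".
    # Illustration of the concept only; not the actual IMF CPL/OPL XML schema.

    assets = {
        "picture_main":   "feature_picture_4k.mxf",
        "picture_alt_us": "alt_reel3_us_cut.mxf",
        "audio_en_51":    "audio_en_5.1.mxf",
        "audio_fr_51":    "audio_fr_5.1.mxf",
        "subs_fr":        "subtitles_fr.xml",
    }

    # Each "composition" is only metadata: it references assets, it does not copy them.
    compositions = {
        "US_theatrical": ["picture_main", "picture_alt_us", "audio_en_51"],
        "FR_broadcast":  ["picture_main", "audio_fr_51", "subs_fr"],
    }

    def resolve(version):
        """Turn a composition (metadata) into the list of files that deliverable needs."""
        return [assets[key] for key in compositions[version]]

    print(resolve("FR_broadcast"))
    # ['feature_picture_4k.mxf', 'audio_fr_5.1.mxf', 'subtitles_fr.xml']

The point is that serving a new market means adding another small recipe, not re-mastering and shipping another full set of video and audio files.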

It is hard to explain how much of a nirvana this is unless you have experienced these issues yourself. Let's just say Disney and Netflix are heavily invested in IMF for good reason.

The benefit is that it will be much easier for producers big and small to create deliverables for alternative markets.

More Info:
http://www.imfforum.com/
https://www.smpte.org/PDA_On-demand/IMF

 
HDR/EDR (High Dynamic Range/Extended Dynamic Range)

[Image, via DVInfo: the wide range of light intensity in the real world compared to the 48-nit maximum brightness used in cinema.]

High Dynamic Range (HDR)/Extended Dynamic Range (EDR) is now an emerging technology at SMPTE. We, as an industry, are only at the start of trying to define HDR. It will be one of the more challenging technical achievements the industry faces. With so many variables affecting HDR, I thought it would be a good idea to ask those on the front line of implementing HDR, the camera and post-tools companies, exactly how they see HDR taking form.
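To put the figure above into numbers: dynamic range is usually counted in stops, where each stop is a doubling of light level. The luminance values below are rough, commonly quoted figures, used purely to illustrate the gap between real-world scenes and today's 48-nit cinema reference white.

    import math

    # Rough, commonly quoted luminance levels in cd/m2 (nits); illustrative only.
    scenes = {
        "moonless night sky":   0.0001,
        "typical indoor scene": 100,
        "sunlit outdoor scene": 10_000,
    }
    cinema_reference_white = 48  # nits, the current cinema projection target

    for name, nits in scenes.items():
        stops = math.log2(nits / cinema_reference_white)
        print(f"{name:>22}: {nits:>8} nits ({stops:+.1f} stops vs 48-nit cinema white)")

A sunlit exterior sits roughly eight stops above what today's cinema white can show, which is exactly the gap HDR standardisation is trying to close.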

HDR is predominantly being pushed by TV manufacturers who want to be able to offer the next generation of TVs. 3D, 4K, HFR (High Frame Rate): TVs do all that now, so what is the next offering? HDR answers that need. This puts us in a difficult position, because to move to HDR we need clear, defined workflows for producing content that everyone can create against.

This topic deserves a deeper blog-post which I hope to address later.

CinemaCon and Laser projection

Finally, at CinemaCon I will be having a look at the plethora of laser projectors coming to market this year: what that means for Xenon lamps, and whether we are facing a new digital transition as Xenon lamps, like film, may die out. I will also be looking at laser's contribution to the Rec.2020 colour space needed for HDR. Currently there are no displays that can actually reproduce Rec.2020, the latest colour space proposed.

This is an important issue, as older colour spaces, such as the Rec.709 used in the typical HDTV we all watch today, were based on the working screen technology of the time. Rec.2020 is more of a target. It is not currently possible to grade a film to Rec.2020, as there is no display that can reproduce it. This is one of the big conundrums for the future of HDR. Primary (RGB) laser projectors are the most promising technology to answer this need.
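To give a feel for how far out of reach Rec.2020 is, the sketch below compares the area of the Rec.709 and Rec.2020 primary triangles on the CIE 1931 xy chromaticity diagram. Triangle area in xy is a crude measure (xy space is not perceptually uniform), but it illustrates how much more of the visible gamut Rec.2020 asks for.

    # Compare the Rec.709 and Rec.2020 gamut triangles on the CIE 1931 xy diagram.
    # Area in xy is a crude, non-perceptual measure, used here only for intuition.

    def triangle_area(p1, p2, p3):
        """Shoelace formula for the area of a triangle given its xy vertices."""
        (x1, y1), (x2, y2), (x3, y3) = p1, p2, p3
        return abs(x1 * (y2 - y3) + x2 * (y3 - y1) + x3 * (y1 - y2)) / 2

    # Standard chromaticity coordinates of the red, green and blue primaries.
    rec709  = [(0.640, 0.330), (0.300, 0.600), (0.150, 0.060)]
    rec2020 = [(0.708, 0.292), (0.170, 0.797), (0.131, 0.046)]

    a709, a2020 = triangle_area(*rec709), triangle_area(*rec2020)
    print(f"Rec.709 triangle covers about {a709 / a2020:.0%} of the Rec.2020 triangle")
    # -> roughly half, which is part of why no current display reproduces full Rec.2020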

Let's find out at CinemaCon this year.

James Gardiner