Wednesday, January 31, 2024

Importance of localization in File based QC systems

 

Introduction

Performing Quality Control on broadcast content has always been an important element of any broadcast workflow. With the advent of OTT and the high expectations of today's discerning viewers, QC has become a critical aspect of content processing workflows. QC verifies the visual/audio quality of content and can be the deciding factor between a bad, a decent, and an amazing user experience. The value that the QC process adds to the entire content value chain, from creation to post-production to broadcast to consumption, cannot be ignored. The proliferation of video content across formats, mediums, and geographies makes manual QC nearly impossible; automated QC tools have become a necessity.

Need for Localization in QC Software

The content industry is spread across geographies and as such needs to cater to users across the globe. To keep up with this need, file based QC has to not only broaden but also deepen its reach. The majority of QC tools are used by local operators and professionals. If a show is being aired in Japan, what do you think will be the primary means of communication between most of the stakeholders involved? In any country where English is not the first or predominant language, this will always be an issue. QC systems need an overhaul to better serve local professionals: it is not enough to simply provide training in the local language; every aspect of the software needs to be localized. Localization cannot be a half-measure; it needs to go the whole nine yards. A file-based QC software is a crucial support system that helps providers churn out engaging, flawless videos to develop and retain their user base. As such, the 'users' of a QC software are the professionals working with, and dependent on, the software. Most people are most comfortable in their mother tongue; they tend to think in that language, and a localized version of the software will be more accommodating of such psychological and subliminal needs.

Venera’s QC Systems come equipped with Localized UI

The Venera file-based QC systems – Pulsar (for on-premise) and Quasar (for cloud) – are already moving in this direction, as we have implemented localized UIs in languages such as Japanese and Korean. A version of the software in the local language enables easier access and allows users to fully exploit its functionality. Furthermore, the reporting system, the second aspect of a file based QC software, should also be localized. As you can imagine, a QC report needs to be shared not only within the organization but also with stakeholders across organizations, local or spread across the globe. This necessitates a fully functional multilingual QC system that can 'talk' to stakeholders in several languages without loss of data or communication. Sound far-fetched? Think again! We have already implemented the localization framework to accommodate non-English languages; Japanese and Korean have already been added, with more localized languages coming. For us, localization is not just a word, it's the path ahead.
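As an illustration, a localized UI typically draws its labels from per-language string tables with a fallback language. The sketch below is a minimal, hypothetical example of such a lookup; the keys, strings, and fallback behavior are illustrative only, not Venera's actual implementation:

```python
# Minimal sketch of a UI string table with locale fallback.
# All keys and strings are illustrative, not a real product's resources.
STRINGS = {
    "en": {"job.submit": "Submit QC Job", "report.title": "QC Report"},
    "ja": {"job.submit": "QCジョブを送信"},  # partial translation
}

def ui_string(key: str, locale: str, default_locale: str = "en") -> str:
    """Return the localized string, falling back to the default locale."""
    return STRINGS.get(locale, {}).get(key, STRINGS[default_locale][key])
```

A real framework would also handle plural forms, date and number formats, and right-to-left layouts, but the core idea is the same: the UI asks for a key, and the string table answers in the user's language.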

We would love to hear your thoughts on the importance of localization in QC systems. Does this feature make your life easier? Are there other languages you would want us to include? Feel free to share your feedback in the comments section below.

If you want deeper insight into our QC solutions (Pulsar & Quasar) with this localization capability, do get in touch with us or request a free trial at sales@veneratech.com. You can read more about our solutions at:

Pulsar

Quasar



Tuesday, January 30, 2024

HDR Insights Article 3: Understanding HDR Tone Mapping

 


Introduction

In the previous article – HDR Transfer Functions, we discussed the transfer functions and how digital images are converted to light levels for display. This article discusses how the same HDR image can be displayed differently by different HDR devices.

What is HDR Tone Mapping?

Tone mapping is the process of adapting digital signals to appropriate light levels based on the HDR meta-data. This process is not simply applying the EOTF (Electro-Optical Transfer Function) to the image data; rather, it maps the image data to the display device's capabilities using the meta-data. Since a broad range of HDR display devices is available in the market, each with its own Nits (i.e. 'brightness') range, correct tone mapping is necessary for a good user experience. And since tone mapping is driven by the meta-data in the video stream, the presence of correct meta-data is essential.

Source footage can be shot in HDR with the best of cameras and then mastered on high-end HDR mastering systems, but it still needs to be displayed optimally on the range of HDR televisions available in the market. Tone mapping performs an appropriate brightness mapping of the content to the device without significant degradation.
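To make the idea concrete, here is a deliberately simplified tone-mapping sketch. The knee position and the linear highlight roll-off are illustrative choices on our part; real devices use proprietary curves driven by HDR meta-data such as the content peak and mastering display luminance:

```python
def tone_map(nits: float, content_peak: float, display_peak: float) -> float:
    """Map a content light level (in Nits) into the display's range.

    Illustrative only: real displays use proprietary curves derived
    from HDR meta-data (content peak, mastering display luminance).
    """
    if content_peak <= display_peak:
        return nits  # content already fits the display's range
    knee = 0.75 * display_peak  # levels below the knee pass through untouched
    if nits <= knee:
        return nits
    # Compress the highlight range [knee, content_peak] into [knee, display_peak].
    t = (nits - knee) / (content_peak - knee)
    return knee + t * (display_peak - knee)
```

For example, with 2000-Nit content on a 1000-Nit display, mid-tones below 750 Nits pass through unchanged, while the 750-2000 Nit highlights are compressed into the remaining 750-1000 Nit headroom.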

Need for HDR Tone Mapping

Let’s say an image is shot with a peak brightness of 2000 Nits. If it is displayed on a television with a 0-2000 Nits range, the brightness range will be exactly as shot in the raw footage. However, the results will differ on other devices:


Since tone mapping is a necessary operation to display PQ based HDR content on HDR display devices, the television needs to know the native properties of the content in terms of the brightness range used along with mastering system parameters. This information is conveyed in the form of HDR meta-data. After reading the HDR meta-data, display devices can decide the tone mapping parameters so that the transformed video lies optimally within the display range of the display device.

The next article will discuss the specific meta-data for HDR-10 and HDR-10+, two different implementations of HDR. Stay tuned for that.

Article 2: Transfer functions

Definitions

cd/m2 – The candela (cd) is the base unit of luminous intensity in the International System of Units (SI); that is, luminous power per unit solid angle emitted by a point light source in a particular direction. A common wax candle emits light with a luminous intensity of roughly one candela.

Nits – A non-SI unit used to describe the luminance. 1 Nit = 1 cd/m2.

HDR – High Dynamic Range. It is a technology that improves the brightness & contrast range in an image (up to 10,000 cd/m2).

SDR – Standard Dynamic Range. It refers to the brightness/contrast range usually available in regular, non-HDR televisions, typically up to 100 cd/m2. This term came into existence after HDR was introduced.

WCG – Wide Color Gamut. A color gamut that offers a wider range of colors than BT.709. DCI-P3 and BT.2020 are examples of WCG, offering a more realistic representation of images on display devices.

EOTF – electro-optical transfer function. A mathematical transfer function that describes how digital values will be converted to light on a display device.

OETF – optical-electro transfer function. A mathematical transfer function that describes how the light values will be converted to digital values typically within cameras.

OOTF – opto-optical transfer function. This transfer function compensates for the difference in tonal perception between the environment of the camera and that of the display.

PQ – PQ (or Perceptual Quantizer) is a transfer function devised to represent the wide brightness range (up to 10,000 Nits) in HDR devices.

HLG – HLG (or Hybrid Log Gamma) is a transfer function devised to represent the wide brightness range in HDR devices. HLG is quite compatible with existing SDR devices in the SDR range.


Originally Published at:- https://www.veneratech.com/what-is-hdr-tone-mapping/

Monday, January 29, 2024

HDR Insights Article 2: PQ and HLG transfer functions for HDR

 


Introduction

In the previous article, HDR Introduction, we discussed the benefits that HDR (High Dynamic Range) brings in terms of video quality. This article talks about how that is achieved.

To display digital images on the screen, display devices need to convert the pixel values to corresponding light values. This conversion is usually non-linear and is described by the EOTF (Electro-Optical Transfer Function). Different display devices support different "Transfer Functions".

Regular HDTV display devices (SDR – Standard Dynamic Range – monitors) normally use the BT.709 gamma transfer function to convert the video signal into light. These monitors are primarily designed to display images with a brightness range of up to 100 Nits (cd/m2).

High Dynamic Range – Transfer Functions (PQ & HLG)

HDR defines two additional transfer functions to go beyond this limit – Perceptual Quantizer (PQ) and Hybrid Log-Gamma (HLG). HDR PQ is an absolute, display-referred signal, while HDR HLG is a relative, scene-referred signal. This means that HLG enabled display devices automatically adapt the light levels based on the content and their own display capabilities, while PQ enabled display devices need to implement tone mapping to adapt the light levels. Display devices use content meta-data to display PQ coded images; this meta-data can come once for the entire video stream (static) or for each individual shot (dynamic).
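For reference, the PQ EOTF standardized in SMPTE ST 2084 maps a normalized signal value to an absolute luminance of up to 10,000 Nits, which is what makes PQ "display-referred". It can be written directly from the constants in the standard:

```python
def pq_eotf(signal: float) -> float:
    """SMPTE ST 2084 (PQ) EOTF: normalized signal [0, 1] -> luminance in Nits."""
    m1 = 2610 / 16384        # 0.1593017578125
    m2 = 2523 / 4096 * 128   # 78.84375
    c1 = 3424 / 4096         # 0.8359375
    c2 = 2413 / 4096 * 32    # 18.8515625
    c3 = 2392 / 4096 * 32    # 18.6875
    e = signal ** (1 / m2)
    return 10000 * (max(e - c1, 0.0) / (c2 - c3 * e)) ** (1 / m1)
```

Note how non-linear the mapping is: a signal of 1.0 yields 10,000 Nits, yet a mid-scale signal of 0.5 corresponds to only about 92 Nits, reflecting the extra precision PQ reserves for darker regions.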

It is expected that under ideal conditions, dynamic PQ based transformation will achieve the best quality results at the cost of compatibility with existing display systems. Please see examples below:

 



HDR – Signal to light mapping

 

The graph below describes the mapping of light levels for various transfer functions. The vertical axis shows the signal values on a scale of 0-1, with 0 being black and 1 being white; this makes the signal range bit-depth agnostic. The horizontal axis shows the light level, in Nits, of the display device.



Human beings are more sensitive to changes in darker regions than to changes in brighter regions. HDR systems exploit this property by providing more granularity in darker regions than in brighter ones. The graph above shows that the light level range in darker regions is represented by a larger signal value range than in the brighter regions – meaning a more granular representation in darker regions. While this is fairly evenly distributed for BT.709 based displays, it becomes less granular for HDR displays in the brighter regions. In the case of HLG, more than half of the signal values represent light levels between 0-60 Nits, with the remaining signal values covering the 60-1000 Nits range. Similarly, in the case of PQ (ST 2084) based displays, approximately half of the signal values represent light levels between 0-40 Nits, with the remaining half covering the 40-1000 Nits range.

According to the graph, HDR HLG is similar to BT.709 in the lower brightness regions, therefore offering better compatibility with existing SDR display devices. However, HDR PQ is quite different from BT.709. If we try to display a PQ HDR image on an SDR display, the darker regions represented by PQ will invariably become brighter, reducing the contrast levels of the image; the result is a washed out image (see below).


An HLG based image looks much better on an SDR monitor:



While PQ based transforms promise the best quality results on HDR enabled monitors, compared to HLG they require proper tone mapping by the display device.


This topic will be discussed in our next blog article – Tone mapping.


Definitions


cd/m2 – The candela (cd) is the base unit of luminous intensity in the International System of Units (SI); that is, luminous power per unit solid angle emitted by a point light source in a particular direction. A common wax candle emits light with a luminous intensity of roughly one candela.


Nits – A non-SI unit used to describe the luminance. 1 Nit = 1 cd/m2.


HDR – High Dynamic range. It is a technology that improves the brightness & contrast range in an image (up to 10,000 cd/m2)


SDR – Standard Dynamic Range. It refers to the brightness/contrast range usually available in regular, non-HDR televisions, typically up to 100 cd/m2. This term came into existence after HDR was introduced.


WCG – Wide Color Gamut. A color gamut that offers a wider range of colors than BT.709. DCI-P3 and BT.2020 are examples of WCG, offering a more realistic representation of images on display devices.


EOTF – electro-optical transfer function. A mathematical transfer function that describes how digital values will be converted to light on a display device.


OETF – optical-electro transfer function. A mathematical transfer function that describes how the light values will be converted to digital values typically within cameras.


OOTF – opto-optical transfer function. This transfer function compensates for the difference in tonal perception between the environment of the camera and that of the display.


PQ – PQ (or Perceptual Quantizer) is a transfer function devised to represent the wide brightness range (up to 10,000 Nits) in HDR devices.


HLG – HLG (or Hybrid Log Gamma) is a transfer function devised to represent the wide brightness range in HDR devices. HLG is quite compatible with existing SDR devices in the SDR range.

 

 



Wednesday, January 24, 2024

Delivering to NETFLIX? QC Requirements in A Nutshell and How to Comply with Them

 



Introduction

It is a well-known fact that Netflix is very conscious of the quality of content that is delivered via their service. Whether it is the overall Audio/Video quality or the structural compliance of the content delivery packages, all of it needs to be in compliance with their technical specifications before it can be accepted by Netflix.

Becoming a Netflix Preferred Fulfillment Partner (NPFP) or being part of the Netflix Post Partner Program (NP3) is a tough task, and continuing to remain a partner is not easy either, requiring consistent attention to quality. Netflix maintains track records of its partners, and failure rates are published on its website from time to time.

It is therefore pertinent for Netflix partners to ensure the compliance of their content before delivery. Here is a list of some common areas of QC that suppliers need to pay attention to before delivering their content.

1. IMF Analysis: Netflix requires most of its content to be delivered in IMF packages. That means you need to verify the accuracy of your IMF packages before delivery. This includes basic compliance with the SMPTE IMF Application 2E standard and specific validations of the asset map, packing list, and other package elements.

2. Dolby Vision: Increasingly, more content is now being delivered in HDR, and Netflix has selected Dolby Vision as its HDR format of choice. This requires you to ensure basic Dolby Vision compliance along with the specific structure recommendations outlined in the Netflix specifications.

3. Photon: Netflix also requires your IMF packages to pass their own 'Photon' IMF tool before delivery. These checks are performed while the package is uploaded to Netflix; if Photon fails the asset, the content will not be sent to Netflix.

4. Harding PSE: Detecting video segments that may cause Photo Sensitive Epilepsy (PSE) is becoming very important, particularly for content being delivered to the UK and Japan. Netflix may require PSE validation for certain categories of content.

5. Audio/Video baseband quality: The content must be thoroughly checked for a wide range of artifacts in the audio/video essence before delivery.
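Photon is open source, so suppliers can run the same validation locally before uploading. The sketch below builds a plausible command line for Photon's IMP analyzer; the jar name and entry-point class are assumptions on our part, so check Photon's own documentation for the exact invocation:

```python
import subprocess

def photon_command(imp_folder: str, photon_jar: str = "photon.jar") -> list:
    """Build a command line to validate an IMF package with Photon.

    The jar path and the IMPAnalyzer entry point are illustrative
    assumptions; consult Photon's documentation before use.
    """
    return ["java", "-cp", photon_jar,
            "com.netflix.imflibrary.app.IMPAnalyzer", imp_folder]

# Example (not executed here):
# subprocess.run(photon_command("/media/my_imp_package"), check=True)
```

Running the validator locally means a failing package is caught before the upload, not during it.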

Many of the above items are difficult and/or time-consuming to perform with manual QC and therefore warrant the use of a QC tool. Venera's QC products (Pulsar – for on-premise, and Quasar – for cloud) can help identify these issues in an automated manner. We have worked extensively with the IMF User Group, and with the Dolby and Netflix teams, to create features that do what users need, without introducing a lot of complexity for the users.

We have also integrated the industry-standard Harding PSE engine and can generate a Harding certificate for every file processed through our Pulsar & Quasar file QC tools. And the Netflix Photon tool has also been integrated so that you can receive ONE QC report including the Photon messages as well.

The results are provided in the form of XML/PDF reports for easy assessment. If desired, the Harding certificate and the QC reports (which will include the Photon results) can even be shared with Netflix along with the delivered content.

Pulsar – on premise File-based Automated QC

Depending on the volume of your content, you could consider one of our perpetual license editions (Pulsar Professional or Pulsar Standard), or, for low volume customers, a unique option called Pulsar Pay-Per-Use (Pulsar PPU): an on-premise, usage-based QC software where you pay only $15/hr for the content that is analyzed. And we, of course, offer a free trial so you can test our software at no cost to you. You can also download a copy of the Pulsar brochure here.

 

Quasar – Native Cloud File QC Service

 


If your content workflow is in the cloud, then you can use our Quasar QC service, the only native cloud QC service in the market. With advanced features like usage-based pricing, dynamic scaling, regional resourcing, a content security framework, and a REST API, the platform is a good fit for content workflows requiring quality assurance. Quasar is currently supported on AWS, Azure and Google Cloud. Read more about Quasar here.

Both Pulsar & Quasar come with a long list of 'ready to use' QC templates for Netflix, based on their latest published specifications (as well as for some other popular platforms, like iTunes, CableLabs, and DPP), which can help you run QC jobs right out of the box. You can also enhance and modify any of them or build new ones! And we are happy to build new QC templates for your specific needs.


Originally Published at :- https://www.veneratech.com/delivering-to-netflix-qc-requirements-in-a-nutshell-and-how-to-comply-with-them/


Monday, January 22, 2024

Quasar Leap – We went where no Cloud-QC service had gone before!

 


Introduction

We have achieved something that has been near and dear to my heart for a couple of years, as it relates to our cloud-based QC capabilities: we have extended the capability of our Quasar native-cloud QC service to a level that I don't believe anyone else has actually reached! And we call it "Quasar Leap".

When we started the development work on Quasar®, our native cloud QC service, the goal was clear. We didn't want to just take our popular on-premise QC software, Pulsar™, run it on a VM, and call it "cloud" QC. We made the deliberate decision that, while we would use the same core QC capabilities as Pulsar, we would build the Quasar architecture from the ground up to be a 'native' cloud QC service. And we did accomplish that, becoming the first cloud-based QC that could legitimately call itself 'native' cloud, meaning capabilities like a microservices architecture, dynamic scalability, regional content awareness, SaaS deployment, usage-based pricing, high grade content security, and of course redundancy.

But we wanted to go even further. And that was when the project we code named ‘Quasar Leap’ came about. To borrow and paraphrase from one of my all time favorite TV shows, Star Trek, we wanted to “take Quasar to where no cloud-QC had gone before”! (Those of you who are Star Trek fans know what I am talking about!).

Quasar was already able to process hundreds of files at a time, but the goal of 'Quasar Leap' was to show that Quasar can process ONE THOUSAND files simultaneously! Of course, anyone can claim that their solution is robust, scalable, and reliable, but we set out to actually do it, and then record it to prove that we did!

This was not a marketing ploy, although, to be honest, I knew there would be great appreciation and name recognition in telling our customers and prospects that we can QC 1,000 files simultaneously. But there was a practical and quite useful benefit to doing so. After Quasar's initial release, when we started to push the boundaries of how many files Quasar could process simultaneously, we found some practical limitations in our architecture, even though we were already way ahead of our competition. And while we could easily process a few hundred files at the same time (more than any of our customers had needed), when we tried to push beyond that, the process could break down and impact the reliability of the overall service.

So because of project ‘Quasar Leap’, our engineering team took a very close look at various components of our architecture. And while I am obviously not going to give away our secret sauce (!), suffice to say, they further enhanced and tweaked various aspects of our internal workflow to remove any bottlenecks and stress points, to make Quasar massively and dynamically scalable!


And then we decided that instead of just 'saying' we have the most scalable native-cloud QC solution, we would actually do it and record it! And I can now tell you confidently that we have: we submitted 1,000 60-minute media files, watched our Quasar system dynamically spin up 1,000 AWS virtual computing units (called EC2 instances in their terminology), process (QC) those 1,000 files simultaneously, and then spin down those EC2 instances once they were no longer needed.

With the new scaling capabilities, 60,000 minutes (1,000 hours!) of content can be processed in about 45 minutes, which even includes the time to spin up the EC2 instances! To put things in perspective: 1,000 hours of content, the equivalent of approximately 660 movies, or 20 seasons each of 7 different popular TV sitcoms, can be processed in just 45 minutes. Said differently, with our massive simultaneous processing capability, approximately 1,300 hours of content can be processed in one hour!
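The arithmetic behind that claim is easy to check:

```python
# Back-of-the-envelope check of the throughput figures above.
files = 1000
minutes_per_file = 60
wall_clock_minutes = 45  # includes EC2 spin-up time

content_hours = files * minutes_per_file / 60             # 1000 hours of content
hours_per_hour = content_hours / (wall_clock_minutes / 60)
print(round(hours_per_hour))  # ~1333 hours of content per wall-clock hour
```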

If you are saying to yourself, "that is great, but who has that much content that they need to process that quickly?", I have an answer! Actually, a three-point answer:

1. You will be surprised how many media companies have petabytes (that is with a "P"!) of content sitting in cloud storage! They face the daunting task of managing a cloud-based workflow to monetize that archived content by restoring, ingesting, transcoding, and ultimately delivering it to audiences. One naturally important step in this workflow is ensuring that all of that content goes through content validation at various stages before delivery to the end user. And that is where Quasar's massive simultaneous QC processing ability is much needed to minimize delays.

2. Some of our customers get content in bursts with strict delivery timelines. The ability to process that burst of content immediately offers significant business value in addition to workflow efficiency.

3. And let's not forget the main gain from the 'Quasar Leap' project: the behind-the-scenes tweaking, in some cases revamping, and enhancing of our underlying architecture. That has resulted in a solid platform which will benefit ALL of our Quasar SaaS users, whether they have 100,000 files, 100 files, or even a few! It ensures reliability, scalability, and confidence that they can rely on Quasar to meet their QC needs, regardless of their normal volume or any sudden increases (bursts) in their content flow due to an unexpected event or last minute request.

All that effort on 'Quasar Leap' by our talented and dedicated development team, conducted during the challenging time of the pandemic, is finally complete! The new release of Quasar, with all the architectural changes resulting from the 'Quasar Leap' project, has rolled out.

According to Tony Huidor, SVP, Products & Technology at Cinedigm, a premier independent content distributor, a great customer of ours, and an early beneficiary of 'Quasar Leap': “Given the rapidly growing volume of content that uses our cloud-based platform, we needed the ability to expand the number of files we need to process in a moment’s notice. Quasar’s massive concurrent QC processing capability gives us the scalability we required and effectively meets our needs.”

And now ‘Quasar Leap’, giving us the ability to massively scale up our simultaneous processing capability, is ‘live’, and “our Quasar native-cloud QC has gone where no other cloud-QC has gone before”!

Learn more about our Quasar capabilities here, or contact us for a demo and free trial!

And as Mr. Spock would say: “Live Long & Prosper”!

Originally Published at:- https://www.veneratech.com/quasar-leap-scalability-cloud-qc-service/


Automated Validation of QR Codes

  QR code or Quick Response code is a two-dimensional barcode invented in 1994 by a Japanese company Denso Wave for labelling automobile par...