
Topic: Dolby Vision for Games: Decoded


GamingFan4Lyf

Introduction:

So, for a while now I have been using Dolby Vision for Games on my Xbox Series X and, for the most part, have enjoyed the output. However, I somewhat recently came across some YouTube videos claiming that Dolby Vision for Games never quite reaches the same level of brightness as HDR10 with the HGIG setting enabled (or Game HDR, as my TV calls it) - more on this later.

So, I did a little experiment with Alan Wake II and found that when I turned off Dolby Vision for Games and stuck with HDR10 (properly calibrated with the HDR Brightness Calibration App, with Game HDR enabled on my TV and the in-game HDR slider turned up to roughly match my TV's advertised peak brightness), the highlights appeared to pop more than they did with Dolby Vision for Games using the same settings.

I started to wonder why this was, as one would think Dolby Vision for Games would simply adjust the HDR peak brightness based on the TV's capabilities.

To answer these questions, I need to clear up a few technical aspects of HDR (as I understand them as a non-professional), as well as how TVs display HDR content.

HDR10 & Dolby Vision - Film vs. Video Gaming Implementations:
I won't go into a full breakdown of HDR - I would hope most people know what it is by now. Short and sweet: it's brighter brights, darker darks, and every value of brightness in between, as well as at least 10-bit color - giving you a higher dynamic range of brightness and color (neat, huh?).

There is a big difference between how TV & film use HDR10 and Dolby Vision vs. how video games use them.

For film:

HDR10 is what is called "static". The film is mastered once with brightness values between 0 and (typically) 1000 nits, and those values never change. Your TV then does what is called tone mapping to squeeze those values into the boundaries of its own capabilities, but that comes with drawbacks in the form of over-saturated highlights and what is called black crush.

Why does this happen?

The best way to explain it is with a scenario:

Let's say your TV can handle 600 nits, and you have a light gradient that starts at 1000 nits and ends at, say, 400 nits. Tone mapping (at its crudest) will display 1000 nits as 600 nits, 900 nits as 600 nits, 800 nits as 600 nits, 700 nits as 600 nits, and only from 600 nits down to 400 nits will it start showing the proper gradient. There are 400 nits' worth of brightness the TV can't resolve because it sits outside the TV's range and gets mapped straight to its maximum. Essentially, it will look like one bright ball of light until the values finally drop to a point where your TV can distinguish the gradient.

Now let's look at the opposite end of the spectrum: black crush. Black crush is more prevalent on LCD displays due to the lack of "true black", whereas OLEDs can turn individual pixels off and resolve dark images much better.

It basically works the opposite way. Let's say your TV can't reach 0 nits and only starts to resolve darker details around, say, 5 nits. Any nit value between 0 and 5 is just pure black, and only values above 5 start to show detail.
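If it helps, here is the clipping and black crush behaviour sketched as a tiny bit of Python. To be clear, this is just my illustration of the scenario above, not any real TV's tone-mapping algorithm (actual sets usually roll off highlights more gracefully than a hard clamp):

```python
# Hypothetical 600-nit TV with a 5-nit black floor (the numbers from the
# scenario above) that simply clamps anything it can't physically show.
DISPLAY_PEAK_NITS = 600.0
DISPLAY_BLACK_NITS = 5.0

def clip_tone_map(content_nits: float) -> float:
    """Clamp a content brightness value to the display's physical range."""
    return max(DISPLAY_BLACK_NITS, min(content_nits, DISPLAY_PEAK_NITS))

# The 1000-to-400-nit gradient: everything from 600 nits up lands on the same
# 600-nit value, i.e. one flat "ball of light" with no visible gradient.
for nits in (1000, 900, 800, 700, 600, 500, 400):
    print(f"{nits:4d} nits in the source -> {clip_tone_map(nits):5.1f} nits on screen")
```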

Dolby Vision, on the other hand, is dynamic. The film is mastered with max and min light levels anywhere between 0 and up to 10,000 nits, delivered as metadata that acts as a guide for your TV to adjust those values on a frame-by-frame basis to match its capabilities. Not only that, it can also better calculate the varying degrees of brightness in between to avoid both over-saturated highlights and black crush.

So, let's get back to our 1000-nit gradient, but with Dolby Vision and our 600-nit TV. Dolby Vision will still display that peak brightness at 600 nits, but it will also mathematically compress the gradient, dropping the nit values accordingly so that you can see all of the detail in that light - or at least as much as your TV can display, since Dolby Vision is only as good as your TV's capabilities.

So, for example, 1000 becomes 600, 900 becomes 580, 800 becomes 560, 700 becomes 540, 600 becomes 520, and 500 finally shows up as 500, with the gradient continuing its regular progression from there. Sure, the gradient only looks fully refined at 500 nits and under, but at least there is some gradient rather than one giant ball of light.

The same thing happens to eliminate black crush: 0 nits gets lifted to start at around 5 nits, and the darker values are adjusted so that finer details in dark areas are still resolved.
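Again, purely as an illustration (this is not Dolby's actual math, just a simple linear "knee" that happens to reproduce the example numbers above), here is what that kind of dynamic remapping looks like in Python:

```python
DISPLAY_PEAK_NITS = 600.0   # hypothetical 600-nit TV
DISPLAY_BLACK_NITS = 5.0    # hypothetical 5-nit black floor
KNEE_NITS = 500.0           # below this, brightness passes through almost unchanged
CONTENT_PEAK_NITS = 1000.0  # per-scene peak reported by the dynamic metadata

def dynamic_tone_map(content_nits: float) -> float:
    """Toy dynamic remap: compress highlights above the knee into the headroom
    the TV has left, and lift near-black values above the panel's floor."""
    if content_nits >= KNEE_NITS:
        # Squeeze the 500-1000 nit range into the 500-600 nit range:
        # 1000 -> 600, 900 -> 580, 800 -> 560, 700 -> 540, 600 -> 520, 500 -> 500.
        slope = (DISPLAY_PEAK_NITS - KNEE_NITS) / (CONTENT_PEAK_NITS - KNEE_NITS)
        return KNEE_NITS + (content_nits - KNEE_NITS) * slope
    # Remap 0-500 onto 5-500 so shadow detail stays above the black floor.
    return DISPLAY_BLACK_NITS + content_nits * (KNEE_NITS - DISPLAY_BLACK_NITS) / KNEE_NITS

for nits in (1000, 900, 800, 700, 600, 500, 100, 10, 0):
    print(f"{nits:4d} nits in the source -> {dynamic_tone_map(nits):6.1f} nits on screen")
```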

Essentially, Dolby Vision adjusts the actual HDR values of the picture to your TV's capabilities, rather than your TV attempting to adjust to the HDR values of the picture the way it does when tone mapping HDR10 content. Also, Dolby Vision content includes HDR10 data for backwards compatibility purposes - which is why when you play a Dolby Vision movie on a TV that doesn't support it, HDR10 is still available. Clear as mud?! Great!

For Games:

So, let's talk about how this differs for video games. It really boils down to one simple thing: settings menus. Over-saturated highlights and black crush could mean the difference between life and death in a video game, so it's important to have the most optimal settings to avoid them. Simply doing what is done with movies won't work for games.

The HDR Gaming Interest Group (or HGIG) was created to figure out how to make this process simple. Their solution was to let players adjust the optimal settings using sliders within the system menus. What you may not know is that both Microsoft and Sony had a hand in the creation of the HGIG - hence why both PlayStation and Xbox have HDR calibration menus in their system settings. I love it when people come together to solve a common problem.

Unfortunately, TV tone mapping can get in the way of this optimal picture, so many TVs now have what is called HGIG Mode (or Game HDR), which turns off the tone mapping the TV uses for film HDR so that the console's settings and calibration take over the duty of maintaining the proper peak brightness and black levels.

It's this particular aspect that I will focus on when it comes to optimizing Dolby Vision for Games - so I hope you are still with me!

The Dolby Vision for Games Dilemma:

Most games aren't actually mastered in Dolby Vision and instead rely on a Dolby Vision upscaler, provided by Dolby itself, that turns HDR10 values into Dolby Vision metadata. In that sense it's not "real" Dolby Vision like a game that is mastered using the Dolby Vision for Games SDK.

One silver lining, though, is that the Dolby Vision for Games upscaler is an opt-in feature for developers rather than an opt-out one. The visual output is verified by developers to ensure it isn't messing up the intended look of their game. That doesn't mean it's necessarily a 1:1 match, but at least we know someone involved in the game's creation verified that the Dolby Vision upscaler output looked good and "checked the box" to allow its use.

This is also probably the reason why many games aren't mastered in Dolby Vision: the upscaler does such a great job out of the box that there isn't much reason to spend extra cash on the Dolby SDK to master a game natively - but that's a whole other story.

Not only that, but many games don't follow the HGIG guidelines and ignore the system-calibrated HDR in favor of their own in-game sliders. This could create issues with how the Dolby Vision upscaler works - the HDR10 signal it receives may simply come in weaker than what the TV can actually handle.

Sure, the in-game slider may say a game is running at the 600-nit peak brightness your TV is supposed to handle, but perhaps the Dolby Vision upscaler scales that down even further, thinking it is adjusting accordingly - remember, this is an attempt to generate metadata for your TV from a source that has none.

Also, let's keep in mind that some of these sliders don't even provide nit values. Sometimes it's a scale from 0 to 1.00, or 2.00, or some other arbitrary number, which tells us absolutely nothing about where the HDR peak should sit. Sometimes all we have is a picture and instructions telling us where to set the slider bar - so what are the nit values for that?!

What if the in-game sliders and calibration actually land around 400 - 500 nits (or lower), and Dolby Vision for Games is adjusting from there and looks "weaker" because of that?

The solution:

I decided to conduct another experiment and it has completely changed how Dolby Vision for Games looks on my TV.

First, I went into the HDR Calibration App on my Xbox and pressed all four shoulder buttons to bring up the min/max numbers for the app. I set all the lowest values to 0 and all the highest values to 10,000.

Then, within the game, I cranked the peak brightness all the way to the max where applicable - I have done this for Resident Evil 4 and Alan Wake 2 so far.

The idea is that I am essentially taking any potential error in slider calibration out of the equation and letting Dolby Vision for Games do its job: use the full spectrum of available data and scale it to my TV's specifications.

If the max peak brightness of a game ends up being 1000 nits and my TV can only handle 600 nits, then Dolby Vision should adjust the picture to those 600 nits, rather than to some unknown max value that could actually be lower than what my TV handles - basically the exact same way that Dolby Vision works for a film.
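To make that reasoning concrete, here is the same toy knee curve from earlier comparing a source that has already been capped at 600 nits by an in-game slider against one that sends its full 1000-nit range. This is pure speculation on my part about the "double tone mapping" I suspect is happening - not anything Dolby has documented:

```python
DISPLAY_PEAK_NITS = 600.0
KNEE_NITS = 500.0

def remap(content_nits: float, assumed_source_peak: float) -> float:
    """Toy dynamic remap that compresses everything above the knee so that
    assumed_source_peak lands exactly on the display's peak."""
    if content_nits <= KNEE_NITS:
        return content_nits
    slope = (DISPLAY_PEAK_NITS - KNEE_NITS) / (assumed_source_peak - KNEE_NITS)
    return KNEE_NITS + (content_nits - KNEE_NITS) * slope

# Suspected problem: the in-game slider already capped the signal at 600 nits,
# but the upscaler still assumes a brighter source and compresses it again.
double_mapped_peak = remap(600.0, assumed_source_peak=1000.0)   # -> 520 nits

# My workaround: slider maxed out, the game sends its real 1000-nit highlights,
# and the remap puts the brightest pixel right at the TV's 600-nit limit.
full_range_peak = remap(1000.0, assumed_source_peak=1000.0)     # -> 600 nits

print(f"Slider-capped source peaks at {double_mapped_peak:.0f} nits on screen")
print(f"Full-range source peaks at {full_range_peak:.0f} nits on screen")
```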

I have noticed not only an increase in peak brightness when using Dolby Vision, but I also no longer notice any HDR10-style over-saturation (despite having followed the guides for the "optimal" HDR experience for each game). For example, fire in Resident Evil 4 looks like a single bright blob when using "calibrated" HDR10, but shows noticeably more varying degrees of brightness when using Dolby Vision for Games.

I don't mention black level improvements because I always felt Dolby Vision for Games did a good job out of the box at eliminating black crush while still keeping things suitably dark when necessary.

I don't have any professional calibration tools or HDR measurement equipment to confirm whether what I am doing makes sense, but I will say Dolby Vision for Games has never looked better, and I never have to worry about those stupid in-game sliders again on Xbox.

I'd be interested to see others with Dolby Vision-capable TVs try this as well and share whether you notice better Dolby Vision results.

For context, my TV is a 55" Vizio M7 Quantum LCD with 32 local dimming zones (its attempt at getting that "true black"). It supports up to 600 nits of peak brightness.

Thanks for reading!


GamingFan4Lyf

Xbox Gamertag: Bioflare

