HTPC Blu-Ray essentials

Page 3: PC levels vs. video levels

what are PC levels and video levels?

There are two different types of displays/monitors: computer monitors and consumer displays ("TVs"). Each expects the data of an RGB video signal in a slightly different way. A computer monitor normally expects a "black" pixel to have its color information set to 0, and a "white" pixel is expected to have a value of 255. In contrast, a TV usually expects "black" to have a value of 16, while "white" is expected at 235.

Huh? If black is supposed to be 16, then what meaning does the area between 0-15 have? This area is called "blacker than black" (BTB). And the area between 236-255 is called "whiter than white" (WTW). A TV usually shows all values between 0-16 as pure black and all values between 236-255 as pure white.

Having the colors interpreted the way a computer monitor does is called "PC levels". Having the colors interpreted the way a TV does is called "video levels". You may wonder what the point of the "video levels" logic is. The "PC levels" logic sounds more reasonable, right? But discussing the reasoning behind "video levels" is really outside the scope of this article. For now you only need to know that there is a difference between how computer monitors and TVs interpret video signals.
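
To make the numbers concrete, here is a minimal sketch in Python (my own illustration, not part of the original article) of how the two conventions map onto each other for an 8-bit signal. The 219/255 scale factor simply comes from the video range having 220 codes (16-235) versus 256 codes (0-255); the exact rounding and clamping shown here is my assumption.

    # Minimal sketch: mapping between 8-bit PC levels and video levels.

    def pc_to_video(pc):
        """Map a full-range (PC levels) value 0..255 to video levels 16..235."""
        return round(16 + pc * 219 / 255)

    def video_to_pc(video):
        """Map a video-levels value to PC levels; BTB/WTW gets clipped away."""
        return min(255, max(0, round((video - 16) * 255 / 219)))

    print(pc_to_video(0), pc_to_video(255))    # 16 235 -> black/white in video levels
    print(video_to_pc(16), video_to_pc(235))   # 0 255  -> black/white in PC levels
    print(video_to_pc(8), video_to_pc(240))    # 0 255  -> BTB/WTW has nowhere to go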

what does a graphics card need to do

A graphics card usually does not know how the connected display interprets the video signal. So the graphics card also doesn't know whether it should send Blu-Ray movies with "PC levels" or with "video levels". But as you can probably imagine, it's important that the output of the graphics card matches the expectations of the display. When watching a Blu-Ray movie on a computer monitor, PC levels should be used. When watching a Blu-Ray movie on a TV, video levels should be used.

So the only reasonable way to solve this problem is this: The graphics card *must* offer a switch to the end user to define for every connected display whether PC levels or video levels should be used.

Of course the graphics card should avoid doing multiple conversions between video and PC levels. The general rule for the graphics card should be to limit any processing to the absolutely necessary minimum, because every processing step, especially any conversion between color spaces, results in a slight loss of image quality.
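
Here is a small Python sketch (my own illustration, assuming simple rounding) of why even a single unnecessary levels conversion costs something: squeezing 256 PC-levels codes into the 220-code video range and expanding them again collapses some formerly distinct values.

    # Minimal sketch: a PC levels -> video levels -> PC levels round trip
    # cannot preserve all 256 original values, because the video range only
    # has 220 codes (16..235) to store them in.

    def pc_to_video(pc):
        return round(16 + pc * 219 / 255)

    def video_to_pc(video):
        return min(255, max(0, round((video - 16) * 255 / 219)))

    survivors = {video_to_pc(pc_to_video(pc)) for pc in range(256)}
    print(len(survivors))  # 220 -> 36 of the original 256 values got merged away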

Finally, it's important that the use of PC levels vs. video levels is consistent between different renderers (e.g. VMR9 vs. Overlay vs. EVR), different sources (SD vs. HD) and different media players, and independent of whether hardware acceleration is used for video decoding.

what happens if the graphics card outputs the wrong kind of levels

If a computer monitor is fed with video levels, black parts of the image will be shown as dark gray by the display. And white parts of the image will be shown as light gray. Basically the image will look washed out and lacking in contrast.

If a TV is fed with PC levels, dark gray becomes black and light gray becomes white. So all detail in dark and light areas will be lost. The contrast will look "pumped up". This can actually look pleasing at first glance. But after a careful comparison it will become clear that the detail loss in dark and bright image areas is really hurting the image quality.
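
The two failure modes are easy to put into numbers. Below is a small Python sketch (my own illustration, assuming an idealized linear display response) of what a mismatched display makes of black and white.

    # Minimal sketch: what each kind of levels mismatch does to black and white.

    def monitor_shows(code):
        """A PC-levels monitor treats 0 as black and 255 as white."""
        return code / 255                             # 0.0 = black, 1.0 = white

    def tv_shows(code):
        """A video-levels TV treats 16 as black and 235 as white, clipping the rest."""
        return min(1.0, max(0.0, (code - 16) / 219))

    # Video-levels signal on a PC-levels monitor: washed out.
    print(monitor_shows(16))    # ~0.06 -> "black" is shown as dark gray
    print(monitor_shows(235))   # ~0.92 -> "white" is shown as light gray

    # PC-levels signal on a video-levels TV: crushed.
    print(tv_shows(10))         # 0.0 -> shadow detail near black is lost
    print(tv_shows(245))        # 1.0 -> highlight detail near white is clipped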

ATI rating

There are several problems with ATI cards: With SD content ATI usually outputs video levels. With HD content PC levels are used. This in itself is already bad. Why treat SD and HD content differently!? Basically ATI requires us to recalibrate our displays every time we switch from an SD source to an HD source!

But it gets even more interesting: Some ATI cards ship with a so-called "dongle", which essentially is a simple DVI->HDMI adapter. Well, it's not so simple, because if the ATI adapter is used, the ATI card suddenly outputs YCbCr instead of RGB! In theory this is not a bad idea: YCbCr is supported by most displays and doesn't have this PC levels vs. video levels problem. Furthermore almost all movie/video sources are stored in YCbCr and not in RGB. So it sounds only logical to output them in YCbCr. But in the end all is not well, for several reasons:

  • Only some cards ship with that dongle. If you use a normal DVI->HDMI adapter you're stuck with RGB and the dreaded PC vs. video levels problem.
  • The BTB/WTW information is missing in ATI's YCbCr output. This strongly indicates that multiple conversions have been going on: the YCbCr source data was most probably converted to RGB with PC levels and then converted back to YCbCr (the sketch after this list shows what that does to BTB/WTW). No, thanks. We don't want that. YCbCr output would be lovely - but only if RGB was never ever part of the processing chain. If the video is converted to RGB anyway, then please don't do yet another conversion to YCbCr!
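
Here is a simplified Python sketch (my own illustration, following only the luma channel and assuming a plain levels expansion/compression) of why a detour through full-range RGB would explain the missing BTB/WTW.

    # Simplified sketch: pushing video-levels luma through full-range RGB
    # and back. Anything blacker than black or whiter than white is clipped
    # on the way and cannot come back.

    def y_to_full_gray(y):
        """Expand video-levels luma (16..235) to a full-range 0..255 gray value."""
        return min(255, max(0, round((y - 16) * 255 / 219)))

    def full_gray_to_y(g):
        """Compress a full-range gray value back to video-levels luma."""
        return round(16 + g * 219 / 255)

    for y in (4, 10, 16, 235, 240, 250):    # BTB samples, legal black/white, WTW samples
        print(y, "->", full_gray_to_y(y_to_full_gray(y)))
    # 4 and 10 come back as 16, 240 and 250 come back as 235:
    # the BTB/WTW headroom is gone after the round trip.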

Finally, I've noticed that when not using the ATI dongle, in "windowed mode" my media player uses PC levels while in "full screen mode" it uses video levels. But if I change the video output mode while the video is playing, ATI switches back to PC levels even in "full screen mode". Furthermore there are reports that levels can also vary depending on whether hardware acceleration for video decoding is used or not.

So in the end we have to rate ATI with 0 points in this section because there's simply no proper way to choose which levels ATI uses.

NVidia rating

The situation with NVidia is not fully clear. Reading through the HTPC forums, it seems that different people have totally different experiences: for some people NVidia uses PC levels, for others video levels. It seems to depend on the OS being used, on the media player and also on the driver revision.

In XP with some older drivers there was a registry switch available which allowed the user to switch between PC levels and video levels. But this switch doesn't seem to work in Vista with newer drivers anymore.

Some people reported that their driver control panel offered them to switch between video and PC levels, but other people were not able to confirm that with the same driver version.

Furthermore it seems that even when video levels are used, the BTB and WTW information is cut off in the later driver versions.

All in all there are so many inconsistencies that every person would likely arrive at a different rating for their NVidia card, depending on which OS, driver version and media player they're using. I think I have to rate NVidia in Vista with 0 points. I'll give them 20 points in XP due to that registry hack (50% penalty due to clipped BTB/WTW).
 

points (out of 80):
  ATI XP:        0
  ATI Vista:     0
  NVidia XP:    20
  NVidia Vista:  0

rating guidelines:

control panel switch available: 80 points
registry hack available: 40 points
not adjustable: 0 points

If BTB and/or WTW information is cut off when outputting video levels, there's a 50% point penalty.

If the behaviour differs between SD and HD content or between different renderers (VMR7, VMR9, EVR, Overlay), there's a 50% point penalty.