HTPC Blu-Ray essentials
Page 5: post processing features
noise reduction
Modern graphics cards offer noise reduction features that try to remove random noise from the image. Whether such random (or "analog") noise is even a real problem in digitally compressed movies is a hot topic. The most common situation where this kind of noise occurs is with analog sources, e.g. analog broadcasts and VHS tapes.
With digital sources, noise is rarer by nature, because the transport/medium does not add any noise of its own. Any random noise in a digitally compressed movie can therefore only come from the original master. Now, most movies were shot on film, and the film grain is usually visible, especially in dark/night scenes (due to the higher ISO used in such scenes). Many end users see noise and grain as one and the same thing. But many home theater enthusiasts consider film grain an integral part of the movie. This view is supported by some computer-animated movies where film grain was artificially added to make the film appear more "real".
Anyway, whether to apply random noise reduction may be a matter of taste. One thing is sure: a graphics card should never force noise reduction on the end user without offering a way to turn it off completely.
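To make the trade-off concrete, here is a toy spatial noise reduction filter (a simple box blur). This is purely illustrative and is not the algorithm any GPU driver actually uses; it just demonstrates why noise reduction is a double-edged sword: averaging suppresses random noise, but it averages away fine detail such as film grain along with it.

```python
import numpy as np

def box_blur_denoise(frame, radius=1):
    """Naive spatial noise reduction: replace each pixel with the mean of
    its (2*radius+1)^2 neighborhood.  A toy stand-in for proprietary
    driver filters -- noise is averaged away, but so is film grain."""
    h, w = frame.shape
    padded = np.pad(frame, radius, mode="edge")
    k = 2 * radius + 1
    out = np.zeros_like(frame, dtype=np.float64)
    for dy in range(k):
        for dx in range(k):
            out += padded[dy:dy + h, dx:dx + w]
    return out / (k * k)

# A flat gray frame with added random noise: the filter clearly lowers
# the noise level (standard deviation), at the cost of softening detail.
frame = np.full((32, 32), 128.0)
noisy = frame + np.random.default_rng(0).normal(0.0, 10.0, frame.shape)
denoised = box_blur_denoise(noisy)
```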
detail enhancement / sharpening
Some movies are sharper than others. That leads to the temptation of trying to artificially make softer movies appear sharper. Some graphics cards offer detail enhancement or sharpening algorithms for this purpose. Sometimes such algorithms can change the image in a way that looks more pleasing. On the other hand, sharpening algorithms can also make random noise, compression artifacts and film grain more visible. So again I think it's a matter of taste whether you like these algorithms or not. And again I strongly insist that no graphics card should ever enforce such an algorithm on end users without an option to turn it fully off.
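The classic sharpening technique is unsharp masking: add back the difference between the frame and a blurred copy of it. The sketch below (again a generic illustration, not any vendor's actual algorithm) shows the drawback described above directly: noise and grain live in that same high-frequency residual, so they get amplified together with real edges, and the filter also overshoots around edges.

```python
import numpy as np

def _box_blur(frame, radius=1):
    """Small helper blur used as the low-pass filter for the mask."""
    h, w = frame.shape
    padded = np.pad(frame, radius, mode="edge")
    k = 2 * radius + 1
    out = np.zeros_like(frame, dtype=np.float64)
    for dy in range(k):
        for dx in range(k):
            out += padded[dy:dy + h, dx:dx + w]
    return out / (k * k)

def unsharp_mask(frame, amount=1.0):
    """Sharpen by adding back the high-frequency residual (frame - blur).
    Any noise in the frame sits in the same residual, so it is amplified
    along with genuine detail."""
    return frame + amount * (frame - _box_blur(frame))

# A hard step edge: sharpening overshoots above the bright side and
# undershoots below the dark side, which is what produces "ringing".
frame = np.zeros((8, 8))
frame[:, 4:] = 200.0
sharpened = unsharp_mask(frame)
```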
adaptive contrast enhancement
There are a handful of movies where the black level is encoded rather high, so that dark/night scenes look more gray than black. In such cases a good adaptive contrast enhancement algorithm may help. In general, however, such an algorithm should be kept turned off, because the vast majority of movies were carefully and correctly authored by the studios, and trying to manipulate the contrast will only make things worse.
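For illustration, a minimal sketch of what such a black-level correction could look like: estimate the frame's darkest content from a low histogram percentile and linearly remap it down to nominal video black (16 in 8-bit limited range, white at 235). The percentile and target values are my own assumptions for the example, not any vendor's actual algorithm, and the sketch also shows why the filter should stay off for correctly authored material: it leaves a properly encoded frame untouched.

```python
import numpy as np

def stretch_black_level(frame, target_black=16.0, percentile=0.5):
    """Toy adaptive contrast fix for a too-high encoded black level.
    Estimates the darkest real content via a low percentile (robust
    against a few outlier pixels) and linearly stretches it down to
    nominal video black, keeping white pinned at 235."""
    low = np.percentile(frame, percentile)
    if low <= target_black:
        # Black level is already at or below nominal -- do nothing.
        return frame.astype(np.float64)
    white = 235.0
    scale = (white - target_black) / (white - low)
    return target_black + (frame - low) * scale

# A washed-out frame (blacks encoded at ~40) gets stretched down;
# a correctly encoded frame (blacks near 16 or below) passes through.
washed_out = np.linspace(40.0, 235.0, 100)
fixed = stretch_black_level(washed_out)
correct = np.linspace(10.0, 235.0, 50)
untouched = stretch_black_level(correct)
```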
color "improvement" algorithms
Some graphics card makers seem to find it rewarding to offer many fancy algorithms like "fleshtone enhancement", "color vibrance" and what not. In practice these features should be avoided like the plague. What we need is correct output which adheres to the standards. Any color manipulation algorithm can only make things much, much worse.
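To see why, here is a sketch of what a "color vibrance" control essentially boils down to: scaling each pixel's chroma (its distance from its own luma) by a gain above 1. This is a generic illustration with assumed Rec.709 luma weights and an arbitrary gain, not any vendor's implementation. It shows the fundamental problem: saturated colors get pushed out of the legal range and must be clipped, which irreversibly destroys information the studio put there.

```python
import numpy as np

# Rec.709 luma weights (assumption: HD content).
_LUMA_709 = np.array([0.2126, 0.7152, 0.0722])

def boost_saturation(rgb, gain=1.5):
    """Toy 'vibrance' control: scale chroma around each pixel's own luma.
    Already-saturated colors overflow the valid range and get clipped,
    i.e. detail in saturated areas is permanently lost."""
    luma = rgb @ _LUMA_709
    boosted = luma[..., None] + gain * (rgb - luma[..., None])
    return np.clip(boosted, 0.0, 255.0)

# A saturated red pixel clips at 255 (information destroyed),
# while a neutral gray pixel passes through unchanged.
red = np.array([[[200.0, 30.0, 30.0]]])
gray = np.array([[[100.0, 100.0, 100.0]]])
boosted_red = boost_saturation(red)
boosted_gray = boost_saturation(gray)
```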
Unfortunately some of these algorithms cannot be fully turned off in the control panel. There are some registry hacks available, though. So ATI scores 40 points.
Here NVidia finally scores full points.