Boiling down the TV Tuner - Finding the Gold Standard

Let's dive a little deeper...
|
As we mentioned before, all of the extra data used to improve video quality is going to be moot if the software isn't up to par. The same holds for the video processor: if it isn't doing an efficient job, it could be nullifying some of the critical data in the signal processing phase.
The problem with reviewing PVR cards is that it is hard to measure the efficiency of the video processor without getting into the actual algorithms employed. The other problem is that we can theorize all we want, but the TV experience is the bottom line, and that can only be judged by actually watching TV, which is of course very subjective. This doesn't mean that we can't try to find a "gold standard" against which to compare all the other PVR cards, however.
We can make generalizations about a card's quality when we compare 10-bit and 12-bit ADC based PVR cards, especially when they ship with custom TV viewing programs, as is the case with NVIDIA's Personal Cinemas and ATI's All-in-Wonders. These generalizations also apply when we look at PVR-compatible software suites, like InterVideo's WinDVR, SnapStream's Beyond TV, and CyberLink's PowerCinema. But it is also possible for the quality to end up basically the same in a practical test if, say, Company X's PVR software is poorly coded and doesn't exploit the full potential of the hardware.
Image Scaling Considerations:
There are, however, a couple of factors that must be controlled for PVRs to be compared fairly. One is the relationship between the resolution of the PVR software window and the desktop resolution. Oftentimes, a "frame" of broadcast television is captured while the TV program is in windowed mode, taking up one quarter of a 1024 x 768 or 1280 x 1024 working space. Technically, this is the wrong way to test: on a computer, watching TV usually means having multiple windows open for multitasking, but the problem for testing purposes is that the video image gets scaled because it is no longer at its native TV resolution (approximately 480i).
Think of the problem this way: say you took a picture with your digital camera at a resolution of 640 x 480 and your computer is set to a resolution of 1280 x 1024. When you open that picture and view it full screen in a 1280 x 1024 environment, scaling effectively occurs to "fit" the image to the screen. The only way to see it in its true state is to set your desktop resolution to 640 x 480 and look at the picture full screen. The same thing should be done when looking at PVR cards. The catch is that Windows XP normally limits you to a minimum resolution of 800 x 600, which is what we are going to be forced down to. The reason for doing this is that we want to minimize the effect that scalers have on TV quality.
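To put rough numbers on that, here is a quick back-of-the-envelope sketch; the frame and window sizes are our own illustrative assumptions, not figures from any card's documentation:

```python
# Rough scaling arithmetic for the scenario described above.
# The source and window sizes are illustrative assumptions.

def scale_factor(src, dst):
    """Horizontal and vertical rescaling needed to fit src into dst."""
    return dst[0] / src[0], dst[1] / src[1]

tv_frame = (720, 480)        # approximate active size of a 480i broadcast frame
quarter_window = (640, 512)  # assumed quarter of a 1280 x 1024 desktop
full_screen = (800, 600)     # full screen at XP's usual 800 x 600 minimum

for name, dst in (("quarter window", quarter_window), ("800 x 600 full screen", full_screen)):
    sx, sy = scale_factor(tv_frame, dst)
    print(f"{name}: {sx:.2f}x horizontal, {sy:.2f}x vertical")
```

Either way, the image you are judging has been resampled away from its native resolution before it ever reaches your eyes.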
This is basically what happens with your TV signal, but the process is much more complicated because video scalers need to be programmed well so as not to destroy the integrity of the image. A good upscaler needs to be able to maintain the original video's sharpness while avoiding the jagged edges and blurred lines that occur when you enlarge an image. With analog broadcast television, the process is especially dependent on upscaling, since most PCs run at resolutions higher than what the TV in your living room is specced for.
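As a toy illustration of that trade-off (this is not how any particular card's scaler actually works), consider doubling a single scanline that contains a hard edge: nearest-neighbour duplication keeps the edge abrupt, which shows up as jaggies on diagonals, while linear interpolation smooths it, which reads as blur:

```python
# Toy example: upscaling one scanline containing a hard edge by 2x.

scanline = [0, 0, 0, 255, 255, 255]   # a hard edge within one scanline

def nearest_2x(row):
    """Nearest-neighbour: duplicate each pixel. Edge stays sharp, but staircases."""
    return [p for p in row for _ in (0, 1)]

def linear_2x(row):
    """Linear interpolation: insert the average of each pixel pair. Edge blurs."""
    out = []
    for i, p in enumerate(row):
        out.append(p)
        nxt = row[i + 1] if i + 1 < len(row) else p
        out.append((p + nxt) // 2)
    return out

print("nearest:", nearest_2x(scanline))   # edge remains a single abrupt step
print("linear: ", linear_2x(scanline))    # edge is spread across extra pixels
```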
Scalers aren't the only thing that we need to keep in mind; there is also the GPU used in testing. When testing any computer component, the goal is to isolate the component being tested to determine its quality and performance relative to other similar components. The same goes for PVR cards, and not just with respect to the CPU, which affects utilization, but also the graphics card used. The GPU actually does the work of de-interlacing the video signal, because you are effectively taking interlaced video content and rendering it on a progressive display (like a CRT or LCD).
Obviously, not all GPUs are designed the same. This is highly apparent when we look at the performance of various video cards in gaming situations, and it also applies to watching multimedia content, though to a lesser extent. Low-end graphics chips tend to employ either "weave" or "bob" de-interlacing, rather than well-coded adaptive algorithms.
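For reference, here is a minimal sketch of what those two basic methods do; the labelled-scanline layout is assumed purely for illustration, as real drivers operate on pixel data:

```python
# Minimal sketch of "weave" and "bob" de-interlacing on labelled scanlines.
# The even field carries lines 0, 2, 4, ... and the odd field lines 1, 3, 5, ...

def weave(even_field, odd_field):
    """Weave: interleave both fields into one frame.
    Full detail on static scenes, but combing artifacts appear with motion."""
    frame = []
    for e, o in zip(even_field, odd_field):
        frame.extend([e, o])
    return frame

def bob(field):
    """Bob: build a frame from a single field by line doubling.
    No combing, but only half the vertical resolution and visible flicker."""
    frame = []
    for line in field:
        frame.extend([line, line])
    return frame

even = ["E0", "E2", "E4"]
odd  = ["O1", "O3", "O5"]
print("weave:", weave(even, odd))   # ['E0', 'O1', 'E2', 'O3', 'E4', 'O5']
print("bob:  ", bob(even))          # ['E0', 'E0', 'E2', 'E2', 'E4', 'E4']
```

An adaptive de-interlacer essentially chooses between these two approaches (or blends them) per region, depending on whether motion is detected, which is why a well-coded algorithm looks noticeably better than either method alone.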
The Right Way To Judge Performance:
The point is that the same GPU, along with the same supporting components, will need to be used when comparing PVR cards. All-in-one solutions (graphics card + television tuner), like NVIDIA's Personal Cinema and ATI's All-in-Wonder lines, will be much harder to subject to a definitive quality assessment.
For example, if you have used NVIDIA's ForceWare Multimedia or ATI's MultiMedia Center, then you already know that frame capture software from third-party companies, like SnagIt and HyperSnap, doesn't work well for comparing PVR software. It might capture an image, but more likely than not it grabs the frame somewhere in the middle of the processing phase. Evaluating PVR cards is therefore going to come down to two parts (assuming the test bed configuration stays the same): the hardware specs and "eyeballing" TV quality with the included PVR software.
According to one PVR card maker, at the 16-bit ADC threshold it becomes extremely hard for humans to distinguish any quality difference. We have been unable to find out how much of a premium a higher-bit ADC adds to the price of a PVR card, as the ADC is only one part of an audio/video processor, which also handles other functions like comb filtering. Accordingly, the price of an audio/video processor varies with many other factors, which ultimately affect the chip's complexity and overall design, and in turn that of the TV card.
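For a sense of why the returns diminish, the textbook figure of roughly 6.02 x N + 1.76 dB for an ideal N-bit quantizer puts an upper bound on the signal-to-noise ratio each step up in bit depth can buy; real tuner ADCs fall well short of this ideal, so treat the numbers below as theoretical ceilings rather than measured results:

```python
# Back-of-the-envelope comparison of ADC bit depths using the ideal
# quantization-noise formula. Real-world tuner ADCs perform worse than this.

def adc_stats(bits):
    levels = 2 ** bits
    snr_db = 6.02 * bits + 1.76   # theoretical SNR of an ideal N-bit quantizer
    return levels, snr_db

for bits in (10, 12, 16):
    levels, snr = adc_stats(bits)
    print(f"{bits:2d}-bit ADC: {levels:6d} levels, ~{snr:.1f} dB ideal SNR")
```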