
ATI TV Wonder Elite Review

Date: 2005-3-7

[Abstract]
    Before we bring this review to a close, we should mention that Microsoft recently announced that the Imaging Science Foundation is going to be part of the certification process of TV tune...

[Content]

Boiling down the TV Tuner - Finding the Gold Standard
Let's dive a little deeper...

As we mentioned before, all of the extra data used to improve video quality is moot if the software isn't up to par. The same is true of the video processor: if it isn't doing an efficient job, it can nullify some of the critical data in the signal-processing phase.

The problem in reviewing PVR cards is that it is hard to measure the efficiency of the video processor without getting into the actual algorithms employed. The other problem is that we can theorize all we want, but the TV experience is the bottom line, and it can only be judged by actually watching TV, which is of course very subjective. This doesn't mean we can't try to find a "gold standard" to compare all the other PVR cards to, however.

We can make generalizations about a card's quality when we look at 10-bit and 12-bit ADC-based PVR cards, especially when they ship with custom TV viewing programs, as is the case with NVIDIA's Personal Cinema and ATI's All-in-Wonder lines. These generalizations also apply when we look at PVR-compatible software suites, like InterVideo's WinDVR, SnapStream's Beyond TV, and CyberLink's PowerCinema. But it is also possible for the quality to be basically the same in a practical test if, say, Company X has poor coding in its PVR software that doesn't exploit the full potential of its hardware.

Image Scaling Considerations:

There are, however, a couple of ways that PVRs can be compared fairly. One concerns the resolution of the PVR software window relative to the desktop resolution. Often, a "frame" of broadcast television is captured while the TV program is in windowed mode, taking up a quarter of a 1024 x 768 or 1280 x 1024 working space. Technically, this is the wrong way to test. On a computer, watching typical content usually means having multiple windows open for multitasking, but the problem with this for testing is that the video image gets scaled, because it is no longer at its native TV resolution (approximately 480i).

Think of the problem this way: say you took a picture with your digital camera at a resolution of 640 x 480 and your computer is set to a resolution of 1280 x 1024. When you view that picture full screen in a 1280 x 1024 environment, scaling occurs to "fit" the image to the screen. The only way to see it in its true state is to set your desktop resolution to 640 x 480 and view the picture full screen. The same thing should be done when looking at PVR cards. The problem is that Windows XP limits most people to a minimum resolution of 800 x 600, which is what we are forced down to. The reason for doing this is to minimize the effect scalers have on TV quality.
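To make the point concrete, here is a minimal sketch (in Python, with illustrative numbers rather than measurements) of the linear scale factor a 640 x 480 source undergoes when shown full screen at common desktop resolutions:

    def scale_factor(src, dst):
        """Linear scale needed to fit src (w, h) full screen on dst (w, h)."""
        return min(dst[0] / src[0], dst[1] / src[1])

    source = (640, 480)  # the digital-camera example above
    for desktop in ((640, 480), (800, 600), (1024, 768), (1280, 1024)):
        f = scale_factor(source, desktop)
        print(f"{desktop[0]} x {desktop[1]}: image scaled {f:.2f}x")

    # Only the 640 x 480 desktop yields 1.00x (no resampling); every
    # other mode forces the scaler to interpolate new pixels.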

This is basically what happens to your TV signal, except the process is more complicated, because video scalers need to be programmed well so as not to destroy the integrity of the image. A good upscaler must maintain the original video's sharpness while avoiding the jagged edges and blurred lines that occur when an image is enlarged. With analog broadcast television, the process depends heavily on upscaling, since most PCs run at resolutions higher than that of the TV in your living room.
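As a rough illustration of the sharpness-versus-jaggies trade-off, here is a sketch using Python and the Pillow library (the filename and frame size are hypothetical):

    from PIL import Image

    src = Image.open("frame_480.png")   # hypothetical captured frame, 720 x 480
    target = (1440, 960)                # 2x enlargement

    # Nearest-neighbor: preserves hard edges but produces the jagged
    # stair-stepping described above.
    jagged = src.resize(target, Image.NEAREST)

    # Bilinear: smooths the jaggies but tends to blur fine detail.
    soft = src.resize(target, Image.BILINEAR)

    # Lanczos: a higher-quality filter that keeps more sharpness while
    # still suppressing the stair-stepping.
    better = src.resize(target, Image.LANCZOS)

    for name, im in (("nearest", jagged), ("bilinear", soft), ("lanczos", better)):
        im.save(f"upscaled_{name}.png")

A hardware video scaler faces the same trade-off, just in real time on every frame.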

Scalers aren't the only thing to keep in mind; there is also the GPU used in testing. In testing any computer component, the goal is to isolate the component under test to determine its quality and performance relative to similar components. The same goes for PVR cards, not just with respect to the CPU, which affects utilization, but also the graphics card. The GPU actually does the work of de-interlacing the video signal, because you are taking interlaced video content and rendering it on a progressive display (like a PC CRT or LCD).


Obviously, not all GPUs are designed the same. This is highly apparent when we look at the performance of various video cards in games, and it also applies to watching multimedia content, though to a lesser extent. Low-end graphics chips tend to employ "weave" or "bob" de-interlacing, instead of a well-coded adaptive algorithm.
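For reference, here is a minimal sketch of what "weave" and "bob" actually do, written in Python with NumPy and assuming two consecutive fields stored as half-height grayscale arrays:

    import numpy as np

    def weave(top_field, bottom_field):
        """Weave: interleave two fields into one full frame. Sharp on
        static scenes, but moving objects show 'combing' artifacts."""
        h, w = top_field.shape
        frame = np.empty((h * 2, w), dtype=top_field.dtype)
        frame[0::2] = top_field     # even lines from the top field
        frame[1::2] = bottom_field  # odd lines from the bottom field
        return frame

    def bob(field):
        """Bob: throw away the other field and double every line. No
        combing, but vertical resolution is halved and thin lines flicker."""
        return np.repeat(field, 2, axis=0)

An adaptive de-interlacer weaves the static regions of the picture and bobs the moving ones, which is why a well-coded adaptive algorithm on a better GPU looks noticeably cleaner than either method alone.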

The Right Way To Judge Performance:

The point is that the same GPU, along with the same supporting components, needs to be used when comparing PVR cards. All-in-one solutions (graphics card + television tuner), like NVIDIA's Personal Cinema and ATI's All-in-Wonder lines, will be much harder to subject to a definitive quality assessment.

For example, if you have used NVIDIA's ForceWare Multimedia and ATI's MultiMedia Center, then you already know that frame-capture utilities from third parties like SnagIt and HyperSnap don't work well for comparing PVR software. They might capture the image, but they are likely to grab it somewhere in the middle of the processing phase. Evaluating PVR cards then comes down to two parts (assuming the test-bed configuration is the same): the hardware specs, and "eyeballing" TV quality with the included PVR software.
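To see why, consider what a naive screen grab does. The sketch below (Python with Pillow on Windows; the exact behavior depends on how the video is rendered) copies the visible frame buffer, but video drawn through a hardware overlay of this era is composited after that buffer, so the TV window often comes back as a solid color instead of the picture actually on screen:

    from PIL import ImageGrab

    shot = ImageGrab.grab()        # copies the visible desktop frame buffer
    shot.save("desktop_grab.png")  # overlay-rendered video may appear blank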

According to one PVR card maker, at the 16-bit ADC threshold it is extremely hard for humans to distinguish the quality difference. We have been unable to find out how much of a premium a higher-bit ADC adds to the price of a PVR card, since the ADC is only one part of an audio/video processor, which also performs other functions like comb filtering. The price of an audio/video processor therefore varies with many other factors that affect the chip's complexity and overall design, and in turn that of the TV card.
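That claim is at least consistent with the textbook figure for quantization noise: an ideal N-bit ADC has a signal-to-noise ratio of roughly 6.02 x N + 1.76 dB. A quick back-of-the-envelope sketch in Python:

    def ideal_adc_snr_db(bits: int) -> float:
        """Theoretical quantization SNR of an ideal N-bit ADC."""
        return 6.02 * bits + 1.76

    for bits in (10, 12, 16):
        print(f"{bits}-bit ADC: ~{ideal_adc_snr_db(bits):.1f} dB")

    # 10-bit: ~62.0 dB, 12-bit: ~74.0 dB, 16-bit: ~98.1 dB -- well past
    # the noise floor of an analog broadcast signal, which helps explain
    # why differences above 16 bits are hard to see.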





