Power Consumption

NVIDIA claims that the 7800 GTX consumes 10 watts less under load than the 6800 Ultra. To be fair, we don't actually have an authentic PCIe GeForce 6800 Ultra on hand -- we used an existing 6800 GT and overclocked its core by 50 MHz to "simulate" a 6800 Ultra. Performance-wise, the two are identical, but I can't say the same about power consumption, as I'm not sure about the stock voltages supplied to the GPUs on reference 6800 GT boards versus 6800 Ultra boards.
I used a Kill-a-Watt power meter to test the total power consumption of our test system. Running Doom 3 at 1600x1200 with 4x AA/8x AF proved to be the most power-hungry combination. I ran the Doom 3 timedemo twice while watching the Kill-a-Watt, mentally noting the figure at which the readout spent the most time. In both runs, the momentary peak was 2-3W higher than that figure, but I ignored it, as it could have been a secondary piece of hardware kicking in (such as the hard drive). Idle results were taken after quitting Doom 3 and observing the wattage at which the system seemed to "rest."
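For anyone curious how the "figure at which the readout spent the most time" was distilled from the raw observations, here is a minimal sketch, assuming the readings were simply jotted down by hand at regular intervals (the Kill-a-Watt has no logging output, and the numbers below are made up for illustration):

```python
from collections import Counter

# Hypothetical wattage readings noted at regular intervals during two
# runs of the Doom 3 timedemo (illustrative values only, not real data).
readings = [241, 243, 244, 244, 244, 246, 244, 243, 244, 244, 242, 244]

counts = Counter(readings)
most_common_watts, samples = counts.most_common(1)[0]
peak_watts = max(readings)

print(f"Most-seen reading: {most_common_watts}W ({samples} samples) -> reported figure")
print(f"Momentary peak:    {peak_watts}W -> ignored (possibly another component kicking in)")
```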
| Card | Idle | Load |
| 7800 GTX | 128W | 244W |
| 6800 GT O/C'd to 400/1.0 | 130W | 233W |
It seems that NVIDIA's claim of lower power consumption simply does not match the results measured at the wall. The 7800 GTX actually drew 2W less than the overclocked 6800 GT while idling, but consumed 11W more under load. The following is a purely subjective observation, but I have to be honest and say that after an hour or two of benchmarking both the 6800 GT at Ultra speeds and the 7800 GTX at stock speeds, the 7800 GTX seemed to be putting out a bit less heat than the 6800. I didn't take scientific measurements (more like moving my hand around the vicinity of the video card in an open test bed), but my observations were consistent.
** Editor's note ** - If NVIDIA bins parts for frequency the way Intel and AMD do (with higher-binned parts running a given clock at lower voltage), an overclocked 6800 GT should be sucking up more juice than a genuinely higher-binned Ultra. In that scenario, the load figure measured on our simulated Ultra overstates what a real 6800 Ultra would draw, assuming the voltages on 6800 Ultra and 6800 GT boards are not equal.
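To make the editor's reasoning concrete: CMOS dynamic power scales roughly with clock frequency and the square of core voltage (P ~ C * f * V^2), so even a small voltage gap between a binned Ultra and an overclocked GT matters. A rough sketch of the scenario follows; the voltage figures are pure assumptions for illustration, since we did not verify the stock voltages on either board:

```python
# Rough CMOS dynamic-power scaling: P ~ C * f * V^2.
# The voltages below are ASSUMED purely to illustrate the editor's scenario;
# they are not measured values from 6800 GT or 6800 Ultra reference boards.

def dynamic_power_ratio(freq_a, volt_a, freq_b, volt_b):
    """How much more power part A draws than part B under the f*V^2 rule."""
    return (freq_a / freq_b) * (volt_a / volt_b) ** 2

# Same 400 MHz core clock, but the overclocked GT is assumed to need a
# slightly higher voltage than a chip actually binned to run as an Ultra.
ratio = dynamic_power_ratio(freq_a=400, volt_a=1.40, freq_b=400, volt_b=1.35)
print(f"O/C'd 6800 GT vs. binned 6800 Ultra: {ratio:.2f}x the dynamic power")
# -> roughly 1.08x, i.e. the simulated Ultra would overstate a real Ultra's draw.
```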
Antialiasing Comparison
Screenshots alone will not do this feature justice. Playing Far Cry at 1024x768 with 4x AA, 8x AF, and supersampled T-AA, the difference in (and impact on) the final rendered image was astounding. No longer were the 2D sprites a thorn in my side -- the game looked absolutely amazing. As GPU architectures develop further, supersampled T-AA will become a "free" feature, much like regular antialiasing is now among higher-end cards.
Screenshot comparison: Regular AA, Multisample Transparency AA, Supersample Transparency AA
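To illustrate why supersampling helps those alpha-tested 2D sprites (foliage, fences, and the like), here is a purely conceptual sketch: a single alpha test per pixel produces binary, jagged coverage, while testing several sub-samples per pixel produces fractional, smoothed coverage. The toy "leaf" texture and sample positions are invented for this sketch and do not reflect NVIDIA's actual hardware path.

```python
# Conceptual sketch of transparency AA: plain alpha testing (one sample per
# pixel, on/off coverage) versus supersampled alpha testing (several
# sub-samples per pixel, averaged into fractional coverage).
# All values are illustrative; this is not the hardware implementation.

ALPHA_THRESHOLD = 0.5

def leaf_alpha(x: float, y: float) -> float:
    """Toy alpha texture: a soft-edged circular 'leaf' centered at (0.5, 0.5)."""
    dist = ((x - 0.5) ** 2 + (y - 0.5) ** 2) ** 0.5
    return max(0.0, 1.0 - dist * 2.0)

def coverage_single_sample(px: float, py: float) -> float:
    """Plain alpha test: one sample at the pixel center, fully on or off."""
    return 1.0 if leaf_alpha(px, py) > ALPHA_THRESHOLD else 0.0

def coverage_supersampled(px: float, py: float, grid: int = 2) -> float:
    """Supersampled transparency AA: alpha-test a grid of sub-samples and average."""
    step = 1.0 / (grid * 8)  # small sub-pixel offsets around the pixel center
    hits = 0
    for i in range(grid):
        for j in range(grid):
            sx = px + (i - (grid - 1) / 2) * step
            sy = py + (j - (grid - 1) / 2) * step
            if leaf_alpha(sx, sy) > ALPHA_THRESHOLD:
                hits += 1
    return hits / (grid * grid)

# Near the leaf's edge the single-sample result snaps to 0 or 1 (jaggies),
# while the supersampled result can land in between (smoothed edge).
for x in (0.70, 0.74, 0.76, 0.80):
    print(x, coverage_single_sample(x, 0.5), coverage_supersampled(x, 0.5, grid=2))
```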