The introduction of 512MiB consumer graphics hardware
There's obviously a case to be made for 512MiB consumer hardware, but I hope you can see that the case should be limited to certain sections of the graphics market: sections where the added cost of the board gives you an actual benefit, rather than none at all. That should tell you to shop sensibly when considering one. The advantages will become much clearer in the future, as and when game titles start using rendering methods and art assets that make a 512MiB board a much more compelling choice.
On a high-end board like the X800 XL, the increased memory size will more often than not give you a smoother gameplay experience at high resolutions with high levels of anti-aliasing applied, compared to what you'll get from the 256MiB board. You therefore have to decide whether the extra cost of the bigger board is better spent there, rather than on something like an X850 XT PE. When you do so, consider how long you'll keep the board: if it's a significant length of time, the choice makes more sense.
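To see why resolution and anti-aliasing matter so much here, a rough back-of-envelope sketch helps. The figures below are illustrative assumptions, not vendor specifications: 32-bit colour, a 32-bit depth/stencil buffer, and multisampled colour and depth stored per-sample, with a single-sample resolved front buffer on top. Even before a single texture is loaded, the framebuffer alone starts eating seriously into a 256MiB budget at high settings:

```python
# Rough framebuffer memory estimate for a multisampled render target.
# Assumptions (illustrative, not from any vendor spec): 4 bytes per colour
# sample, 4 bytes per depth/stencil sample, MSAA stores colour and depth
# per-sample, plus one single-sample resolved front buffer.
def framebuffer_mib(width, height, msaa_samples,
                    color_bytes=4, depth_bytes=4):
    pixels = width * height
    # Multisampled back buffer + depth buffer: one sample set per pixel.
    sampled = pixels * msaa_samples * (color_bytes + depth_bytes)
    # Resolved front buffer is single-sample.
    resolved = pixels * color_bytes
    return (sampled + resolved) / (1024 ** 2)

for (w, h), aa in [((1280, 1024), 4), ((1600, 1200), 4), ((2048, 1536), 6)]:
    print(f"{w}x{h} {aa}xAA: ~{framebuffer_mib(w, h, aa):.0f} MiB")
```

At 1280x1024 with 4xAA that works out to roughly 45MiB; at 2048x1536 with 6xAA it's around 156MiB, leaving only around 100MiB of a 256MiB board for textures, vertex data and everything else. That's the regime where the extra 256MiB starts paying for itself.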
To sum up this little article on 512MiB hardware, before I preview a few SKUs from ATI and NVIDIA properly in the future: the benefits today are there, but you need to make sure you're buying a 512MiB board in the right class, playing games at the right settings, and ideally keeping the board long enough to enjoy future titles that'll regularly use more than 256MiB of memory.
Overall, and I say this for the average consumer with a decent LCD rather than a very high-end CRT, 512MiB hardware is forward-looking, more so than a purchase for today. Keep that in mind.
Look out for a full preview of the X800 XL 512MiB, along with a preview of equivalent NVIDIA hardware, in due course.