I love that guys who have spent >$500 on a graphics card are claiming there are no tangible benefits to having a faster system.
There's a difference between a superficial benefit and a competitive benefit. A person playing at 2560x1600 doesn't have any real competitive advantage over someone playing at 1280x720. Sure, their game looks sharper, but the difference isn't significant enough to really mean anything. The thing that matters most is framerate, and any $100 videocard can run any console port at 60 FPS with little effort. More expensive cards can push higher framerates, yes, but past 60 it doesn't really matter in most games.
As mentioned in previous posts, actually turning down the detail settings sometimes gives players a competitive advantage because opponents stand out more clearly against the environment. In this respect, the slight advantage of more expensive hardware is completely negated.
Since you've failed to address any of the rebuttals to your argument, I'll list them again:
1) The vast majority of "high-end" PC games are console ports.
2) A $100 videocard can easily run said console ports at 60 FPS.
3) Anything past 60 FPS isn't really noticeable in most modern games.
4) Any serious PC gamer will have at least a $100 videocard and will not be playing competitive multiplayer games on a $500 laptop with integrated graphics.
5) Increased resolution and AA (the two most GPU-intensive settings) do not offer any meaningful competitive advantage.
6) Playing with low detail settings sometimes offers a competitive advantage, further negating the importance of expensive hardware.