Post #63 by Zephalephelah:
Okay kwyjibo, I’m going to take you seriously & tell you what’s up. In the 90’s when 32bit Windows took over the scene, people had to upgrade their systems at least every 6 months.
From an earlier comment of yours in this same thread, I don't get the impression that you're a high-end, bleeding-edge gamer, so I have no idea who or what made you upgrade your system every 6 months. I have no idea who these other "people" are, either. No one I know has ever put themselves into such an upgrade loop. Perhaps you and these people have money to burn; I really can't say.
I consider myself a mid-range hardware type of guy when it comes to gaming, and I've always been able to play current games without problems. My first PC was purchased in 1989, and since then I've upgraded 4 times, for a total of 5 computers. Only the last two have needed "update upgrades" (i.e., a new video card or more RAM). So all things considered, my upgrade schedule has been roughly every 3 years. I realize, especially in the last few years, that hardware is improving exponentially, and I have no problem upgrading when necessary. However, my card is still decent and is still considered mid-range, and looking at video card comparison charts confirms that. What really irks me is when developers put in intentional blocks to prevent things from working, and from my understanding the Unreal engine that Bioshock uses does support the earlier pixel shader models.
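Just to be concrete about what I mean by an "intentional block": these checks are usually a few lines in the launcher that read the card's reported shader model and refuse to start if it's below some cutoff, instead of falling back to the older render path the engine already has. Here's a rough sketch of what that looks like under Direct3D 9. I'm not claiming this is Bioshock's actual code; the SupportsPixelShader3 function and the error message are mine for illustration, though the caps-query calls are the real D3D9 API.

// Sketch of a shader-model gate under Direct3D 9. Assumes a Windows
// build linked against d3d9.lib; SupportsPixelShader3 is a made-up name.
#include <d3d9.h>
#include <cstdio>

static bool SupportsPixelShader3(IDirect3D9* d3d)
{
    D3DCAPS9 caps;
    if (FAILED(d3d->GetDeviceCaps(D3DADAPTER_DEFAULT, D3DDEVTYPE_HAL, &caps)))
        return false;
    // PixelShaderVersion encodes major/minor; D3DPS_VERSION(3, 0) is SM3.0.
    // A mid-range SM2.0 card reports D3DPS_VERSION(2, 0) here and fails.
    return caps.PixelShaderVersion >= D3DPS_VERSION(3, 0);
}

int main()
{
    IDirect3D9* d3d = Direct3DCreate9(D3D_SDK_VERSION);
    if (!d3d)
        return 1;

    bool ok = SupportsPixelShader3(d3d);
    d3d->Release();

    if (!ok) {
        // The "block": nothing stops the engine from using its SM2.0 path;
        // the launcher just bails out before the engine ever gets a say.
        std::fprintf(stderr, "A Shader Model 3.0 video card is required.\n");
        return 1;
    }
    // ...otherwise continue into the game proper...
    return 0;
}

And that's exactly what bugs me about it: this same capability query is what an engine would use to pick a fallback path, so hard-stopping instead of falling back is a deliberate choice, at least as I understand it.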
Anyways, it was just an observation/comment on my part. My card is still good enough to play everything else on the market, so I have no intention of shelling out $200+ for a new card just for one game.