"If you don't want to code with varying hardware in mind then you should be coding for consoles"
The whole point of having an API standard like DirectX 9 is that varying hardware can all be programmed in exactly the same way.
Then you don't have to go about coding stuff like this:
if GPU == <standard> then draw pretty pictures
elsif GPU == <GeForce FX 5900> then draw pretty pictures without fog
endif
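Just to make the point concrete, here's a rough sketch (in C++) of what that kind of vendor special-casing ends up looking like. Everything in it, the RenderSettings struct, the field names, the adapter string, is invented for the example, not taken from any real engine or driver:

    #include <iostream>
    #include <string>

    // Hypothetical render settings; the struct and its fields are
    // made up purely to illustrate the point.
    struct RenderSettings {
        bool fogEnabled = true;
        bool fullPrecisionShaders = true;
    };

    // Pick a code path based on the adapter name the driver reports.
    // This per-card branching is exactly what a real standard is
    // supposed to make unnecessary.
    RenderSettings pickRenderPath(const std::string& adapterName) {
        RenderSettings settings; // default: the standard DX9 path
        if (adapterName.find("GeForce FX 5900") != std::string::npos) {
            // The vendor-specific "optimisation": quietly drop effects.
            settings.fogEnabled = false;
            settings.fullPrecisionShaders = false;
        }
        return settings;
    }

    int main() {
        RenderSettings s = pickRenderPath("NVIDIA GeForce FX 5900 Ultra");
        std::cout << "fog: " << s.fogEnabled
                  << ", full precision: " << s.fullPrecisionShaders << "\n";
    }

Multiply that one branch by every card on the market and you see why developers would rather just code to the standard.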
This is the kind of shit that makes developers angry, unless they've agreed to the nice thick pay packet that accompanies this wool-over-the-eyes practice. All in the name of "optimisation"? Bah.
Especially when the guilty party sticks a "Full DirectX 9 compliance" sticker on their half-baked product and sells it to all of us bright-eyed and bushy-tailed sheep standing in line with dollars burning a hole in our pockets.
Standards mean we can reliably compare different products against one another. Breaking those standards causes what we have here today: confusion. We have no valid way of comparing products, because not everyone is adhering to the standard.
I'll quote Mr. Carmack:
"when you do an exact, apples-to-apples comparison using exactly the same API, the R300 looks twice as fast, but when you use the vendor-specific paths, the NV30 wins"
So when using the standard path, the R300 wins at double the speed. Take the vendor-specific path (what, the "optimisation" route? graphics quality toned down, special effects like fog removed?) and suddenly the NV30 wins. Sorry mate, I believe in a level playing field. The apples-to-apples comparison on the standard path means a lot more than an apples-to-oranges one.
BTW my current GPU is an Nvidia GF3. You get a cookie if you can guess my next GPU.