Okay, I'll bite, since I'm not doing much else this afternoon.
All quotes are from Aclecius:
Maybe Slashdot knows what they're talking about; doesn't seem like it.
I think you are saying that Slashdot is normally informative, but this time they are not. Weird, I have the completely opposite opinion of this article, and the resulting discussion (which is always more informative than the article itself).
The 5900 box says it's DirectX 9, and all the salespeople say (at least as of 2 days ago) that nVidia is the only DirectX 9 card.
If this is true, why weren't the drivers ready for DirectX 9 games?
I would guess because they felt the market demand wasn't there. How many pure dx9 games are there right now? And how many are on your shelf?
Now, if they advertised the FX series as a (the only, according to your statement) 100% dx9-compatible card, then you might have a point, but I don't actually know that they do, nor do I understand why people don't do some basic research before dropping a wad of cash on a video card.
But I believe that the card should have focused solely on dx9 performance; I mean, who actually uses that OpenGL crap anyway? (<- this is sarcasm). In all seriousness, I think nVidia made a bad choice (and not their first) with development, and it is biting them in the ass.
Why is Valve saying it is taking 5 times as long to get nVidia code paths to work?
Because they are creating a mixed dx8/9 path for that card. Perhaps if they had asked nVidia a year ago, with the intention of working with them, the required shader optimizations would be in place and it wouldn't be an issue.
Instead we have two issues now, where Valve apparently blindsides nVidia with "those FX cards won't work with our game," which is followed by nVidia saying, "okay, you don't want to make it work, so we will." It may well be that nVidia should fix this (I think so), but the way this was brought to the gaming community makes Valve look like they are playing favorites and acting like children.
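For what it's worth, a "mixed dx8/9 path" usually comes down to picking a rendering path from the pixel-shader version the card reports. Here's a minimal sketch of that idea; the function name, enum, and version thresholds are my own illustration, not anything from Valve's actual code:

```cpp
#include <cassert>

// Hypothetical sketch: choose a rendering path from the card's reported
// pixel-shader version. ps_2_0 or better gets the full dx9 path; ps_1_4
// gets a mixed dx8/9 fallback; anything older runs the plain dx8 path.
enum class ShaderPath { DX8, Mixed, DX9 };

ShaderPath choosePath(int psMajor, int psMinor) {
    if (psMajor >= 2) return ShaderPath::DX9;                    // full ps_2_0+
    if (psMajor == 1 && psMinor >= 4) return ShaderPath::Mixed;  // ps_1_4 fallback
    return ShaderPath::DX8;                                      // ps_1_1 and older
}
```

The extra work Valve is complaining about would be in writing and tuning the shaders behind that Mixed branch, not in the dispatch itself.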
Valve is dealing with facts, not theoretical ideas. Plus, Valve specifically said the 50.xx drivers disable fog in their levels to get a speed boost; how is that competing fairly?
I remember the days when fog was used to boost performance...
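The old trick was classic distance fog: everything past the fog's end distance is fully obscured, so the engine can pull in the far clip plane and skip drawing it entirely. A quick sketch of the standard linear fog factor (parameter names are mine, just to illustrate the idea):

```cpp
#include <cassert>

// Classic linear fog: 0.0 = unfogged, 1.0 = fully fogged.
// Anything at or beyond fogEnd is completely hidden by fog,
// which is exactly why it could be culled for a performance win.
float linearFogFactor(float dist, float fogStart, float fogEnd) {
    if (dist <= fogStart) return 0.0f;  // in front of the fog: fully visible
    if (dist >= fogEnd)   return 1.0f;  // behind the fog: cullable
    return (dist - fogStart) / (fogEnd - fogStart);
}
```

Disabling fog without also restoring the geometry it used to hide is the sort of thing Valve is objecting to.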
But it seems clear to me that the issue was the beta nature of the driver. I wonder if Valve mentioned this to nVidia before the demo, or if they sprang it on them after nVidia questioned why the 45.XX drivers were being used.
Finally, while I can't quite say for sure, I'm pretty sure I played UT2003 with my nVidia card using trilinear filtering. I'd try it now, but I have a card from a different vendor...