Unfortunately, it will probably be representative of most DX9 games. Doom has a custom back end that uses the lower precisions on the GF-FX, but when you run it with standard fragment programs just like ATI, it is a lot slower. The precision doesn't really matter to Doom, but that won't be a reasonable option in future games designed around DX9 level hardware as a minimum spec.
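(For the curious: that "custom back end" is just runtime render-path selection. A minimal sketch of the idea in C/OpenGL follows; the path names and fallback order are my illustrative guesses, not Doom's actual internals.

#include <string.h>
#include <GL/gl.h>

/* Sketch only: prefer the vendor fragment-program extension, which
   exposes explicit low-precision (FP16/FX12) instruction variants,
   and fall back to the standard ARB path that runs the same way on
   every vendor. Assumes a current GL context. */
static int has_extension(const char *name) {
    const char *ext = (const char *)glGetString(GL_EXTENSIONS);
    /* note: real code should match whole tokens, not substrings */
    return ext != NULL && strstr(ext, name) != NULL;
}

const char *choose_render_path(void) {
    if (has_extension("GL_NV_fragment_program"))
        return "NV30 path";  /* can issue half/fixed precision ops */
    if (has_extension("GL_ARB_fragment_program"))
        return "ARB2 path";  /* standard path, same on every vendor */
    return "legacy path";
}

The point of the quote is the second branch: force the FX onto the standard ARB path like everyone else and the speed advantage evaporates.)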
I've been following this whole saga closely because I own an FX 5900, and yes, as Gabe said, I am pissed off. Not because ATI has better FPS, or because Nvidia is a cold-faced cheater, or any of that fanboy shit, but because the bottom line is that I spent my hard-earned money on a card that became obsolete before it could even do what it was meant to do.

Guess what. Earth to man with 5900.
nVidia can do FP16 and FP32. ATi can only do FP24, the minimum spec for DX9. Sure, you can praise nVidia for trying to raise the bar by doing FP32. But then you have to wonder why they put in FP16, especially since they write their drivers to operate in FP16 and not FP32.

I've got news for you, gents: Nvidia will lose to ATi with FP16 as well. They do not get a significant speed advantage by dropping to FP16.
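To put numbers on what those formats buy you, here's a quick sketch that prints the gap between 1.0 and the next representable value for each; the FP24 layout (16-bit mantissa) is the commonly cited R300 format, so treat that part as an assumption:

#include <stdio.h>
#include <math.h>

/* Machine epsilon (spacing of representable values near 1.0) for the
   shader formats under discussion. Mantissa widths assumed: FP16 =
   s10e5 (10 bits), FP24 = s16e7 (16 bits, the commonly cited R300
   layout), FP32 = s23e8 (23 bits, i.e. IEEE single). */
int main(void) {
    const struct { const char *name; int mant; } fmt[] = {
        { "FP16", 10 }, { "FP24", 16 }, { "FP32", 23 },
    };
    for (int i = 0; i < 3; i++)
        printf("%s: epsilon = 2^-%d = %g\n",
               fmt[i].name, fmt[i].mant, pow(2.0, -fmt[i].mant));
    return 0;
}

FP16's step near 1.0 is about 0.001 while FP24's is about 0.000015, which is roughly why FP24 is "good enough" for color math while FP16 can start to show artifacts in long shader chains.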
Instead of crying, you should sell your FX before it's too late and the prices fall too much, and get yourself a Raddy.

I'd buy 'em. $20... hell, I will give you $50 (depending on model).
Brand loyalty is silly.

Except when you work for that company, and you want that company to make money, so you keep your job!
3dfx died because they were late to the table with their last couple of cards.
But again, why slow down a card with higher precision that's not needed, and give lower precision to another card with no loss in quality? It just seems unfair that a card could perform even better at no visible cost.

He is not forcing ATI's cards to run at higher precision; I believe they just can NOT do lower precision (someone correct me if I am wrong?). Or, if they do go lower, it is under 16-bit, maybe 12- or 8-bit precision, which does have a noticeable difference for what he is doing.
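On the low-precision point: here's a rough sketch (not any driver's actual behavior) of what quantizing a 0..1 color channel to n bits does, which is where the visible difference comes from:

#include <stdio.h>
#include <math.h>

/* Snap a 0..1 color value to the nearest of 2^bits - 1 evenly spaced
   levels, roughly what a fixed-point pipeline does to it. */
float quantize(float x, int bits) {
    float levels = (float)((1 << bits) - 1);
    return roundf(x * levels) / levels;
}

int main(void) {
    printf(" 8-bit step: %g\n", 1.0 / 255.0);   /* ~0.0039 */
    printf("12-bit step: %g\n", 1.0 / 4095.0);  /* ~0.00024 */
    printf("quantize(0.5004f, 8) = %g\n", quantize(0.5004f, 8));
    return 0;
}

An 8-bit step is about 0.4% of full brightness, which is enough to band visibly in smooth gradients and specular falloff; that's the "noticeable difference" the parent is talking about.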
Doesn't Carmack program in OpenGL rather than DirectX? I thought he was against Microsoft engineering for specific graphical platforms.
No. That is not why 3dfx died. 3dfx did this just fine with their early Voodoo cards, and extremely successfully with the V3. What happened to 3dfx was a monster called the TNT2 Ultra, and its much bigger, asskicking brother called the GeForce 256.