DOOM 3 & DX9

With all the discussion lately about how Half-Life 2 will run on ATI hardware compared with NVIDIA accelerators, BonusWeb.cz shot off an email to id Software's John Carmack to ask how DOOM 3 and other games are likely to be impacted by the difference between the two graphics platforms running DirectX 9. Here's how he responded:
Unfortunately, it will probably be representative of most DX9 games. Doom has a custom back end that uses the lower precisions on the GF-FX, but when you run it with standard fragment programs just like ATI, it is a lot slower. The precision doesn't really matter to Doom, but that won't be a reasonable option in future games designed around DX9 level hardware as a minimum spec.
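For readers wondering what a "custom back end" amounts to in practice: an engine typically sniffs the renderer's capabilities at startup and picks a code path per vendor. Here is a minimal stand-alone sketch of that idea in C. The extension names are real OpenGL extensions, but the path names and the selection policy are hypothetical, not id's actual code.

    #include <stdio.h>
    #include <string.h>

    /* Hypothetical render-path IDs, loosely modeled on the back ends
       Carmack has described in his .plan updates. */
    enum render_path { PATH_ARB2, PATH_NV30, PATH_R200, PATH_NV20 };

    /* Pick a back end from the GL extension string.  In a real engine the
       string would come from glGetString(GL_EXTENSIONS). */
    static enum render_path pick_render_path(const char *gl_extensions)
    {
        if (strstr(gl_extensions, "GL_NV_fragment_program"))
            return PATH_NV30;   /* vendor path: can use 12/16-bit precision */
        if (strstr(gl_extensions, "GL_ARB_fragment_program"))
            return PATH_ARB2;   /* standard path: full precision, slow on NV3x */
        if (strstr(gl_extensions, "GL_ATI_fragment_shader"))
            return PATH_R200;
        return PATH_NV20;       /* old register-combiner style fallback */
    }

    int main(void)
    {
        const char *fx   = "GL_ARB_fragment_program GL_NV_fragment_program";
        const char *r300 = "GL_ARB_fragment_program GL_ATI_fragment_shader";
        printf("GeForce FX  -> path %d\n", (int)pick_render_path(fx));
        printf("Radeon R300 -> path %d\n", (int)pick_render_path(r300));
        return 0;
    }

The point of the sketch is simply that the "standard fragment programs" Carmack mentions are the ARB2 branch; the speed difference comes from which branch the hardware ends up running.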
98. Some Thoughts - Sep 18, 2003, 15:42

To everyone saying that nvidia's problem is its lack of a 24-bit path: stop. We have no evidence of this. It's possible that it's true, but for all we know a mythical 24-bit NV30 would still be slower. In fact, Valve's benchmarks seem to indicate that the NV30 is SLOWER with 16-bit precision than ATI is with 24-bit. So more likely, nvidia's hardware is inferior in other ways.

Also, don't forget that the top ATI cards also have a specific optimized path in Doom3; they're not on the default generic path. Not a big deal, but keep it in mind. I believe that in HL2 the ATI cards are using the default DX9 path, while the NV30 has a specific path. Older cards from both companies appear to be using a generic DX8 path.

I think that if an apples-to-oranges comparison exists, then we should use it. If the NV30 can rock, let it rock. Note, though, that in HL2 the NV30-specific code is still slower than ATI running the standard DX9 path.

No one, no one has been more vocal about the need for standard paths than the Carmack. Valve's "hissy fit" about it last week is nothing compared to the years of work JC has put into this noble goal. Tim Sweeney of Epic (Unreal) has been vocal about it in the past as well, and Monolith implies its support for pure DX only. I'm very happy to see Valve speak out as well.

So basically, Carmack is saying that the NV30 sucks at shaders, that it's got nothing to do with DX9 or Valve, and that Doom3 isn't shader-intensive so it's not as big a deal for his game.
--
I like the Quake 3 Arena.
97. Re: The vidcard market - Sep 18, 2003, 15:31

dependent not so much upon actual quality as upon perceived quality

Ah, but is the quality level you perceive not the actual quality level for you, and thus not the only one that matters?

Well, by "perceived quality", I kinda meant "hype", not "what stuff looks like when you buy it and install it", since once you get to that point, you've already spent your money, no?


96. Re: 26 WRONG - Sep 18, 2003, 14:55

Care to verify your source? The one claiming the NV3x was designed by former 3dfx engineers and the NV4x is being designed by the people behind the GeForce4?

IIRC, 3dfx was bought out well before the GeForce4 was released, right?

Then what were those 3dfx engineers doing over at Nvidia all that time before the FX series?

Twiddling their thumbs?

They were working on the FX series the whole time. I remember the article that he's talking about.

Nvidia supposedly alternates two teams (one composed of the former 3dfx engineers, the other the original Nvidia design team) between chipsets. So while one chipset is being released, the other team is already elbow-deep in the next one. The plan was that after the GeForce4 design was complete, that team would work on the sequel to the GeForceFX, while the team that worked on the FX chipset started on it after moving from 3dfx to Nvidia. I thought that was interesting, since some physical design aspects such as size, heat, and power were similar between the Voodoo5(?) and the GeForceFX.

Unfortunately, I don't remember the source of this information (so take my memory of the article with a grain of salt) but the article in question was written when the FX chipset was released.

95. Re: Hrm ... - Sep 18, 2003, 14:00

q[Never buy a video card before or right when a graphic / processor intensive game comes out.]

I agree that's the case for most people. But I know a lot of people who still have GeForce 2s and 3s in reasonable systems (1.4+ GHz). For them, waiting 4-6 months would either seem like an eternity or ruin the game because of low quality.

In their cases it might be wise to get a new video card, just not THE top of the line. Something like a 9700 would cut it, and probably only cost 200-300 USD.

"Space. It seems to go on and on forever. But then you get to the end and a gorilla starts throwing barrels at you."
-Fry, Futurama
94. Re: the truth... is out there.... - Sep 18, 2003, 13:58

I'll quote Mr. Carmack:

"when you do an exact, apples-to-apples comparison using exactly the same API, the R300 looks twice as fast, but when you use the vendor-specific paths, the NV30 wins"

quote source: http://www17.tomshardware.com/column/20030219/3dmark2003-01.html

So when using the standard path, the R300 wins by double the speed.
Use the vendor-specific path (what, the "optimisation" route? Graphics quality toned down, special effects like fog removed?) and suddenly the NV30 wins. Sorry mate, I believe in a level playing field. The apples-to-apples comparison on the standard path means a lot more than an apples-to-oranges comparison.
(hope that italic worked)

Reading the .plan this quote comes from, I gather the meaning to be this:
There are a number of paths each card can take. Each path has its advantages (speed or quality) and its disadvantages (poor speed or poor quality).

What Mr. Carmack was referring to here is the ARB2 path. Both cards render this path with FULL FEATURES. Unfortunately, because the nVidia card runs it at full 32-bit precision, it is much slower on this path. The nVidia cards also have an NV30 path that has all the same features but is much faster (it uses 16-bit or even 12-bit precision, not sure exactly which). The brouhaha over missing fog and the like in the .50 drivers (!beta!) is that they implement one feature at a time fully, as opposed to implementing each feature partially. So each incremental release has more complete features than the last and no partial, half-assed features. So the NV30 (and ARB2?) path running on .50 drivers will be missing features. When the final .50 drivers come out, these features will be implemented.

I imagine nVidia might try to implement support for using lower precisions on their ARB path, but I wouldn't count on it.

So, if you can get your nVidia card to run on the NV30 path all the time, you should run faster with only marginal quality loss compared to the ARB2 path on the same card.
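To put a number on "marginal quality loss", here is a small stand-alone C sketch (my own illustration, not Doom code) that rounds a few values to FP16 precision, the half-float format the lower-precision paths can use, and prints the error. For colour-range values the step size is tiny; for large intermediate values it grows, which is why games built around full DX9-class shading care about 24/32-bit.

    #include <stdio.h>
    #include <stdint.h>
    #include <string.h>

    /* Quantize a 32-bit float to FP16 precision (10 mantissa bits).
       Simplified demo: normal numbers inside the easy FP16 exponent range
       only; no NaN/Inf/denormal handling, and plain round-half-up. */
    static float fp16_quantize(float x)
    {
        uint32_t bits;
        memcpy(&bits, &x, sizeof bits);

        int exp = (int)((bits >> 23) & 0xFFu) - 127;
        if (exp < -14 || exp > 14)          /* out of range: pass through */
            return x;

        /* FP32 has 23 mantissa bits, FP16 has 10: round away the low 13. */
        bits = (bits + 0x1000u) & ~0x1FFFu;
        memcpy(&x, &bits, sizeof x);
        return x;
    }

    int main(void)
    {
        /* Colour-range values vs. larger shader intermediates. */
        const float samples[] = { 0.5f, 0.123456f, 1.0f / 255.0f, 100.03f, 2048.7f };
        for (int i = 0; i < 5; ++i) {
            float q = fp16_quantize(samples[i]);
            printf("%12.6f -> %12.6f  (error %g)\n", samples[i], q, samples[i] - q);
        }
        return 0;
    }

Roughly: near 1.0 the FP16 step is about 0.001 (invisible in an 8-bit framebuffer), while near 2048 it is about 2.0, which is where long math chains and HDR-style values start to fall apart.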

Frankly, I'm still stuck on my TNT2 and will be for a while. (You can't imagine how slow some games run on this card. BUT THEY DO RUN, DAMMIT!!!)

93. Hrm ... - Sep 18, 2003, 13:07

Never buy a video card before or right when a graphics/processor-intensive game comes out. Always wait about 4-6 months, until newer models run things at a decent FPS. Otherwise you're just a guinea pig.

This comment was edited on Sep 18, 13:08.
--
"So what does Quake 3 give you? It's quite simple: the utter refinement of deathmatch, and the ultimate multiplayer shooter experience - nothing more, nothing less. " -- Dennis "Thresh" Fong
92. Re: Phew - Sep 18, 2003, 10:55

I bet HL2 won't meet the release date BECAUSE OF THE FUCKING GeForce FX, because they have to re-write effects specially for that piece of shit card, while all the others work on the standard path.


91. Phew - Sep 18, 2003, 10:25

I'll be honest and say that my understanding of graphics and DX9 and ARB etc is severely limited to say the least, which is why I fully appreciate people who DO understand what Carmack et al are talking about, coming here to enlighten us sheep.

I had for a while been considering the FX5900 Ultra, but after all the nasty things I read below, which as far as I can determine are nothing but the plain honest truth, I think Nvidia has received its last dollars out of my pocket.

As for ATI having shitty drivers, shrug, Nvidia's drivers have never worked well for me anyways, so I don't think I'll notice much of a difference.

Thanks for all the info guys, it's well appreciated.

Creston


90. Re: the truth... is out there.... - Sep 18, 2003, 09:38

"If you don't want to code with varying hardware in mind then you should be coding for consoles"

The whole point of having an API standard like DirectX 9 is so that varying hardware can use it in exactly the same way.

Then you don't have to go about coding stuff like this:

if (gpu_follows_the_standard(gpu)) {
    draw_pretty_pictures();
} else if (gpu == GEFORCE_FX_5900) {
    draw_pretty_pictures_without_fog();   /* vendor-specific workaround */
}

This is the kind of shit that makes developers angry if they don't agree to the nice thick pay packet that accompanies this wool-over-the-eyes practice. In the name of "optimisation"? Bah.

Especially if the guilty party sticks a "Full DirectX 9 compliance" sticker on their half-baked product and sells it to all us bright-eyed and bushy-tailed sheep standing in line with dollars burning a hole in our pockets.

Standards mean we can reliably compare different products against one another. Breaking those standards causes what we have here today: confusion. We have no valid way of comparing products because not everyone is adhering to the standard.


I'll quote Mr. Carmack:

"when you do an exact, apples-to-apples comparison using exactly the same API, the R300 looks twice as fast, but when you use the vendor-specific paths, the NV30 wins"

quote source: http://www17.tomshardware.com/column/20030219/3dmark2003-01.html

So when using the standard path, the R300 wins by double the speed.
Use the vendor-specific path (what, the "optimisation" route? Graphics quality toned down, special effects like fog removed?) and suddenly the NV30 wins. Sorry mate, I believe in a level playing field. The apples-to-apples comparison on the standard path means a lot more than an apples-to-oranges comparison.

BTW, my current GPU is an Nvidia GF3. You get a cookie if you can guess my next GPU.
This comment was edited on Sep 18, 10:01.
89. No subject - Sep 18, 2003, 09:34

The ATI Radeon R3x0 chips only support 24-bit PS2.0 shaders.

The Nvidia FX NV3x chips support FX12 (12-bit), FP16 (16-bit) and FP32 (32-bit) PS2.0 shaders.

The DX9.0 spec states that PS2.0 support must be at least 24-bit, so in order for the NV3x to meet the spec it must run in FP32 mode. So NV telling developers to use FP16 and FX12 in DX9.0 games is effectively cheating the spec again... but getting a third party to do it for them!
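As an aside, the "DX9 compliance" sticker is mostly about capability bits, not speed or internal precision. A Windows-only C sketch using the real Direct3D 9 caps query (hedged: this only checks the advertised pixel shader version, which an NV3x and an R3x0 will both happily report as 2.0):

    /* Build on Windows and link with d3d9.lib. */
    #include <stdio.h>
    #include <windows.h>
    #include <d3d9.h>

    int main(void)
    {
        IDirect3D9 *d3d = Direct3DCreate9(D3D_SDK_VERSION);
        if (!d3d) {
            printf("Direct3D 9 runtime not available\n");
            return 1;
        }

        D3DCAPS9 caps;
        if (SUCCEEDED(IDirect3D9_GetDeviceCaps(d3d, D3DADAPTER_DEFAULT,
                                               D3DDEVTYPE_HAL, &caps))) {
            printf("Pixel shader version: %u.%u\n",
                   (unsigned)((caps.PixelShaderVersion >> 8) & 0xFF),
                   (unsigned)(caps.PixelShaderVersion & 0xFF));
            if (caps.PixelShaderVersion >= D3DPS_VERSION(2, 0))
                printf("Card advertises PS 2.0 - the caps say nothing about "
                       "whether it runs shaders at FP16, FP24 or FP32 speed.\n");
        }

        IDirect3D9_Release(d3d);
        return 0;
    }

Which is exactly why benchmarks like the HL2 numbers matter: the sticker and the caps bits look identical across vendors, while the shader throughput does not.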

When it comes to OpenGL, however, it's really down to the developers to decide what they want to do. In Carmack's case he's obviously looked at the problem, decided that a drop to FX12 precision for his shaders doesn't produce a hugely inferior image, and coded an NV-specific path. Any NV3x owners should be grateful for this, as you'll get higher FPS than an ATi card at virtually no IQ loss.

Sadly, once I saw my £400 NV card being trounced in DX9.0 by a £150 ATi card, I finally felt my last gasps of NV loyalty evaporate, and I am now the proud owner of an ATi Radeon 9800 Pro 256... here's to HL2!

Finally, big thanks to Valve for confirming all the "NV3x has piss-poor PS2.0 performance" rumours - someone big had to step up and publicly humiliate NV, otherwise the charade would have continued.

88. Re: the truth... is out there.... - Sep 18, 2003, 09:21

"Sorry, stuff like this annoys me. If you don't want to code with varying hardware in mind then you should be coding for consoles. Writing games for the PC is all about catering to the wide variety of hardware that's available and making sure that you're supporting as much as you can whilst trying to get the best performance out of it."

It is not the fact that non-standard optimisations are annoying or costly that is the real issue (although that will certainly affect smaller game developers for the worse, delay big games and mean bigger driver downloads). The main problem is that the NV3x is advertised and sold as a fully compliant, DX9-capable, 'future proof' card when in fact it has crippled ARB2/DX9 next-gen performance. It really doesn't matter how good you are at coding - you just cannot run proper full-precision shaders and use the kind of features that depend on them (i.e. HDR lighting) on NV3x hardware at a good FPS. It's not that Carmack and everyone else can code whereas Valve are just lazy. This is not apples to apples - D3 simply does not use 1,200 version 2.0 pixel shaders, and it is OpenGL, not D3D... It is based primarily around powerhouse DX7/8 hardware, not DX9 stuff like HL2... A custom mixed mode can be used in places and _pp hints can be used to lower IQ for performance, but you cannot magically make up for the NV3x's poor shader abilities...
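On the HDR point specifically: the fastest NV3x formats don't just lose precision, they lose range. FX12 is a 12-bit fixed-point format covering roughly [-2, 2), so a bright light sum simply clamps. A tiny stand-alone C sketch of that behaviour (my own emulation of an FX12-style format, not actual hardware or driver code; exact hardware rounding may differ):

    #include <stdio.h>

    /* Emulate a 12-bit fixed-point register: 1 sign bit, 1 integer bit,
       10 fraction bits, covering roughly [-2, 2).  Illustrative only. */
    static float fx12_store(float x)
    {
        const float step = 1.0f / 1024.0f;              /* 2^-10 */
        if (x >= 2.0f - step) return 2.0f - step;       /* clamp high */
        if (x < -2.0f)        return -2.0f;             /* clamp low  */
        long q = (long)(x / step + (x >= 0.0f ? 0.5f : -0.5f));  /* round */
        return (float)q * step;
    }

    int main(void)
    {
        /* An LDR colour survives; an HDR-style intensity does not. */
        const float values[] = { 0.75f, 1.25f, 8.0f, 60.0f };
        for (int i = 0; i < 4; ++i)
            printf("%8.3f stored as %8.5f\n", values[i], fx12_store(values[i]));
        return 0;
    }

FP16 and up keep the range but with coarser steps as values grow (see the FP16 sketch further up the page), which is the gentler trade-off.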

This comment was edited on Sep 18, 09:28.
87. Re: the truth... is out there.... - Sep 18, 2003, 09:12 (nin)

I've been an Nvidia buyer since the TNT, and in the past have never been a fan of ATI. But after Valve's and now id's comments, I'm having serious doubts about throwing money at an FX-based card. The performance doesn't appear to be as good, I question how well the card will perform in future DX9-based games, and I'm tired of the driver-cheating BS - both companies have done it, but (AFAIK) ATI stopped, while Nvidia keeps it up. I want to be the one who determines the image quality in my games, NOT the person who wrote the drivers.

I've been waiting all this time for the FX, and now I just can't see the logic of buying it.

Supporter of the "A happy fredster is a muted fredster" fanclub.

http://www.davidbowie.com
86. Re: the truth... is out there.... - Sep 18, 2003, 09:05

>This is exactly the reason why Valve (Gabe) is pissed at Nvidia. He is sick of writing special (non-standard) code so that Nvidia's latest dx8 card, I mean "dx9" (ahem), can run his engine properly. ATI does not require this.

Sorry, stuff like this annoys me. If you don't want to code with varying hardware in mind then you should be coding for consoles. Writing games for the PC is all about catering to the wide variety of hardware that's available and making sure that you're supporting as much as you can whilst trying to get the best performance out of it.

85. Re: the truth... is out there.... - Sep 18, 2003, 08:38

'Carmack said the GFX IS good enough for D3, the lower precision is unimportant for the D3 engine. So all the games that will come out in the next 3 years using the Doom3 engine will work perfectly on the GFFx'

There is no way I would assume that... The NV3x's non-ARB2 rendering path in D3 will not automatically deliver the same performance in all future incarnations of the D3 engine. There is no way in hell that all future licensees of the D3 engine will use exactly the same non-full-precision fragment shaders as the D3 game and never extend the functionality further. At best there will be a lower-IQ compatibility mode for the NV3x, or FX owners will just have to put up with the same frame-rate hit we've seen in HL2... It is not the end of the world, and driver _pp replacements can make up for some of it - but you really are better off with an R3x0.

84. Re: the truth... is out there.... - Sep 18, 2003, 06:40

It's not as bad as you think it is. Carmack said the GFX IS good enough for D3; the lower precision is unimportant for the D3 engine. So all the games that come out in the next 3 years using the Doom3 engine will work perfectly on the GFFX (but who keeps their graphics card that long anyway?)
I think the important sentence he said was: "but that won't be a reasonable option in future games designed around DX9 level hardware as a minimum spec"
See?
When games are designed with DX9 as the MINIMUM SPEC... geez, how long will that take? Doom 3 was designed with DX6 as the minimum spec, right? (or is it DX7?)

Of course, if all you want is cutting-edge hardware all the time, you'll be changing your graphics card in 12 months anyway. So even if you did get screwed a bit by nVidia, it isn't so bad; next time you'll be more careful. I bought my 9800pro the week the 5900 came out. I read many reviews, wasn't really impressed, and saw that it was still a 2-slot card, which I thought was bullshit design from the start (even though they lowered the noise level from the 5800), since the 9800pro is already way shorter than a GF4/4600 and can be cooled with just a small cooler/HS... and it's quieter than a GF4 too!

83. Re: 26 WRONG - Sep 18, 2003, 06:31

Even my Audigy 2 can't do internal decoding.

You mean AC3 encoding (not decoding because all Audigy cards can do that).

But you are right. Even if nvidia had to close its graphics division (which will never happen), they still have plenty of other business in other market sectors to keep them alive...

nvidia will not go down because of this fiasco.

82. Re: HL2 & D3: hard to compare - Sep 18, 2003, 06:27

Instead of crying you should sell your FX's before it's too late and the prices fall too much, and get yourself a Raddy

Well, selling one's FXes would be an option if there were an alternative on the market. The problem is, there is no other option. Matrox cards are too slow and offer no decent driver support, and ATI cards still need to improve a lot in the driver department (they have gotten a lot better, but for professional and Linux use they are still worthless). That leaves us with the new XGI cards, but I doubt they will be able to keep up with nvidia or ATI...

So, what else is there to buy besides nvidia?

81. Re: No subject - Sep 18, 2003, 05:37

Great post there HBringer. Very informative. I give you a Score:5 Insightful and a bonus point for not being an "anonymous coward"

I quote:
"In a nutshell, Carmack is basically saying that to get equal performance out of the cards, he's had to go out of his way to create a special set of code for the nVidia hardware"

This is exactly the reason why Valve (Gabe) is pissed at Nvidia. He is sick of writing special (non-standard) code so that Nvidia's latest dx8 card, I mean "dx9" (ahem), can run his engine properly. ATI does not require this.

However, whether the D3 engine is "optimised" in favor of Nvidia (OpenGL) or not, we still have to consider that this engine will be used in many games in the future. The same goes for the HL2 engine.

Most probably:
ATI will run HL2 with better speed/quality than Nvidia,
but Nvidia will run D3 with better speed/quality than ATI.

This is an assumption and the reason I'm waiting to see with my very own eyes.

Looking at what API (DirectX/OpenGL) and more importantly what engines (HL2 or Doom3 - don't forget Unreal) the games of the next 3 years will be using would probably be a more accurate measure to decide which card to buy if money is not an issue.

Strength is irrelevant, resistance is futile. I am being assimilated by the ATI BORG... MEEP MEEP.
This comment was edited on Sep 18, 06:05.
80. Re: the truth... is out there.... - Sep 18, 2003, 05:31

'Actually, before nVidia hacked their 40-series drivers, 3DMark03 was telling us precisely the same thing as HL2's numbers do. Why else do you think nVidia went so hard after Futuremark to discredit their benchmark?'

IMHO - only in a partial sense. Many of the sites that did use 3DMark to evaluate possible future performance weren't even approaching it in the right manner - i.e. they used the total 3DMark scores rather than the individual test results (e.g. the Nature test, etc.).

Regardless, even without the driver cheats and looking at specific tests, AFAIK none of them produced results where there was an incredible 50+% performance gap between the R3x0 and NV3x, with the 9600pro being the equivalent of a 5900u in shader terms.

Now, we might not be able to point the finger too broadly given that there weren't any non-synthetic DX9 benchmarks around, but we can certainly blame specific sites for recommending the NV3x as a clearly better alternative to the R3x0 and a generally future-proof card when there were synthetic indicators to the contrary, as well as general knowledge about the ARB2/DX specs available to any properly savvy reader.

This comment was edited on Sep 18, 05:34.
79. Re: the truth... is out there.... - Sep 18, 2003, 04:39

"If you're looking for someone to blame, though, you might want to start looking at the review sites who use 3dmark2k3, Gunmetal et al (basically PS 1.4 benchmarks) as if they were really DX9 PS 2.0 benchmarks, and give the NV3x high marks for scoring 300 as opposed to 270 FPS in Q3 without even looking at image quality comparisons across driver revisions."

Actually, before nVidia hacked their 40-series drivers, 3DMark03 was telling us precisely the same thing as HL2's numbers do. Why else do you think nVidia went so hard after Futuremark to discredit their benchmark?
