DOOM 3 & DX9

With all the discussion lately about how Half-Life 2 will run on ATI hardware compared with NVIDIA accelerators, BonusWeb.cz shot off an email to id Software's John Carmack to ask how DOOM 3 and other games are likely to be impacted by the difference between the two graphics platforms running DirectX 9. Here's how he responded:
Unfortunately, it will probably be representative of most DX9 games. Doom has a custom back end that uses the lower precisions on the GF-FX, but when you run it with standard fragment programs just like ATI, it is a lot slower. The precision doesn't really matter to Doom, but that won't be a reasonable option in future games designed around DX9 level hardware as a minimum spec.
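
To make Carmack's distinction concrete: a renderer typically probes the driver's extension string at startup and picks a vendor-specific path if one exists. The sketch below is purely illustrative (the function and enum names are made up, and this is not id Software's code); it only shows how an OpenGL engine might prefer an NV30-specific, lower-precision back end and otherwise fall back to the generic ARB2 fragment-program path that every DX9-class card runs.

// Illustrative sketch only -- hypothetical names, not DOOM 3's actual code.
// Picks a fragment-program back end: an NV30 path that can use NVIDIA's
// lower-precision register types, or the vendor-neutral ARB2 path.
#include <GL/gl.h>
#include <cstring>

enum BackEnd { BACKEND_NV30, BACKEND_ARB2, BACKEND_FIXED };

static bool hasExtension(const char *name)
{
    // A real engine would tokenize the extension string rather than substring-match.
    const char *ext = reinterpret_cast<const char *>(glGetString(GL_EXTENSIONS));
    return ext != nullptr && std::strstr(ext, name) != nullptr;
}

BackEnd chooseBackEnd()
{
    // NV3x hardware exposes NV_fragment_program, which allows 16-bit (and lower)
    // precision registers -- the "custom back end that uses the lower precisions"
    // Carmack refers to.
    if (hasExtension("GL_NV_fragment_program"))
        return BACKEND_NV30;

    // Standard ARB fragment programs: the same path for every DX9-class card,
    // run at whatever precision the hardware provides (FP24 on R300).
    if (hasExtension("GL_ARB_fragment_program"))
        return BACKEND_ARB2;

    return BACKEND_FIXED; // fall back to fixed-function multitexture
}

Running the FX cards on the second branch is the "standard fragment programs just like ATI" case Carmack describes, and that is where the performance gap shows up.
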
118. RE: good thread
Oct 22, 2003, 21:11

Right folks, first of all I think you should all chill out getting so speculative over games that aren't even on the market yet. They're not out yet, so we don't know exactly how they'll perform, except that HL2 is tuned for ATI and D3 is tuned for the FX.

There's some very good techy jargon in some of these posts. Don't get me wrong, you know your stuff, but I also work in the hardware market. I work in the testing/returns dept of a PC hardware retailer (not some shitty company like PC World etc. ... I won't say who because I'm not giving free adverts to my employer!), so I have to test new products before they come out (through visits by manufacturer reps and such), test all current products, and see what fails on a regular basis, so I get a good view of products. I feel sorry for any of you with a Gigabyte Radeon 9700 Pro: nice card, shame about that fan.

Anyway, some people's ranting about both companies makes me wonder what planet you're on. I'm slightly disappointed by my 128MB Gainward 5900 at the moment, probably the same as most owners of a top-end FX card right now. Only slightly though, because despite all the talk of 'cheats' (which ARE also good optimizations; check the sky in UT2K with the 50-series Detonators compared to a Radeon) the card still kicks ass even if it doesn't live up to expectations.

We're all getting shafted by Valve/ATI and id/NVIDIA only because there will be clear hardware borders, and there's nothing we can do about it... but at least we'll be able to play one of the two very well. And anyway, if you have a 9800 or a 5900 and a nice system, what are you moaning about? You have a kick-ass system, and you'll probably have bought a new card by the time these games come out, judging by all the push-backs! Stop crying.

117. Will mine work?
Sep 23, 2003, 17:04

I have a Diamond Stealth video card in a 486 66MHz (overclocked to 68MHz). Is this DX9?

116. Re: mondovoodootunabonecurse
Sep 22, 2003, 17:59

I agree, Guano. I will only be angry with nVidia if this card doesn't run HL2 well. I don't care if it's a few frames under ATI's comparable card. I just don't want it chugging and choking while I'm playing, and I don't want to have to turn off all the cool effects. I put the money out for this 5900 so I wouldn't have to.

--
He cut the possum's face off then cut around the eye socket. In the center of the belt buckle, where the possum's eye would be, he has placed a small piece of wood from his old '52 Ford's home made railroad tie bumper. Damn, he misses that truck.
115. Re: mondovoodootunabonecurse
Sep 22, 2003, 15:34

If there are so many old-school people posting here, how many remember ATI and their great commitment to bad DRIVERS? I have been screwed every time I bought an ATI card, all the way from my first Mach64 to the last RADEON I used, which caused both Win2K and XP to reboot out of the blue for no reason. As an IT tech I spent lots of hours troubleshooting this and many other problems with ATI cards, drivers and OS incompatibility. I know they have gotten better, but honestly, how can I feel good about giving them another dollar of my money when I still feel like I never got my money's worth the first half a dozen times I chose to use them? WITH 3DFX AND NVIDIA I HAVE ALWAYS FELT I GOT MY MONEY'S WORTH AND THEN SOME. No wonder ATI had to change their tone to stay in business. My loyalty is not bought with a few benchmarks.

Sean

114. Re: SOUR GRAPES
Sep 21, 2003, 18:14

That may be true, but can you explain to me why the Doom 3 alpha ran much faster on a Radeon 9800 than a GFFX 5900, and how the 9800 has all sorts of problems running the closer-to-final game? Is it not because Carmack is still sending nVidia updated versions of the unfinished game so they can "optimise" (disable features for better performance) their drivers while leaving ATI out in the cold?

BTW, the GF Detonator drivers, after about v. 29, won't let you play the Doom 3 alpha. A lot of back-scratching going on, methinks.

113. Re: SOUR GRAPES
Sep 21, 2003, 08:27

Doom is based on the DX7 specifications; the GF5900 is the ultimate DX7 card, and that's why it runs Doom 3 faster. This isn't a joke. Carmack isn't holding back the Radeon's performance. You can get educated at www.beyond3d.com; just search through their forums.

112. Re: SOUR GRAPES
Sep 20, 2003, 07:12

Imagine the performance the Radeons would achieve if Carmack weren't still pissed at ATI for leaking the D3 alpha and was optimising the game for Radeons as well as GFs.

Carmack? Isn't it time you grew up and wrote the game for both sets of card owners? People aren't going to buy a $500 GF card just to play your game, and to the detriment of ATI.

If all other games play faster on an ATI card, then a lot of players may just forgo your game and enjoy what is playable on what they already have. Remember that D3's only real competition in the near future will be HL2, and Valve have already stated that nVidia cards are going to suck by comparison.

111. Re: mondovoodootunabonecurse
Sep 20, 2003, 06:04

"Being an old schooler, I purchased every card 3dfx made up until the GeForce came out"

Man, I miss 3dfx. I also would only buy 3dfx cards when they were in business. The Voodoo cards always gave me GREAT frame rates. I damn near defecated on myself when I heard they went under.

110. Re: mondovoodootunabonecurse
Sep 19, 2003, 15:23

I have an FX 5900 and I can honestly say that it has worked great so far. I've been playing Tron 2.0 with everything cranked up graphically and haven't seen a framerate or visual problem yet.

Of course I realize Tron 2.0 is no Half-Life 2, "graphically speaking", so I am still concerned about the performance I will see in HL2. Being an old schooler, I purchased every card 3dfx made up until the GeForce came out. Ever since, I have used Nvidia cards. I can honestly say that if I get screwed over on this $400 purchase, I will never buy from nVidia again, no matter what products they come out with in the future. Not only that, but I vow to take my Mossberg 590 Tactical Shotgun and my mini DV camera out to the desert here in Vegas, where I will record myself vaporizing this 5900. I will then use my video editing skills to produce a nice little presentation on the whole deal and why I will never purchase from nVidia again.

Sure, I will do this at the risk of looking like a whiner, but it will still be fun, and I'm betting it will be passed around the gaming community like wildfire... thus getting my message out there. I will be out $400, and I will have to spend another $400 or so for an ATI, but it will be well worth taking a stand.

I'm tired of getting boffed by nearly every company out there. There is more greed, corruption, and pure stupidity than at any other time in history. People can't get my goddamned fast food order right, nor can they send me WORKING computer parts. I'm sick of it, sick of it.

--
He cut the possum's face off then cut around the eye socket. In the center of the belt buckle, where the possum's eye would be, he has placed a small piece of wood from his old '52 Ford's home made railroad tie bumper. Damn, he misses that truck.
109. Re: mondovoodootunabonecurse
Sep 19, 2003, 13:22

I, too, have been a loyal Nvidia GeForce card buyer over the past 5 years. One reason was the great driver support. I had used ATI's cards in the past but found that driver support was lacking: great hardware, bad drivers. Now I am finding that Nvidia's last two Detonator driver updates have done more to screw up my gaming experience than ever before. I have a Ti4600 in a WinXP Pro machine. I have had to roll my system back after upgrading my Detonator drivers twice now, or suffer the slowed-down frame rate and other anomalies that have shown up after each driver upgrade (48 and 49). I am now considering ATI again, especially after seeing this latest controversy surrounding DX9. Nvidia is slipping as far as I am concerned.

108. Re: Some Thoughts
Sep 18, 2003, 22:04

Wasn't the whole big thing about the NV30 that new FX coding thing, whatever it was called? They were trying to create their own custom graphics language and features instead of focusing on what we already have, DX9. I guess that whole thing has been thrown out now; the last I ever heard of it being put in a game was that Gun Metal or something, with things like motion blur on planes. Serves them right for trying to create a special graphics language for their own cards.

107. Great article related to DX9 ATI vs Nvid
Sep 18, 2003, 21:56

http://www.tomshardware.com/graphic/20030918/index.html

Tom's has a great article that covers the basics of DX9, links to an article on Nvidia's "cheating", and talks about the new 50 Det drivers and their problems with not rendering frames correctly. It's a good read.

106. Re: Some Thoughts
Sep 18, 2003, 21:41

A couple of people replied that I'm wrong about ATI cards having a custom path in Doom 3. But like I said, it's not a big deal. In fact it makes even more of a case for ATI, I think.

I think that while it's still possible, we should allow nvidia (or whoever) to continue to use custom paths for benchmarking purposes, provided the image quality doesn't suffer. Obviously I'm talking theoretically here, as nvidia's "cheats" were more or less exactly that, but as far as Doom 3 goes, if it doesn't matter and the game developer is willing to put up with it, then so be it.

At the same time, I totally agree with HBringer that this shit needs to end. I think that this is the last generation of cards that should need custom paths, and the second-to-last generation that can't deal with graphical high-level languages seamlessly without massive compiler-level customization.

I hope that ALL the graphics card players will make cards that conform to the APIs, that Microsoft helps them make it possible with DX9, and that the OGL people get their shit together and standardize the best extensions.

Zoner made a couple of excellent posts there. I can understand Microsoft's actions up to now, but I think that graphics has become strange to the point where DX10 has to be done differently. Advancements need to be dictated by the users (game programmers) rather than by whichever video card group is making the next Xbox.
--
I like the Quake 3 Arena.
105. Re: Hardware
Sep 18, 2003, 21:08

This is a message board, so it can only be conjecture, intuition, and observation. Microsoft could have very easily provided hardware-specific shader variants in DX8 for properly exposing GF3/4 hardware, in addition to the dumbed-down interface. Instead, only the Radeon 8500 really got its own (PS 1.4), complete with hardware-specific hacks (phase markers), while the GF3/4 did not. Also, comparing the features you receive with hardware-specific vendor extensions in OpenGL with what you can do in DirectX clearly shows the discrepancies.

Nearly every new feature in DX9 is only available on the Radeon. While both cards can run in higher precision, only the Radeon has actual FP16 and FP32 (64- and 128-bit) floating point texture formats. Also, only the Radeon has separate alpha-blend operations for the alpha channel for framebuffer blends. Luckily both have the new two-sided stencil, which games like DOOM 3 use heavily for faster shadow volume calculations.
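
For what it's worth, here's roughly what that two-sided stencil setup looks like in a D3D9 shadow-volume pass. The render-state and enum names are the real DX9 ones; the surrounding function and the depth-fail convention shown are just a sketch (which facing increments versus decrements depends on the engine's winding and counting conventions).

// Sketch: one-pass shadow-volume stencil setup using D3D9 two-sided stencil.
// Without this feature the volume must be drawn twice, once per facing.
// 'device' is a hypothetical IDirect3DDevice9*; error checking omitted.
#include <d3d9.h>

void SetupShadowVolumeStencil(IDirect3DDevice9 *device)
{
    device->SetRenderState(D3DRS_STENCILENABLE, TRUE);
    device->SetRenderState(D3DRS_TWOSIDEDSTENCILMODE, TRUE);
    device->SetRenderState(D3DRS_CULLMODE, D3DCULL_NONE); // draw both facings in one pass

    // Front (clockwise) faces: bump the stencil where the depth test fails.
    device->SetRenderState(D3DRS_STENCILFUNC,  D3DCMP_ALWAYS);
    device->SetRenderState(D3DRS_STENCILZFAIL, D3DSTENCILOP_INCR);
    device->SetRenderState(D3DRS_STENCILPASS,  D3DSTENCILOP_KEEP);

    // Back (counter-clockwise) faces: undo it where the depth test fails.
    device->SetRenderState(D3DRS_CCW_STENCILFUNC,  D3DCMP_ALWAYS);
    device->SetRenderState(D3DRS_CCW_STENCILZFAIL, D3DSTENCILOP_DECR);
    device->SetRenderState(D3DRS_CCW_STENCILPASS,  D3DSTENCILOP_KEEP);
}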

For developing on the GeForceFX they recommend using shorter shaders (less fillrate burn), FP16 precision instead of full FP32, or coding the pixel shader in PS 1.4 (which, ironically, can be made to share with the Radeon 8500 series without much fuss).

After all this, the last remaining bit I find interesting is that the GeForceFX is actually more expressive with PS 2.0. Instead of making multiple shader versions like they did for DX8, they made a single version, but in classic Microsoft style added some capability flags you can check to see if you are allowed to do something 'cool' with the shader script that you wouldn't normally be able to do. The FX has all of these flags, while the Radeon has none. The fact that there is more than just one flag (there are 5) actually bothers me as a developer, because that probably means there's going to be some oddball third card from another vendor out there that only has some of these flags. End result: nobody is going to check any of the flags and will instead just run 'vanilla' 2.0 all the time for all hardware, possibly with all 'floats' replaced with 'half' in the script source to help out the ol' GeForceFX.
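
Here's a rough sketch of what checking those five PS 2.0 capability flags looks like. GetDeviceCaps() and D3DCAPS9 are the real DX9 API; the D3DPS20CAPS_* names are the optional-capability bits as I recall them from the SDK headers, and the reporting helper itself is hypothetical.

// Sketch: query the optional PS 2.0 capability bits described above.
#include <d3d9.h>
#include <cstdio>

void ReportPS20Caps(IDirect3D9 *d3d, UINT adapter)
{
    D3DCAPS9 caps;
    if (FAILED(d3d->GetDeviceCaps(adapter, D3DDEVTYPE_HAL, &caps)))
        return;

    const DWORD f = caps.PS20Caps.Caps;
    std::printf("arbitrary swizzle        : %d\n", (f & D3DPS20CAPS_ARBITRARYSWIZZLE)      != 0);
    std::printf("gradient instructions    : %d\n", (f & D3DPS20CAPS_GRADIENTINSTRUCTIONS)  != 0);
    std::printf("predication              : %d\n", (f & D3DPS20CAPS_PREDICATION)           != 0);
    std::printf("no dependent read limit  : %d\n", (f & D3DPS20CAPS_NODEPENDENTREADLIMIT)  != 0);
    std::printf("no tex instruction limit : %d\n", (f & D3DPS20CAPS_NOTEXINSTRUCTIONLIMIT) != 0);
    // Zoner's point: in practice nobody branches on these; everyone writes
    // plain ps_2_0 and maybe swaps 'float' for 'half' to help the FX.
}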

104. Re: Hardware
Sep 18, 2003, 19:06

Zoner -

Where do you get your information about Microsoft designing the spec "around ATI"?

And just because they came up with a new chip architecture doesn't mean they did anything bad or wrong; it's a designer's prerogative to come up with their own designs and plans. The standards are there to ensure that their designs still "play nice" with everyone else... What they do "under the hood" is up to them.

Take care,

--Noel "HB" Wade

103. Re: Hardware
Sep 18, 2003, 17:58

A lot of this problem is perception, as well as Microsoft to some extent. The GeForce line has steadily evolved from the TNT, each card significantly more capable than the previous one. This is also very true of the FX line over the GeForce 3/4. The 'problem' that came up is that ATI came up with a new architecture and essentially changed the rules by more than usual in the graphics industry.

The other half of this is Microsoft. When DX8 and Pixel Shader 1.1 (GF3), 1.3 (GF4), and 1.4 (Radeon 8500) came out, the features exposed from a GeForce 3 & 4 were probably only around 75% of the capability of the card. The shading language was dumbed down for compatibility with other cards (Matrox Parhelia, SiS Xabre, Trident BladeXP, etc.) and commonality with the Radeon 8500. I'm sure NVIDIA was not happy about that. Coding in OpenGL got you 100%, since they wrote their own custom extensions.
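
As a concrete illustration of the multiple-version situation Zoner describes, a DX8/DX9 game typically reads the card's shader version from the caps and picks the best path it has written. D3DCAPS9, PixelShaderVersion, and the D3DPS_VERSION macro are the real DX9 API; the function and enum here are hypothetical.

// Sketch: pick a pixel shader code path from the reported caps.
#include <d3d9.h>

enum ShaderPath { PATH_PS20, PATH_PS14, PATH_PS11, PATH_FIXED };

ShaderPath ChooseShaderPath(const D3DCAPS9 &caps)
{
    if (caps.PixelShaderVersion >= D3DPS_VERSION(2, 0))
        return PATH_PS20;  // Radeon 9xxx / GeForce FX
    if (caps.PixelShaderVersion >= D3DPS_VERSION(1, 4))
        return PATH_PS14;  // Radeon 8500 (the FX can also run this)
    if (caps.PixelShaderVersion >= D3DPS_VERSION(1, 1))
        return PATH_PS11;  // GeForce 3/4 Ti and friends
    return PATH_FIXED;     // older fixed-function hardware
}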


Now DX9 comes out, and MS totally designs PS 2.0 around the Radeon 9xxx series. I'm certain NVIDIA was burned by MS a bit by the PS 2.0 spec, more so than they were in DX8 with PS 1.1 and 1.3. Why this happened is a huge speculative mess, but it did happen. End result: developers must spend some time tuning for NVIDIA for a change, instead of some other card. We're pretty much down to 4 vendors, which is a hell of a lot better than around 15 a few years ago. The games that come out are going to run fine on all these cards, because the developers get the pleasure of making that happen.

The good news in all this is that all the cards are great at vertex shader performance. The pixel shaders are only a problem since this is where you burn all of your fillrate, which limits your resolution and ability to run with AA (not that you'll really be able to run with AA on future games, since the developers like burning all the fillrate they can get). The 'reduced' precision mode of the FX is still a whole hell of a lot better than the 8 or 9 bits you got from the previous generation of cards.

102. "Standards"
Sep 18, 2003, 17:27

Again, to clarify the situation:

The idea behind standards like DirectX and Pixel-Shader specs is to provide two things:

1) Provide a standard set of programming functions and interfaces for programmers to use

2) Provide a standard set of features that all hardware should implement completely, if it wants to claim that it "meets the spec". Ideally, the features should all act / look the same - though performance will probably differ between products.

The idea behind Microsoft having these standards is that it can use its market position, money, and influence to coerce hardware-makers to conform to the spec a LOT better than trying to get them to do so voluntarily. OpenGL is based around committees and consortiums of manufacturers & developers. As such, it takes longer for revisions to the spec to be approved; and it is up to each individual to implement and "conform to" the spec (and look at how many years it's been, and how many chips & drivers STILL don't implement all of the older OpenGL stuff properly). Having a monopolistic power like MS drive development may not be the ideal solution - but it gets results; and for gamers it's a "win".

According to what we've seen, nVidia's card does NOT meet the PS 2.0 spec of using FP24 (24-bit) precision. Therefore, a true apples-to-apples comparison of DirectX 9 / PS 2.0 features leaves nVidia behind, because it must deal with the "doubled-up 16-bit mode" that it has to run to meet/exceed the spec'ed 24-bit mode. ATi runs this mode natively and doesn't suffer any performance penalties as a result.
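
As a rough numeric illustration of those precision levels (assuming the usual bit layouts: FP16 = 1 sign / 5 exponent / 10 mantissa bits, R3xx FP24 = 1/7/16, FP32 = 1/8/23), this tiny sketch prints the relative step size near 1.0 for each format.

// Sketch: back-of-the-envelope precision comparison of shader formats.
// "Relative step" here is 2^-mantissa_bits, the spacing of values near 1.0.
#include <cmath>
#include <cstdio>

int main()
{
    const struct { const char *name; int mantissaBits; } fmts[] = {
        { "FP16 (NV3x partial precision)", 10 },
        { "FP24 (R3xx, PS 2.0 minimum)",   16 },
        { "FP32 (NV3x full precision)",    23 },
    };
    for (const auto &f : fmts)
        std::printf("%-32s relative step ~ %.2g\n",
                    f.name, std::pow(2.0, -f.mantissaBits));
    return 0;
}
// Prints roughly 0.00098, 1.5e-05, and 1.2e-07: FP16 steps are coarse enough
// to show banding in long shader chains; FP24 and FP32 generally are not.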

So using the STANDARD as a measuring tool, ATi steps out ahead of nVidia. Can nVidia beat ATi? Certainly, under certain circumstances: when not using DX9, PS 2.0 shaders, or anything higher than 16-bit precision for effects, nVidia can still out-perform ATi.

The thing is, as games get more complex, developers are going to have less and less time to devote to "hand coding" alternatives and special code-paths. The "standards" are becoming more popular and more important all the time; and increases in game-complexity will only drive this further along.

Look at how much automation and specialization is being built into art & architecture tools for games these days... Soon, some of those automated / "assistance" aspects are going to need to be incorporated into programming environments, to enable games to be developed on a reasonable time-scale. Engine-licensing was a "first step" in this direction - automation via the use of other people's existing code. However, this can be awkward; and the learning time for someone else's engine eats into the time-benefits of licensing it in the first place. Furthermore, you are constrained to all the limits and compromises someone else made; and they usually made them without ANY idea of what you want to do with their code.

I foresee eventually having integrated development environments that largely automate the basic processes of coding a graphics engine; and the increasing adoption of & conformity to "standards" will help this as well - as the compilers and development tools will be MUCH more feasible if they can create "vanilla" code (automatically or with minimal direction) that works for everyone (and since it's defined by you, it doesn't suffer the problems of being "someone else's code" like engine licensing). The dominant paradigm will then shift to programming as a means to "define the rules" of the game (physics, interaction, movement, victory conditions, etc.); and you will see a "split" in the game-programmer role... Part of these people will end up working on the "development environment" and "tools" - pushing the latest and greatest visuals & physics routines, and optimizing the automated aspects of the development tools. The others will become more like "game designers who code", in the sense that their programming will have much more to do with gameplay, and less with hardware access, memory-management, etc. Art will still be the biggest bottleneck in games development, although the tools for that are getting better - however, increases in detail/fidelity are still outpacing the development of most art/architecture tools.

Wow, talk about a rambling topic-diversion!

Take care,

--Noel "HB" Wade


101. Re: the truth... is out there....
Sep 18, 2003, 17:09

No matter how you approached it, the initial 3DMark03 results told us precisely what has been discovered today: namely, that the FX series was strongest in GT1, which was largely DX7-style rendering. It could have been stronger there because the FX does multi-texturing two-pass stuff well, and that was one of nVidia's complaints - namely that the test was only a single-texture one for the most part.

As you increase the level of shader usage, the FX's performance dropped off. When you got to GT4, the 5800 displayed performance equivalent to a 9500 (which was ATi's medium-performance card of the time). That was reflected in the scoring, with the 5800 Ultra getting around 2600 3DMarks and the 9700 Pro scoring around 4300. Wanna guess what a 9500 scored? About 2300, as I recall.

Remember it was the rectification of this quite large performance gap, along with the noticeable image quality drop in 3DMark03, that kicked off a lot of the investigation into precisely what the drivers are doing. We have had a pretty clear indication, with 3DMark03, Codecreatures, ShaderMark and the like all telling the same story: shader performance was lacklustre in the extreme on the NV3x series.


100. No subject
Sep 18, 2003, 17:05

"Also, don't forget that the top ATI cards also have a specific optimized path in Doom3"

No, they don't.

The 8500 has an R200 path, and that's it.

ATi is forced to run the default ARB2 code path that was originally intended for all DX9-class cards to use. It's the same path that the Volari (XGI) and DeltaChrome will use.

Nvidia is the only company with such an extensive specially-made rendering path.

Pentium 4 2.4B 533 FSB
I850EMD2
512mb 40ns Rdram
Radeon 9700pro 330/330
Win Xp
The Whales name is Bob.
99. Re: Some Thoughts
Sep 18, 2003, 16:47

Wait, I thought it was a generic path for ATI cards in Doom 3 too, since Nvidia cards can run on the same path, just at half speed. If Nvidia didn't have this problem, they'd both be running on the same path, something called the ARB2 path, I'm not sure. But if it was a specific path for ATI cards, why would they be able to run Nvidia's card with it too, and then bother to complain that it's slower? I think the path ATI is using in D3 is what was meant for both cards to run on, but they ended up having to make an NV30 path. If ATI had their own path, wouldn't it be called an R300 path, not that ARB2 thing?
