DOOM 3 & DX9

With all the discussion lately about how Half-Life 2 will run on ATI hardware compared with NVIDIA accelerators, BonusWeb.cz shot off an email to id Software's John Carmack to ask how DOOM 3 and other games are likely to be impacted by the difference between the two graphics platforms running DirectX 9. Here's how he responded:
Unfortunately, it will probably be representative of most DX9 games. Doom has a custom back end that uses the lower precisions on the GF-FX, but when you run it with standard fragment programs just like ATI, it is a lot slower. The precision doesn't really matter to Doom, but that won't be a reasonable option in future games designed around DX9 level hardware as a minimum spec.
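
To make the "custom back end" concrete: a renderer of this kind picks among back ends at startup based on which OpenGL extensions the driver exposes. Below is a minimal sketch of that sort of path selection, with hypothetical names; it is illustrative only, not id's actual code.

/* Minimal sketch of extension-based render-path selection, assuming a
 * GL context is already current.  All names here are hypothetical. */
#include <string.h>
#include <GL/gl.h>

static int has_extension(const char *name)
{
    /* Era-typical substring check; a robust version would match whole
     * space-delimited tokens to avoid prefix false positives. */
    const char *exts = (const char *)glGetString(GL_EXTENSIONS);
    return exts != NULL && strstr(exts, name) != NULL;
}

typedef enum { PATH_NV30, PATH_ARB2, PATH_FIXED } render_path_t;

static render_path_t choose_render_path(void)
{
    /* Prefer the NV30 path where available: NV_fragment_program lets
     * shaders declare 16-bit ("half") precision, which NV3x hardware
     * runs much faster than full-precision code. */
    if (has_extension("GL_NV_fragment_program"))
        return PATH_NV30;

    /* The generic ARB2 path; R300-class ATI boards run this at full
     * speed because they always compute at their native 24 bits. */
    if (has_extension("GL_ARB_fragment_program"))
        return PATH_ARB2;

    /* Fall back to fixed-function texturing on older hardware. */
    return PATH_FIXED;
}

The renderer would then branch on the returned path when binding its per-surface shading code, which is why the same game can be fast on one vendor's "native" path and slow on the generic one.
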
78. Re: The Impact Effect - Sep 18, 2003, 04:25
"...standards. Sometimes a natural transformation of this nature is inevitable where there is no alternative like in the case of noun->verb 'to e-mail', but I'd argue that 'affected' remains the proper verb for formal writing."

I agree entirely with you about "impacted"; it is a dreadful construction. I equally disagree with 'to e-mail', though: what is wrong with sending an email? Or perhaps writing or composing one? I don't 'letter' people; I write them letters (and no, I don't promise to 'write you' either).

Mind you, I also make telephone calls instead of phoning, but that battle is well and truly lost.

Pedantically yours,
Anvil - from the land of warm beer and mad cattle.
77. Re: The Impact Effect - Sep 18, 2003, 04:07
"Being the quirky type"

Yes... yes, you are.

76. Re: The Impact Effect - Sep 18, 2003, 04:06
[OT English point]
Sorry, but the lit person in me has to disagree with you there, mate... 'Impacted' might be acceptable in colloquial or hasty speech when you're trying to mask repetition or embellish the sound, but really I think it is a prime case of a noun that has been turned into a verb by haphazard journalism and the general decline in English standards. Sometimes a natural transformation of this nature is inevitable where there is no alternative, as in the case of the noun->verb 'to e-mail', but I'd argue that 'affected' remains the proper verb for formal writing.

This comment was edited on Sep 18, 04:12.
75. Re: The Impact Effect - Sep 18, 2003, 03:37
PR = Public Relations.

I wonder if id are looking to weave their www.ua-corp.com website into the game. Now that would be cool. Maybe you could find logging consoles in the levels and upload your status for the world to see or something.
.
.
.
Just wondering. Nothing to do with graphics cards. Sorry.

This comment was edited on Sep 18, 03:47.
74. Re: The Impact Effect - Sep 18, 2003, 03:37
Being the quirky type, I have to take up the gauntlet on this one (not that there isn't a lot of bad English around these days!):

Actually, "impacted" is the better word to use here, since we're talking about a detrimental performance difference - and ,when not used as a verb - like an asteroid hitting something, "impacted" carries more of a negative connotation than "affected". Think about it - when the dentist tells you your wisdom teeth are trapped below the gums, he calls them "impacted"; not "affected". "Affected" has more of a connotation of "change" or a relative comparison between multiple things. This was a directed comment about the detrimental performance of a particular object - hence "impacted".

;-P

--Noel "HB" Wade
Devil's-Advocate-In-Training


73. Re: The Impact Effect - Sep 18, 2003, 03:33
What does PR mean? I'm guessing something with research - Product Research?

I think you'll like this:
http://www.fanta.dk/showmovie.asp?mid=3A888102-89FC-4D0B-BDA1-C06206345949

If you want more, check out this thread:
http://rage3d.com/board/showthread.php?s=&threadid=33711119


This comment was edited on Sep 18, 03:39.
72. The Impact Effect - Sep 18, 2003, 03:13
Why does everyone now write "impacted" instead of "affected"? Did someone pass a law declaring the latter not "impactful" enough for everyday use?

Sorry, just a quirky pet peeve of mine.

71. No subject - Sep 18, 2003, 02:16
Does anyone know who the design engineer at NVidia was for the last two GPUs, the NV30 and NV35? So we have someone to BLAME for this crap and fraud. Or is it more the EVIL PR department?

Scorpio Slasher: ... What about you, boy, what do you hate?
Marcus: ... Bullies. Tiny d*ck egotists who hurt people for no reason, make people lock their doors at night. People who make general existence worse, people like you.
70. Re: The vidcard market - Sep 18, 2003, 00:27
"dependent not so much upon actual quality as upon perceived quality"

Ah, but is the quality level you perceive not the actual quality level for you, and thus not the only one that matters?



Creston


69. The vidcard market - Sep 18, 2003, 00:16
Ya know, folks, reading all this, it occurs to me that the success or failure of a GPU corp like NVIDIA, ATI, or (once upon a time) 3dfx is dependent not so much upon actual quality as upon perceived quality.
The internal workings of these things have become so complex that one really can't keep up with all the advances unless you have either (A) too much time, or (B) a job working with this sort of thing in some capacity. (And I say this from the standpoint of being darned good at both programming and VLSI design, if I do say so myself.)

So is it really any wonder that ATI gave Valve a fat sack of cash? And that Valve subsequently said that the new NVIDIA cards sucked? Or that NVIDIA was investigated by the SEC (although, I must confess, I am ignorant as to the details of that)? Or that GPU companies have a habit of engaging in quite undignified displays of one-upmanship?

I think we should take all of this with a grain of salt. It doesn't matter, at least to me, whose top-of-the-line card is slightly faster, because spending twice as much to get 10% more frames per second that your monitor cannot display and your eyes cannot see is futile.

What *I* care about is what's going to give me decent performance on the current crop of games, for the least amount of my hard-earned dollars. When I notice that the new titles are starting to chug, I read a few reviews, and buy whatever seems to fit the bill at the time.

Perhaps we ought to be grateful that this game of king-of-the-video-card-hill is pushing the amount of resources that game companies can draw on to make their stuff look great, but, all the same, it's not something that most of us need to worry about unless we enjoy doing so.

This comment was edited on Sep 18, 00:16.
68. PCI Express - Sep 18, 2003, 00:14
AGP out, PCI Express in?
I've been hearing that future cards will be based around this interface.
Makes upgrading choices even more complicated than they already are now.....

67. Re: the truth... is out there.... - Sep 18, 2003, 00:10
'I remember reading from Carmack or yadda yadda yadda that either the 5900 or the 9800 was the card to own to run the new games. WTF did this change? At what point in the developer updates did nVidia all of a sudden start to blow so hard in their opinion?'

Nothing has really changed. Hell, Carmack in his .plan from ages ago clearly indicated that FX cards performed full precision ARB2 rendering/shaders at half the speed of ATi solutions.

Part of the problem is that people are assuming the D3 engine is indicative of future performance with the DX9-level feature set. This is not the case. The fact is that D3 isn't shader-dependent the way HL2 is with DX9. Rather, it is dependent on incredible bandwidth, multitexturing fill rate, fast stencil, and fixed-function dot-3 shading - things the NV3x is actually very good at. (Check out the Beyond3D forums for more good info on this.) Furthermore, the D3 performance numbers are even more misleading for DX9 because they are in OpenGL (NVIDIA's strength) and Carmack has specifically coded the game with a custom NV3x path and FP16 shaders so that the register problem doesn't show...
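
For the curious, "fixed-function dot-3 shading" refers to computing the per-pixel bump term with the texture-environment combiners rather than with fragment programs. Here is a rough sketch of that setup using the standard ARB_texture_env_combine / ARB_texture_env_dot3 extensions - my own illustrative code, not anything from D3 itself:

/* Rough sketch of fixed-function DOT3 bump mapping.  Assumes a normal
 * map is bound on texture unit 0 and the per-vertex light vector has
 * been range-compressed into the primary color. */
#include <GL/gl.h>
#include <GL/glext.h>

void setup_dot3_bump(void)
{
    glTexEnvi(GL_TEXTURE_ENV, GL_TEXTURE_ENV_MODE, GL_COMBINE_ARB);
    glTexEnvi(GL_TEXTURE_ENV, GL_COMBINE_RGB_ARB,  GL_DOT3_RGB_ARB);

    /* N comes from the bound normal map... */
    glTexEnvi(GL_TEXTURE_ENV, GL_SOURCE0_RGB_ARB,  GL_TEXTURE);
    glTexEnvi(GL_TEXTURE_ENV, GL_OPERAND0_RGB_ARB, GL_SRC_COLOR);

    /* ...and L from the interpolated vertex color, so the combiner
     * computes dot(N, L) per pixel with no fragment program at all. */
    glTexEnvi(GL_TEXTURE_ENV, GL_SOURCE1_RGB_ARB,  GL_PRIMARY_COLOR_ARB);
    glTexEnvi(GL_TEXTURE_ENV, GL_OPERAND1_RGB_ARB, GL_SRC_COLOR);
}
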

If you're looking for someone to blame, though, you might want to start with the review sites that use 3DMark03, Gunmetal, et al (basically PS 1.4 benchmarks) as though they were really DX9 PS 2.0 benchmarks, and that give the NV3x high marks for scoring 300 as opposed to 270 FPS in Q3 without even looking at image-quality comparisons across driver revisions.

So many supposedly decent hardware sites were implicated in this, and rather than fess up to their erroneous conclusions, they tried to insinuate that Valve and ATi had some conspiracy going on - it was pathetic.

This comment was edited on Sep 18, 00:27.
66. Re: the truth... is out there.... - Sep 18, 2003, 00:06
So, when's the Radeon 9900 coming out? I have a 9800 on order that I'll use until the 9900 comes out (if it's truly much better). Most of the games I'm playing now, though, run better on my GF-FX 5900 Ultra.

The latest 3.7 drivers for my 9700 at work are finally working right. Before that I had issues with the 9700 and was cursing myself for wasting money on it, but I'm glad to see they've finally gotten at least this phase right. Goodness knows NVIDIA has been kicking their ass for the past several years. We'll have to see if ATI can hold onto the lead, or if they'll fade like AMD did with their Athlons vs. the P4.

This comment was edited on Sep 18, 00:07.
65. Re: the truth... is out there.... - Sep 18, 2003, 00:00
"I remember reading from Carmack or yadda yadda yadda that either the 5900 or the 9800 was the card to own to run the new games. WTF did this change? At what point in the developer updates did nVidia all of a sudden start to blow so hard in their opinion?"

I remember Carmack talking about how the TNT2 was the card to get to run Quake 3 Arena. My brother went out and bought the card before the game came out. So when Q3 was finally released, we came home excited as hell and he fired up Q3. He got worse frames from the TNT2 than I got from my Voodoo 2 that was at least two years older. So I agree: all these developers and reviewers say such-and-such has a great card and it's the saving grace, and then come game time the cards do not perform. So for everyone thinking of upgrading for a specific game: wait until the game comes out and you can get some hard numbers on a card's performance in that game.

64. Re: 26 WRONG - Sep 17, 2003, 23:52
Care to verify your source? For the NV3x being designed by former 3dfx engineers, and the NV4x being designed by the people behind the GeForce 4?

IIRC, 3dfx was bought out well before the GeForce 4 was released, right?

Then what were those 3dfx engineers doing over at NVIDIA all that while before the FX series?

Twiddling their thumbs?

This comment was edited on Sep 17, 23:55.
63. Re: 26 WRONG - Sep 17, 2003, 23:42
No. That is not why 3dfx died. 3dfx did this just fine with their early Voodoo cards, and extremely successfully with the V3. What happened to 3dfx was a monster called the TNT2 Ultra, and its much bigger, ass-kicking brother called the GeForce 256.

It never was a contest, and it was a fucking demolition when comparing the V5 to the GF.

That's why 3dfx died.


You also forgot about their insistence that you didn't need 32-bit color or more RAM on the video card. When I first saw 32-bit color in Quake III, I was totally blown away. My SLI Voodoo 2 setup looked like shit compared to the TNT2 Ultra.

62. Re: 26 WRONG - Sep 17, 2003, 23:38
"Sounds like the same will happen to Nvidia."

I doubt that. They have a very good chipset for AMD, and SoundStorm is the only solution that can do DDS properly - even my Audigy 2 can't do internal decoding. NVIDIA isn't as dumb as 3dfx; granted, they made the mistake of using 3dfx engineers on the NV3x series, but from what I hear the NV4x will be done by the same team that worked on the GeForce 4 series. NVIDIA has too much money in its pockets to die out, just like Intel will always be around no matter how much better AMD's CPUs may be on a clock-for-clock basis.

61. the truth... is out there.... - Sep 17, 2003, 23:09
" I spent my hard earned money on a card that became obsolete before it can even do what it was meant to do."

Join the club, dude; this is always the case. A card comes out, be it an ATI or nVidia product, and the games meant for these cards come along one to two years AFTER the card arrives. But by then, so do three more vid cards from each manufacturer.

"I guess the perception of "future" means different things to different pple but all those reviews of the NV35 being "future proof" are pure bullshit,unless those same reviewers upgrade whenever a new chipset comes out and dump their "old" cards in the trashcan."

The term "future proof" is, and has always been, nothing more than pure bullshit. Inevitably we see this term in all of nVidia's PR for their cards in previews, going all the way back to the GeForce 256 days. I have owned a 256, a GeForce Ti 500, a Ti 4600, and lastly my 5900 Ultra. So much for future proof, eh?

And yeah, these (asshole) reviewers usually get their cards for free. Take 3DGPU: they freaking raved about the 5900s, then other not-so-thrilling info arrived about the 5900s, and now the 5900 is no longer the saving grace for nVidia - it is again as bad a whipping boy as the 5800, or even worse.

I remember reading from Carmack or yadda yadda yadda that either the 5900 or the 9800 was the card to own to run the new games. WTF did this change? At what point in the developer updates did nVidia all of a sudden start to blow so hard in their opinion?

Buying a new card is no biggie for me; I have disposable income set aside for all my hobbies. But this political crap over which card is best to run whatever games is killing my interest in gaming as a hobby.

Not only do I have just one or two new games a year to look forward to because of the dev time required for new games... but gamers cannot trust the card manufacturers anymore to deliver a trustworthy product. Now add a new card to either manufacturer's line every six months or so. At a time when gaming seems to be really entering a new evolution, the reviewers and card developers are systematically screwing gamers left, right, and center, IMHO.





60. Re: No subject - Sep 17, 2003, 23:05
Just to clarify all this precision stuff that seems to be confusing people:

The Pixel Shader 2.0 spec is 24-bit minimum, with support for up to 128 bits (that technology is a year or two off, at least). This precision is a hardware issue; it goes beyond DirectX vs. OpenGL. It's more evident in DirectX because there are other things DirectX does that expose the disparity more than OpenGL does, partly because DirectX evolves a LOT faster these days than OpenGL, so it gets standard features that take advantage of new hardware a lot sooner. (MS makes all the decisions it wants, which is good and bad; OpenGL is run by a consortium, which means committees have to debate and compromise over months or years to get new features or code approved as "standard".)

Anyway, you can run ATi at the 24-bit spec. From what I understand, nVidia runs at either 16-bit OR 32-bit (16 "doubled up"). If you want to run PS 2.0 / DirectX 9 stuff, you have to run at 24-bit or higher. Therefore, the nVidia cards suffer a performance hit because of their inability to run the 24-bit stuff natively: they have to "double up" the 16-bit functionality and eat the performance hit that this entails.

What Carmack has said is that if you code things especially for "lower precision" (i.e. the 16-bit mode that nVidia does by default), then the card works just as well as the ATI; but running in "full precision" (24- or 32-bit mode), the nVidia suffers a performance penalty.

In a nutshell, Carmack is saying that to get equal performance out of the cards, he's had to go out of his way to create a special set of code for the nVidia hardware, while the ATI can run "standard code" at a decent level. The amount of the performance disparity isn't specified, but if it were 1 or 2 frames per second, I'd bet money that Carmack wouldn't have taken the time to write a custom graphics routine for it!
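
To illustrate the kind of knob involved (a sketch of one possible approach, not Carmack's actual code): the ARB_fragment_program extension has a real "precision hint" option that lets an application invite the driver to drop to 16-bit math, which is exactly what an NV3x-friendly path wants:

/* Sketch: uploading an ARB fragment program that opts into reduced
 * precision via the (real) ARB_precision_hint_fastest option.  On
 * NV3x the driver may then use 16-bit registers; R300 ignores the
 * hint and computes at its native 24 bits either way. */
#include <string.h>
#include <GL/gl.h>
#include <GL/glext.h>

static const char *fp_fast =
    "!!ARBfp1.0\n"
    "OPTION ARB_precision_hint_fastest;\n"  /* allow FP16 on NV3x */
    "TEMP base;\n"
    "TEX base, fragment.texcoord[0], texture[0], 2D;\n"
    "MUL result.color, base, fragment.color;\n"
    "END\n";

/* The entry point must be fetched via wglGetProcAddress /
 * glXGetProcAddress on real systems; it is passed in here for brevity. */
void upload_fast_program(PFNGLPROGRAMSTRINGARBPROC glProgramStringARB)
{
    glProgramStringARB(GL_FRAGMENT_PROGRAM_ARB,
                       GL_PROGRAM_FORMAT_ASCII_ARB,
                       (GLsizei)strlen(fp_fast), fp_fast);
}
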

Glad I've got a little time to sit back and watch before I upgrade my aging GeForce2!!

Take care,

--Noel "HB" Wade


59. Re: ADOLESCENT CONSOLES - Sep 17, 2003, 22:59
Heh, you can't catch sarcasm very well, can you?
