DOOM 3 & DX9

With all the discussion lately about how Half-Life 2 will run on ATI hardware compared with NVIDIA accelerators, BonusWeb.cz shot off an email to id Software's John Carmack to ask how DOOM 3 and other games are likely to be impacted by the difference between the two graphics platforms running DirectX 9. Here's how he responded:

Unfortunately, it will probably be representative of most DX9 games. Doom has a custom back end that uses the lower precisions on the GF-FX, but when you run it with standard fragment programs just like ATI, it is a lot slower. The precision doesn't really matter to Doom, but that won't be a reasonable option in future games designed around DX9 level hardware as a minimum spec.
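Carmack's point that "the precision doesn't really matter to Doom" can be made concrete with a small numerical sketch. This is an illustrative Python model, not anything from DOOM 3's renderer: it rounds values to the approximate mantissa widths of the formats under discussion (FP16 ≈ 10 bits, FP24 ≈ 16 bits, FP32 = 23 bits; the FP16/FP24 widths are assumptions based on the common s10e5/s16e7 layouts) and compares the round-off against the quantization step of an 8-bit display channel.

```python
import math

def quantize(v, mantissa_bits):
    """Round v to a float with the given number of mantissa bits
    (a rough stand-in for FP16=10, FP24=16, FP32=23 mantissa bits)."""
    if v == 0.0:
        return 0.0
    m, e = math.frexp(v)                 # v = m * 2**e, with 0.5 <= m < 1
    scale = 2 ** mantissa_bits
    return math.ldexp(round(m * scale) / scale, e)

# Worst-case round-trip error for colour values in [0, 1]:
for bits, name in [(10, "FP16"), (16, "FP24"), (23, "FP32")]:
    err = max(abs(i / 1000 - quantize(i / 1000, bits)) for i in range(1001))
    print(f"{name}: max error {err:.2e}  (8-bit display step is {1/255:.2e})")
```

Even the FP16 error stays well below one 8-bit display step, which is why a single-pass quantization is invisible on screen; the argument for higher precision is about error accumulating across many shader operations, not about any one rounded value.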

118 Replies. 6 pages. Viewing page 4.
< Newer [ 1 2 3 4 5 6 ] Older >

58. No subject Sep 17, 2003, 22:57 OneNightStand
 
I think Nvidia should offer a trade-in rebate for their NV40 (well, if they screw that chip up too, then I don't know what to say).

But then again, I am just dreaming.

I've tried to sell my FX5900, but the response wasn't too good, maybe because I clearly stated that the FX5900 isn't likely to perform well in DX9.

 
 
57. Re: ADOLESCENT CONSOLES Sep 17, 2003, 22:52 Hellbinder
 
I've been following this whole saga closely because I own an FX5900, and yes, as Gabe said, I am pissed off. Not because ATI has better FPS, or Nvidia is a cold-faced cheater, or any of that fanboy shit, but because the bottom line is that I spent my hard-earned money on a card that became obsolete before it could even do what it was meant to do.
Guess what. Earth to man with 5900.

ATI does have better FPS, and Nvidia damn well is a *cold-faced cheater*. None of which has anything to do with *fanboy shit*.




Pentium 4 2.4B 533 FSB
I850EMD2
512mb 40ns Rdram
Radeon 9700pro 330/330
Win Xp
 
The Whale's name is Bob.
 
56. Re: No subject Sep 17, 2003, 22:48 Hellbinder
 
nVidia can do FP16 and FP32. ATi can only do FP24, the minimum spec for DX9. Sure, you can praise nVidia for trying to raise the bar by doing FP32. But then you have to wonder why they put in FP16, especially since they write their drivers to operate in FP16 and not FP32.
I've got news for you, gents: Nvidia will lose to ATI with FP16 as well. They do not get a significant speed advantage by dropping to FP16.

Nvidia gets a speed advantage by rewriting entire portions of DX9, or of high-precision OpenGL, to PS1.4/FX12. Carmack is dropping 90% of Nvidia's code to FX12 or DX8-level code.
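Since FX12 comes up here: it is a 12-bit fixed-point register format, which is why falling back to it is a DX8-era move rather than a DX9 one. A toy Python sketch of that kind of quantization follows; the [-2, 2) range and 1/1024 step are assumptions for illustration, not Nvidia's documented behaviour.

```python
def fx12(v):
    """Quantize to a toy 12-bit fixed-point format: range [-2, 2),
    step 1/1024 -- a rough stand-in for an FX12-style register."""
    step = 1.0 / 1024.0
    return max(-2.0, min(2.0 - step, round(v / step) * step))

# A single quantization step (1/1024) is finer than an 8-bit display
# step (1/255), but unlike floating point the absolute step never
# shrinks for small values, so dim colours lose detail outright.
print(fx12(0.5))      # exactly representable
print(fx12(0.0001))   # rounds to 0.0 -- detail below the step is gone
print(fx12(3.0))      # clamps to the top of the range
```

Floating-point formats keep roughly constant *relative* error; fixed point keeps constant *absolute* error and clamps, which is the quality trade-off behind the mixed paths discussed in this thread.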




Pentium 4 2.4B 533 FSB
I850EMD2
512mb 40ns Rdram
Radeon 9700pro 330/330
Win Xp
 
The Whale's name is Bob.
 
55. Re: No subject Sep 17, 2003, 22:40 Maskirovka
 



This comment was edited on Sep 17, 22:43.
 
 
54. Re: HL2 & D3: hard to compare Sep 17, 2003, 22:32 Enahs
 
Instead of crying you should sell your FXs before it's too late and the prices fall too much, and get yourself a Raddy
I'd buy 'em, $20... hell, I'll give you $50 (depending on the model).
Still better than what I've got, ATM.

"Plaque is a figment of the liberal media and the dental industry to scare you into buying useless appliances and paste. Now Iíve read the arguments on both side, and I havenít found any evidence yet to support the need to brush your teeth. Ever."
 
 
I am free of all prejudice. I hate everyone equally.
- W. C. Fields
 
53. Re: HL2 & D3: hard to compare Sep 17, 2003, 22:31 ReDeeMeR
 
Instead of crying you should sell your FXs before it's too late and the prices fall too much, and get yourself a Raddy

 
 
52. HL2 & D3: hard to compare Sep 17, 2003, 22:27 MrJonesPY
 
Well, HL2 uses DX9 and D3 uses OpenGL.
Also, ATI uses FP24 and Nvidia FP32 when running at full precision.
So how can we compare any of this?

Should Carmack update his .plan? I still don't believe that he replied to the email.

This comment was edited on Sep 17, 22:46.
 
 
51. No subject Sep 17, 2003, 22:25 OneNightStand
 
What Carmack means is that DOOM 3 doesn't give a rat's arse about precision, but wasn't he the one who kept saying that 32-bit precision shaders are the "future"?

Damn, these people are trying to confuse us.

Ahem, isn't the FX series supposed to be DX9-level hardware?! This might be seriously funny if I didn't own one.

Anyway, not all developers are willing or able to spend the time or resources holding Nvidia's hand like id or Valve, and those stuck with the current FX series are basically holding on to a dying piece of hardware that can't even do what it was meant to do properly, meaning it doesn't meet the minimum DX9 spec of 24-bit shader precision.

I've been following this whole saga closely because I own an FX5900, and yes, as Gabe said, I am pissed off. Not because ATI has better FPS, or Nvidia is a cold-faced cheater, or any of that fanboy shit, but because the bottom line is that I spent my hard-earned money on a card that became obsolete before it could even do what it was meant to do.

I guess the perception of "future" means different things to different people, but all those reviews calling the NV35 "future-proof" are pure bullshit, unless those same reviewers upgrade whenever a new chipset comes out and dump their "old" cards in the trashcan.

Well, "buyer beware," as the saying goes. I for one will be approaching all hardware purchases with greater caution in the future, regardless of how many "kickass" or "9/10" awards the product in question got.

 
 
50. Re: ??? Sep 17, 2003, 22:07 TimothyB
 
He is not forcing ATI's cards to run at higher precision; I believe they just can NOT do lower precision (someone correct me if I am wrong?). Or, if they do go lower, it is under 16-bit, maybe 12- or 8-bit precision, which does make a noticeable difference for what he is doing.

Sorry, what I meant is that they are not forcing it, but from what he said about the precision not being needed, it sounds like leaving it at the default high quality is unnecessary for this game. Would you run a game with 12x AA if 6x AA made no visual difference?

I understand the reason for the special path to make it run on Nvidia, but if it has the same visual quality as the default DX9 path the Radeon uses while using lower precision, and possibly outperforms the Radeon, it makes it look like the GFX beats the Radeon, even though the Radeon might well beat it if it had its own special low path.

I just hope no one bases their purchase decision on Doom benchmarks with the GFX winning via the low path.

 
 
49. nVidia needs to be honest Sep 17, 2003, 22:04 OCed_5150
 
With so much news about the problems that seem to plague the GeForce FX's DirectX 9 support, why hasn't Nvidia made an honest attempt to make a statement about them?

The longer they hide behind unanswered questions, the more it's going to hurt their reputation.

Does anyone remember when Intel released the Pentium with a floating-point bug, and how long it took before Intel admitted to the problem?

We're going through this again, but this time with Nvidia. And I'm surprised by their "it's not our doing" line as of late.

< I'm not the only one who shares this opinion, and I think it's HIGH time for Nvidia to address these issues before it's too late. If it isn't already. >

This comment was edited on Sep 17, 22:20.
 
http://www.doom3xtreme.com
 
48. Re: mondovoodootunabonecurse Sep 17, 2003, 22:02 Enahs
 
Brand loyalty is silly.
Except when you work for that company, and you want that company to make money, so you keep your job!

That is just the opposite of silly!


"Plaque is a figment of the liberal media and the dental industry to scare you into buying useless appliances and paste. Now Iíve read the arguments on both side, and I havenít found any evidence yet to support the need to brush your teeth. Ever."
 
 
I am free of all prejudice. I hate everyone equally.
- W. C. Fields
 
47. Re: mondovoodootunabonecurse Sep 17, 2003, 21:59 mag
 
I was referring mostly to the mindset of, "Maaaan, I don't want to have to buy an ATI card! They used to be bad, not like my baby nVidia..." as being weird. I understand that ATI used to suck, and I understand not buying anything from them at that point.
But complaining when one company has a better product than another is silly. Brand loyalty is silly. You go with whatever will do what you want, for the money you want to spend, and for the amount of trouble you're willing to put up with.

Not that I'm accusing anyone in particular of being this way. I've just seen it on various boards.

This comment was edited on Sep 17, 22:00.
 
 
46. Re: mondovoodootunabonecurse Sep 17, 2003, 21:58 richcz3
 
Man... so true, so true. And yet, how can any serious gamer not be considering ATI at this point? I have a Ti4600, but I'm already planning on buying the ATI/HL2 bundle when I see it.

I knew nVidia was in trouble starting with the release of the FX5800. It was late, hot as lava, expensive, and a poor performer to boot.
I never thought I would buy ATI. Since last year, I have replaced four GeForce 4600s with ATIs. Man... how fortunes have changed. This FX series is a mondovoodootunabonecurse for nVidia.
I still have a 3dfx Voodoo5 5500 in its original box (Halo featured on the back of the box).

richcz3

This comment was edited on Sep 17, 22:07.
 
 
45. Re: 26 WRONG Sep 17, 2003, 21:57 Ryan
 
3dfx died because they were late to the table with their last couple of cards.

I also wonder if the proprietary nature of Glide had a hand in their demise. If so, that could spell trouble for Nvidia if they don't fix this issue, since the FX requires an extra code path that the other cards don't (OK, "the other cards" are currently just ATI's, but presumably _someone_ else will make a "next-gen" card...).

 
 
44. Re: ??? Sep 17, 2003, 21:56 Enahs
 
But again, why slow down a card with higher precision that's not needed, and give lower precision to another card with no loss in quality? It just seems unfair that a card could perform even better at no visible cost
He is not forcing ATI's cards to run at higher precision; I believe they just can NOT do lower precision (someone correct me if I am wrong?). Or, if they do go lower, it is under 16-bit, maybe 12- or 8-bit precision, which does make a noticeable difference for what he is doing.

"Plaque is a figment of the liberal media and the dental industry to scare you into buying useless appliances and paste. Now I've read the arguments on both sides, and I haven't found any evidence yet to support the need to brush your teeth. Ever."
 
 
I am free of all prejudice. I hate everyone equally.
- W. C. Fields
 
43. ARGUE! Sep 17, 2003, 21:56 Bunko
 
NO! 3dfx died because a rabid spider monkey threw a banana at the Voodoo 5 and caused it to suck. A similar incident with a wombat caused total annihilation. End of story.

There was only one catch and that was Catch-22
 
Who knows but that, on the lower frequencies, I speak for you?
http://citizenb.com/ - Now at v1.1
 
42. ??? Sep 17, 2003, 21:52 TimothyB
 
I haven't read the 35 replies yet, just the quoted paragraph from the topic.

If John says precision doesn't really matter to Doom, why the hell does Nvidia get lower precision for better speed at no cost, while ATI is kept running at higher precision that isn't really needed for the game? That way, we ATI players would get that extra performance too, not just Nvidia. It just seems illogical for ATI to be running at higher precision for no reason while Nvidia gets the lower precision, with no visual cost, to make their cards look better than they are. I wonder how well ATI's cards would stack up against it with a path similar to the one Nvidia will run.

Though, don't get me wrong, if there were a visual difference, then hell no, keep it high, like with HL2, where there is a difference using the mixed path for Nvidia's cards.

But again, why slow down a card with higher precision that's not needed, and give lower precision to another card with no loss in quality? It just seems unfair that a card could perform even better at no visible cost. Oh well.

Does anyone get the point I'm trying to make? When he says the precision used on Nvidia doesn't matter, that's where I get "no visible cost" from. Correct me if I'm wrong.

This comment was edited on Sep 17, 21:53.
 
 
41. Re: mondovoodootunabonecurse Sep 17, 2003, 21:51 Pineapple Ferguson
 
You guys are weird. Swearing off buying something good because a previous product was inferior. You guys do remember the suck that was the Riva 128, right? And the Voodoo Rush? We'd all be holding onto our stock boards and waiting for Bitboys' card if we refused to buy a company's product because they made a stinker in the past.

You obviously never had an ATI card within the last 10 years. Their drivers were HORRIBLE, and their driver support was even worse. I'm not talking about just one little bad experience; I'm talking about a consistent, long-term problem. To me, ATI was synonymous with "shitty card with bad drivers." Oh, and the reason I'm so familiar with this is that my company purchased Compaq computers with ATI cards. So every time I got a new computer, which was roughly whenever I wanted, that's what I got.
Point is, I did swear off buying them, but now they've apparently replaced the corpse that was in charge of developing drivers with real people who are good at their jobs.
So, what's so weird about it? ATI is top dog now, and Nvidia is falling. I'm not swearing off Nvidia, but I'm 90% certain that my next card will be an ATI.

This comment was edited on Sep 17, 21:52.
 
 
40. Re: wakey wakey! Sep 17, 2003, 21:50 Ryan
 
Doesn't Carmack program in OpenGL rather than DirectX? I thought he was against Microsoft engineering for specific graphical platforms.

It's not the API (DirectX vs. OpenGL); it's the lower-level shader language that the FX is having issues with. Regardless of whether you use OpenGL or DX9, you still have to use the shader language to get the "cool, new effects" the cards can (or should...) be capable of.

Re: FP16/FP32: (this is my understanding of the problem; I'm sure it's oversimplified, but not as oversimplified as "the FX sucks" or "ATI rocks"). The basic problem is that the FX doesn't have as many registers (basically, buckets to hold your temporary work while computing the graphics) as the DX9 shader compiler expects. So, as a workaround, you can "double" the number of registers available by using 16-bit shaders, but at a loss of quality. And apparently it's more complicated than that: you basically have to write all your shaders twice, once for PS2.0 and once for the NV35 (PS1.4?).
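That register trade-off can be put into a toy occupancy model. The register-file size below is invented purely for illustration (it is not NV3x's real figure); the only point is that halving the width of each temporary doubles how many fit, which is where the FP16 speedup comes from.

```python
def temps_that_fit(register_file_bits, bits_per_component, components=4):
    """How many vec4 temporaries fit in a fixed-size register file."""
    return register_file_bits // (bits_per_component * components)

REGISTER_FILE = 1024  # hypothetical per-pixel register budget, in bits

fp32_temps = temps_that_fit(REGISTER_FILE, 32)  # 1024 // 128
fp16_temps = temps_that_fit(REGISTER_FILE, 16)  # 1024 // 64
print(f"vec4 temps at FP32: {fp32_temps}, at FP16: {fp16_temps}")
```

A shader whose compiler output needs more temporaries than fit at FP32 either spills (slow) or has to be rewritten for the narrower format, which matches the "write all your shaders twice" complaint above.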

 
 
39. Re: 26 WRONG Sep 17, 2003, 21:45 mag
 
No. That is not why 3dfx died. 3dfx did this just fine with their early Voodoo cards, and extremely successfully with the V3. What happened to 3dfx was a monster called the TnT2 Ultra, and its much bigger, asskicking brother called the GeForce 256.

3dfx died because they were late to the table with their last couple of cards. If the Voodoo5 had come out on time, it would've been plenty of competition.

This comment was edited on Sep 17, 21:46.
 
 