DOOM 3 Benchmarks Follow-up

Team Radeon has comments from ATI about the recent spate of DOOM 3 benchmarking (story), pointing out the game is not yet available, taking a stab at NVIDIA's precision (in spite of their own "fudging" being questioned by Carmack during the testing), and saying they should have new drivers "available in the coming weeks." There is also a raging Rage3D Discussion on the possibility of an ATI DOOM 3 bundle, supported by some blurry camera phone shots of the associated promotional materials. This rumor is one of those they run through their magic 8-ball in the latest GameSpot Rumor Control/(Mongering) column.
137 Replies. 7 pages. Viewing page 1.
137.
Re: WaltC & Hellbinder Jul 27, 2004, 12:27
 
WaltC, while you do point out some very relevant facts, I have to question one bit. If ATi has admitted to using a different algorithm as compared to "true" trilinear, why can't we have an option to shut it off, just like on the Nvidia boards? I understand that it is a smart algorithm that turns on and off accordingly; however, the user should be given the option over the hardware at any time. Also, since it can tell when colored mip map tests are taking place and then uses the true trilinear mode, it is not representative of the actual in-game performance and image quality.

Please explain to me why ATi has opted not to give the end user control over their hardware. I am not saying Nvidia handled the situation well, but at least now the user has control over their equipment.


The answer to this question is so apparent, at least to me, that I'm surprised it gets asked so much...;) Most people who ask the question, I think, just haven't taken a moment to think it through from a practical standpoint.

Let's say that you are running the nVidia drivers in their default state with trilinear opts ON, and you're playing a game through a scene in which the trilinear opts become apparent and you begin to see mipmap boundaries, or other instances of rendering IQ anomalies, which you suspect are related to the trilinear optimizations turned ON in your drivers. If you wish to replay the game scene with nVidia's tri opts turned OFF, what is there to do except to (a) shut down the game, (b) go to the desktop and open the nVidia control panel, (c) check the trilinear opts OFF check box, (d) apply the change, (e) reboot the game and (f) replay the scene? Conversely, if you want to turn the trilinear opts back ON later in the game you have to repeat the process in reverse. Rinse & repeat, endlessly.

Pretty impractical, it seems to me, and that's the whole problem with nVidia's "dumb" (manual) ON/OFF approach to trilinear opts. At best the scheme is inconvenient and distracting to game play.

And that's why, in my opinion, ATi opted to go with the automatic approach to programmed switching of the trilinear opts on and off during all 3d games. In almost all 3d games I'm aware of, there are scenes within those games where trilinear opts may be used with no observable detriment to trilinear rendering IQ, and scenes where the optimization is not enough and standard trilinear needs to be used in order to maintain the desired IQ. ATi's on-off automatic, "intelligent" method provides an end user with that level of support--nVidia's manual method does not, as I've described.

People are, I think, not seeing the issue clearly because they are thinking of it in terms of either running a game with them off or running a game with them on (like vsync or FSAA or AF, etc.). But the case for trilinear optimizations is different by nature, because the goal of them is that they be indistinguishable in IQ from standard trilinear filtering while providing additional performance. Since practically all 3d games contain scenes in which trilinear optimization is appropriate and may be used without IQ degradation, and scenes where this is not the case and standard trilinear is required to maintain IQ, the only method for implementing trilinear opts *that is of value to the end user* is the method ATi has employed of the automatic on-off, self-switching trilinear optimization. So, there's a very persuasive practical reason why ATi's doing what it's doing, imo.
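To put the filtering difference itself in concrete terms, here's a minimal sketch of standard trilinear blending versus a reduced-blend ("brilinear"-style) optimization. The band width and the function names below are my own invention, purely for illustration--neither IHV publishes its actual filtering code:

def trilinear_weight(lod_fraction):
    # standard trilinear: blend between the two nearest mip levels across
    # the entire fractional LOD range [0, 1)
    return lod_fraction

def optimized_weight(lod_fraction, band=0.3):
    # reduced-blend ("brilinear"-style) filtering: sample a single mip level
    # for most of the range and only blend inside a narrow band around the
    # transition -- cheaper, and usually indistinguishable from the above
    # when adjacent mip levels look alike
    lo, hi = 0.5 - band / 2, 0.5 + band / 2
    if lod_fraction <= lo:
        return 0.0                              # nearer mip level only
    if lod_fraction >= hi:
        return 1.0                              # farther mip level only
    return (lod_fraction - lo) / (hi - lo)      # compressed blend in the band

The point being: wherever two adjacent mip levels are nearly identical, the two weight curves produce results you can't tell apart--and that is precisely the condition under which an optimization of this sort is safe to leave ON.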

By comparison, what's nVidia doing here? When nV first began using trilinear opts last year, it built them into its drivers with *no* manual turn-off switch (never mind that nVidia never told a soul about it in advance.) For a while, in UT2K3 exclusively, there was *no way* to get trilinear filtering on detail textures at all--either from the nVidia control panel *or* through the game itself--you always got the optimization in the case of UT2K3.

Later, nVidia expanded this trilinear optimization to all 3d games, again, sans any OFF switch anywhere. Later still, nVidia built in an OFF switch for the optimizations in its control panel, but if you'll remember the first stab or so at doing that, nV reported the OFF switch was broken and simply didn't function. That indicated to me that it wasn't as easy for nVidia to actually turn off these optimizations in its drivers as the on/off cpanel check box implies it should be (evidently it was easier for nVidia to put the tick box into the Cpanel GUI than it was to make it functional)--else the OFF switch would have worked in all of its drivers from the start, and nVidia would have had an OFF switch built in from day one. The thing to remember is that originally nV's tri-opts driver implementation was never meant to be turned off, either by manual control from the cpanel, or from within a game itself.

Presumably, now, all of that has been fixed at last, and the tri-opts off switch in the nV Cpanel is functional. Even if true, and the nV tri-opts OFF Cpanel switch does indeed turn all of them off in every 3d game, this still does not alleviate the practical problems in nV's manual on-off approach as I relate above.

Last, I want to reiterate that my complaint about Carmack's presentation is not just that I think he completely mischaracterized what ATi's doing; my primary complaint about his remarks is that he didn't even *mention* nVidia's "filtering fudging" (as he puts it), not even to the point of saying something like this:

"nVidia's implementing trilinear filter fudging, too, in its drivers by default. But nVidia provides a manual tri-opts OFF switch via its control panel and, while not as eloquent perhaps as ATi's intelligent switching algorithm, it does at least allow the user to turn them all off. In the current D3 build on which this test is based, with the nVidia drivers I've tested, I have personally verified that the OFF switch works in D3 and the nV drivers do not do trilinear optimizing in D3, so long as they are turned off via the control panel before running the game."

Instead of saying anything remotely like this--Carmack is as silent as a tomb concerning even a *mention* of nV's "filter fudging." Doesn't mention it at all.

I found that remarkable, and of course it certainly cannot be because JC doesn't *know* nV's drivers do trilinear optimization by *default*--certainly not...:D He knows it--of that I have absolutely no doubt. So why the silent treatment--especially when he did bother to mention ATi's trilinear opts approach and bothered to characterize it as "fudging"?

Heh...;) To know the answer to that I'd have to be a fly on the wall of JC's mind, and I certainly am not that...;) But I do have a charitable theory as to why he was as silent as the grave on the subject of nVidia's trilinear opts, which is:

[Charitable Theory /ON]

He did personally test the nV control panel trilinear opts OFF switch for D3 during these tests, and found that it did not disable all of the trilinear optimizations the nV drivers force for D3. He may well have found that it disabled some, but not all of them. I would imagine that next he would have asked nVidia about it, and that nVidia would have given him the predictable answer that, "It's a bug that we'll address in upcoming drivers," and that Carmack simply chose to let it go and say *nothing* about nVidia's trilinear optimizations at all simply to avoid a public confrontation with nVidia over the issue, since he could not be 100% sure himself that a "bug" was not indeed responsible for the behavior he observed.

[Charitable Theory /OFF]

Still, even this theory leaves unanswered the central question as far as I'm concerned: why mention ATi's trilinear opts in any capacity whatever if you're not going to mention nV's trilinear optimizations in any capacity? Which brings me to an uncharitable theory:

[Uncharitable Theory /ON]
This was an nV-sponsored, TWIMTBP exercise and was never meant to do anything except appear as an objective test (much like the [H] 2003 Doom 3 Preview), the actual purpose of which was to serve as a marketing platform for nV 3d cards, and it was known as a foregone conclusion that nV would either win, or be made to win, the benchmarks. This would certainly explain why Carmack would mention ATi's trilinear-filtering optimizations in a negative light (as "fudging"), while not mentioning nV's fudging at all, wouldn't it?

I mean, for Carmack to state he didn't see any "egregious cheating" going on certainly doesn't rule out that he detected some non-egregious cheating--but then we'd have to understand what JC means by "egregious cheating" in the first place, which he doesn't define for us. But I have to question how much stock we might put in such a definition, seeing as the nV drivers by default engage in the same general kind of "filter fudging" as the ATi drivers, yet Carmack only notices the ATi optimizations in his remarks pertinent to this event.

[Uncharitable Theory /OFF]

People will have to decide it for themselves, certainly.

It is well known that I cannot err--and so, if you should happen across an error in anything I have written you can be absolutely sure that *I* did not write it!...;)
136.
Re: why is D3 faster on 6800 cards? Jul 26, 2004, 19:50
 
Uh, it's right there on the front page of the linked benchmarks article on [H].

What's [H]?


Heh, I like messin' with you guys sometimes. Aren't those GP components in both the GT and the Ultra?

This space is available for rent
135.
Re: why is D3 faster on 6800 cards? Jul 26, 2004, 17:57
 
High sys reqs bode well for an engine that will probably be around for the next couple of years.

You cannot make anything fool-proof. The fools are too inventive

GW: Tr Gandhi (Ra), Shiva Sung (Mo), Mangal Pandey (Ne), Rana Pratap Singh (Wa), Boddhi Satwa (Ri), Bhagat Singh (De), Bahadur Shastri (Pa)
134.
Re: why is D3 faster on 6800 cards? Jul 26, 2004, 01:16
 
Did you read what Carmack said about overclocking?

Nope. What'd he say?

Uh, it's right there on the front page of the linked benchmarks article on [H]. Something about OCing being unpredictable since Doom 3 utilizes certain GP components that have apparently been lying dormant all this time.

133.
Re: why is D3 faster on 6800 cards? Jul 25, 2004, 22:39
 
Did you read what Carmack said about overclocking?

Nope. What'd he say?

This space is available for rent
132.
Re: WaltC & Hellbinder Jul 25, 2004, 22:32
 
Oh, no. I'm below average. Sure enough.

131.
Re: WaltC & Hellbinder Jul 25, 2004, 20:15
nin
 
(Hoping Doom 3 will come out soon so that these stupid flame wars end)

Are you kidding? Then it'll be the "HL2 Flame Wars"!!!

See you in Hell! http://www.doom3.com/
130.
Re: WaltC & Hellbinder Jul 25, 2004, 20:02
 
I always love these ATI vs. Geforce flamewars. People can get worked up about something so minor

Agreed ..... can't we all just get along

(Hoping Doom 3 will come out soon so that these stupid flame wars end )

You cannot make anything fool-proof. The fools are too inventive

GW: Tr Gandhi (Ra), Shiva Sung (Mo), Mangal Pandey (Ne), Rana Pratap Singh (Wa), Boddhi Satwa (Ri), Bhagat Singh (De), Bahadur Shastri (Pa)
129.
Re: WaltC & Hellbinder Jul 25, 2004, 17:48
 
I always love these ATI vs. Geforce flamewars. People can get worked up about something so minor.

http://service.futuremark.com/compare?2k3=2808236 - My Computer
http://users.ign.com/collection/PilotCman - My Games
http://users.ign.com/wishlist/PilotCman - What I want
128.
Re: WaltC & Hellbinder Jul 25, 2004, 16:54
 
actually it can't really tell when colored mip map tests are taking place, it can only tell that the colored mip maps are so different that true trilinear filtering is needed. it does the same thing without colored mip maps, hence there is no need for a manual option.
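roughly, the kind of check being described might look like this. just a sketch -- ati has never published the actual heuristic, so the downsample-and-compare step and the 0.1 threshold here are my own guesses:

import numpy as np

def needs_full_trilinear(mip_a, mip_b, threshold=0.1):
    # downsample the finer mip level (twice the size of mip_b) and compare
    # it to the next level. real textures build their mip chains by
    # filtering, so adjacent levels are nearly identical and the reduced
    # blend is visually safe; dyed-mipmap test patterns differ wildly, so
    # fall back to true trilinear. the 0.1 threshold is arbitrary.
    h, w = mip_b.shape[0], mip_b.shape[1]
    down = mip_a.reshape(h, 2, w, 2, -1).mean(axis=(1, 3))
    return np.abs(down - mip_b).mean() > threshold

# a colored-mipmap test (solid red level vs. solid green level) trips it:
red = np.tile([1.0, 0.0, 0.0], (64, 64, 1))
green = np.tile([0.0, 1.0, 0.0], (32, 32, 1))
print(needs_full_trilinear(red, green))    # True -> use real trilinear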

127.
Re: WaltC & Hellbinder Jul 25, 2004, 16:37
 
ATi's tri opts turn themselves ON and OFF *automatically* in the current drivers (and don't exist at all in earlier Catalyst versions), in a programmed fashion, and *always* turn themselves OFF in the presence of colored mipmap tests, and other similar software conditions, which are the tests reviewers use to check trilinear filtering image quality. Therefore, if you do nothing apart from installing nVidia's drivers, nVx trilinear optimizations are turned ON by default (and the only way to turn them off is to check the appropriate check box in the nVx control panel and apply the change), so that's *why* ATi instructed reviewers to turn OFF the nVx tri opts when testing trilinear filtering IQ--because ATi's trilinear opts automatically turn themselves OFF under those software test conditions (colored mip tests, etc.) The fact that ATi's trilinear opts do indeed turn themselves ON and OFF automatically seems totally beyond your ability to understand, for some reason.

WaltC, while you do point out some very relevant facts, I have to question one bit. If ATi has admitted to using a different algorithm as compared to "true" trilinear, why can't we have an option to shut it off, just like on the Nvidia boards? I understand that it is a smart algorithm that turns on and off accordingly; however, the user should be given the option over the hardware at any time. Also, since it can tell when colored mip map tests are taking place and then uses the true trilinear mode, it is not representative of the actual in-game performance and image quality.

Please explain to me why ATi has opted not to give the end user control over their hardware. I am not saying Nvidia handled the situation well, but at least now the user has control over their equipment.

126.
Re: WaltC & Hellbinder Jul 25, 2004, 14:36
 
WaltC, stop writing so much... especially with your biased writing, no one will read it all.

You still haven't answered the question.

How can you justify ATi telling game companies and tech sites to turn off nVidia's optimizations, while keeping theirs hush-hush the whole time?

Turning ON and OFF on its own is still OPTIMIZING ITSELF WHEN IT NEEDS IT! Christ, you are pathetic!

Stop posting, because you're defending guilt now.

125.
Re: why is D3 faster on 6800 cards? Jul 25, 2004, 13:51
 
So, can we all agree now that the 6800 is much better than the x800 in all aspects?

124.
Re: WaltC & Hellbinder Jul 25, 2004, 10:54
 
Do you really have to post your ATI-biased nonsense in these forums too? Have you already completely taken over the rage3d and beyond3d forums, so that you need to infest other forums as well?

S'ok. I generally expect this kind of response from people who can't read, or, if they can read, have no comprehension of what they read. Generally, at B3d, which is a 3d-technology site, I rarely have to deal with this kind of response. All you've really told me is that you have no clue as to what I've taken the time to talk about here, so you're right--there's little use in doing so, I suppose, in forums where the content of one's posts is beyond comprehension...;)

However, I'm assuming that even in non-technical forums, when technical subjects are raised there are still a few people capable of understanding what's written, and appreciative of accurate info as opposed to marketing fluff and bs.

* ATI stated in PDFs that they were doing full trilinear filtering all the time (they later admitted themselves that the cards can't even do real trilinear filtering, only a different method which in some cases has almost the same quality)

* ATI stated in the same PDF that filtering optimizations are wrong and will not be used by them (complete BS, as was found out 2-3 weeks later)


If you think ATi cards don't do full trilinear filtering at all it's clear you have no comprehension of the issue and don't understand it, and don't understand my previous posts in this thread. Suffice it to say you have no clue on the topic and that you couldn't be more wrong.

FYI, the trilinear filtering optimization algorithms ATi currently uses are very new in their drivers, and *did not exist* at the time ATi wrote the .pdfs you're so fond of mentioning, which were written long before. In point of fact nVidia started doing trilinear filtering optimizations *last year*, for nV3x, originally in UT2K3, long before ATi began a similar, but technically very different, approach to trilinear filtering optimization in '04 with the R420 launch. And for months nVidia refused to even *discuss* the subject when asked directly about it--that situation is exactly the same for nVidia today, with the company having made *no public statements* explaining how & when nVx tri opts operate, why they are turned ON by default in the drivers instead of turned OFF by default--or *anything* about their opts at all. Within mere days of being asked about their trilinear optimizations, ATi issued a detailed and comprehensive public statement about it--something nVidia has yet to do in any case that I'm aware of. The details and differences between the current approaches by the IHVs are summarized in my previous posts in this thread, and there's nothing I can do if you can't understand the plain English I've used. The issue is only slightly technical, but if you cannot understand my posts clearly you have no hope of grasping the issue as it actually is.

* ATI told all hardware reviewers to disable all nvidia optimizations when comparing cards of both manufacturers to get fair comparisons (of course they didn't tell anyone at that time that ATI is doing optimizations all the time which can't be disabled)

Wrong again--for the last time, ATi's tri opts turn themselves ON and OFF *automatically* in the current drivers (and don't exist at all in earlier Catalyst versions), in a programmed fashion, and *always* turn themselves OFF in the presence of colored mipmap tests, and other similar software conditions, which are the tests reviewers use to check trilinear filtering image quality. Therefore, if you do nothing apart from installing nVidia's drivers, nVx trilinear optimizations are turned ON by default (and the only way to turn them off is to check the appropriate check box in the nVx control panel and apply the change), so that's *why* ATi instructed reviewers to turn OFF the nVx tri opts when testing trilinear filtering IQ--because ATi's trilinear opts automatically turn themselves OFF under those software test conditions (colored mip tests, etc.) The fact that ATi's trilinear opts do indeed turn themselves ON and OFF automatically seems totally beyond your ability to understand, for some reason.

Get over it, WaltC and Hellbinder, you are only making yourselves look bad here. This is not the average ATI fansite!

First, I'm unaware of "Hellbinder" and me having made joint posts in any forum at any time, including this forum. I do not speak for "Hellbinder"; I always speak only for myself. I assume the same is true for "Hellbinder."

Second, although you seem to be saying that the Blue's News forum is *below average*, I would imagine several people who read and post in these forums from time to time would be insulted by your characterization. You might wish to apologize to them, accordingly.

What's sad, I think, is when people who absolutely do not understand the basics of a particular issue *think* that they understand it well enough to refute information which is contrary to their opinions, but which, unlike their opinions, is factual. It's people of that stripe who truly do make any forum anywhere "below average."

It is well known that I cannot err--and so, if you should happen across an error in anything I have written you can be absolutely sure that *I* did not write it!...;)
123.
Re: Ladies and Gentlemen it is time... Jul 25, 2004, 10:40
Flo
 
You're welcome.
Supporter of the "Chewbacca Defense"
122.
Re: Ladies and Gentlemen it is time... Jul 25, 2004, 10:04
 
#115,

I thank you, sir, for posting a comment, among all these trolls, that actually made me laugh.

121.
Re: Ladies and Gentlemen it is time... Jul 25, 2004, 09:54
 
Flo, I was remembering that when I was in the jury room deliberating and conjugating the Emancipation Proclamation - did it make sense? No. I told the ladies and gentlemen of the supposed jury it did not make sense. If Chewbacca lived on Endor, we had to acquit.

I know they seemed guilty. But I told the ladies and gentlemen this was Chewbacca. "Now think about that for one minute. That does not make sense. Why was Flo talking about Chewbacca when a video card's life is on the line? Why?" I told them why. I didn't know. It didn't make sense. If Chewbacca did not make sense, we had to acquit.

We then looked at the monkey, looked at the silly monkey.

We acquitted the video cards!

Supporting the Chewbacca defense,
Ray

-----
Friends do not let EVIL piglets play Doom 3!
http://users.ign.com/collection/RayMarden
http://www.dvdaficionado.com/dvds.html?cat=1&id=ray_marden
I love you, mom.
Everything is awesome!!!
http://www.kindafunny.com/
120.
Re: why is D3 faster on 6800 cards? Jul 25, 2004, 09:52
 
<quote>These Farcry benchmarks show that the 6800 GT runs faster than the X800 pro and the X800 XT PE. http://www20.graphics.tomshardware.com/graphic/20040705/farcry-06.html

The 6800 Ultra blows away the X800 XT PE</quote>

that's a misleading link. if you look at all the pages, what you say is false about the x800 XT, although true about the x800 pro. the ultra and XT look about even (~5fps) when you enable AA and AF

and about the pro in dx9, the folk explanation i've seen is that the 12 pipeline decision is to blame, since the GT has 16 with a bit slower clock speed. the same obviously doesn't follow for the doom3 benchmarks, not sure what's going on there.
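just as a back-of-the-envelope check on that explanation, the raw fill rate math looks like this (stock clocks quoted from memory, so treat the numbers as approximate):

# rough theoretical pixel fill rate = pixel pipelines x core clock
cards = {
    "X800 Pro   (12 pipes)": (12, 475e6),   # ~475 MHz core
    "X800 XT PE (16 pipes)": (16, 520e6),   # ~520 MHz core
    "6800 GT    (16 pipes)": (16, 350e6),   # ~350 MHz core
    "6800 Ultra (16 pipes)": (16, 400e6),   # ~400 MHz core
}
for name, (pipes, clock) in cards.items():
    print(f"{name}: {pipes * clock / 1e9:.1f} Gpixels/s")
# the 12-pipe Pro and the 16-pipe GT come out roughly even on pure fill
# rate, so the usual version of the explanation is that per-clock pixel
# shader throughput also scales with pipeline count, which matters more
# in shader-heavy dx9 scenes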
119.
Re: Jul 25, 2004, 09:40
 
"The new machine shows the now-infamous light/shadow fuckups with Thief 3 - persistent through what, 4 driver sets now?"

works fine in 4.4. i heard it was broken in 4.5 and never upgraded. not sure if it's fixed in 4.7. and to be fair, i experienced similar flashing in battlefield vietnam on nvidia.

118.
Re: why is D3 faster on 6800 cards? Jul 25, 2004, 09:37
pob
 
"I wouldn't get the 6800 Ultra since the GT can be OC'ed to the Ultra's level. "


Did you read what Carmack said about overclocking?

