Waltc, while you do point out some very relevant facts, I have to question one bit. If ATi has admitted to using a different algorithm as compared to "true" trilinear, why can't we have an option to shut it off, just like on the Nvidia boards? I understand that it is a smart algorithm that turns on and off accordingly; however, the user should be given control over the hardware at any time. Also, since it can tell when colored mipmap tests are taking place and then uses the true trilinear mode, it is not representative of the actual in-game performance and image quality.
Please explain to me why ATi has opted not to give the end user control over their hardware. I am not saying Nvidia handled the situation well, but at least now the user has control over their equipment.

The answer to this question is so apparent, at least to me, that I'm surprised it gets asked so much...;) Most people who ask the question, I think, just haven't taken a moment to think it through from a practical standpoint.
Let's say that you are running the nVidia drivers in their default state with trilinear opts ON, and you're playing a game through a scene in which the trilinear opts become apparent and you begin to see mipmap boundaries, or other rendering IQ anomalies, which you suspect are related to the trilinear optimizations turned ON in your drivers. If you wish to replay the game scene with nVidia's tri opts turned OFF, what is there to do except to (a) shut down the game, (b) go to the desktop and open the nVidia control panel, (c) check the trilinear opts OFF check box, (d) apply the change, (e) reboot the game and (f) replay the scene? Likewise, if you want to turn the trilinear opts back ON later in the game, you have to repeat the whole process to turn them back ON. Rinse & repeat, endlessly.
Pretty impractical, it seems to me, and that's the whole problem with nVidia's "dumb" (manual) ON/OFF approach to trilinear opts. At best, the scheme is inconvenient and distracting to game play.
And that's why, in my opinion, ATi opted to go with the automatic approach of programmed switching of the trilinear opts on and off during 3d games. In almost all 3d games I'm aware of, there are scenes where trilinear opts may be used with no observable detriment to trilinear rendering IQ, and scenes where the optimization is not enough and standard trilinear needs to be used in order to maintain the desired IQ. ATi's automatic, "intelligent" on-off method provides an end user with that level of support--nVidia's manual method does not, as I've described.
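(As an aside, and purely as my own guesswork--ATi hasn't published its actual algorithm--here's a rough sketch of how a per-texture "is the optimization safe?" decision *could* work. The premise: if each mip level closely matches a box-filtered copy of the level above it, blending the levels with a shortened ramp is visually safe; hand-authored or test-pattern mips, such as colored mipmaps, diverge wildly from that prediction, so the check fails and the driver falls back to true trilinear. That would also explain the colored-mipmap behavior raised in the quote above. All names, thresholds, and structure here are illustrative assumptions, nothing more.)

[code]
#include <cmath>
#include <cstddef>
#include <vector>

// Hypothetical sketch only -- NOT ATi's actual driver code.
// Assumes a power-of-two mip chain where level i+1 is half the
// size of level i in each dimension.

struct Rgb { float r, g, b; };

// Box-filter a level down to half size in each dimension.
static std::vector<Rgb> downsample(const std::vector<Rgb>& src, int w, int h) {
    int dw = w / 2, dh = h / 2;
    std::vector<Rgb> dst(dw * dh);
    for (int y = 0; y < dh; ++y)
        for (int x = 0; x < dw; ++x) {
            const Rgb& a = src[(2 * y)     * w + 2 * x];
            const Rgb& b = src[(2 * y)     * w + 2 * x + 1];
            const Rgb& c = src[(2 * y + 1) * w + 2 * x];
            const Rgb& d = src[(2 * y + 1) * w + 2 * x + 1];
            dst[y * dw + x] = { (a.r + b.r + c.r + d.r) / 4,
                                (a.g + b.g + c.g + d.g) / 4,
                                (a.b + b.b + c.b + d.b) / 4 };
        }
    return dst;
}

// Mean absolute difference between a mip level and the box-filtered
// prediction computed from the level above it.
static float predictionError(const std::vector<Rgb>& upper, int w, int h,
                             const std::vector<Rgb>& lower) {
    std::vector<Rgb> pred = downsample(upper, w, h);
    float err = 0.0f;
    for (std::size_t i = 0; i < lower.size(); ++i)
        err += std::fabs(pred[i].r - lower[i].r)
             + std::fabs(pred[i].g - lower[i].g)
             + std::fabs(pred[i].b - lower[i].b);
    return err / (3.0f * lower.size());
}

// Decide per texture: true  -> reduced trilinear is safe,
//                     false -> use full ("true") trilinear.
bool optimizationSafe(const std::vector<std::vector<Rgb>>& mips,
                      int topW, int topH, float threshold = 0.02f) {
    int w = topW, h = topH;
    for (std::size_t i = 0; i + 1 < mips.size() && w > 1 && h > 1; ++i) {
        if (predictionError(mips[i], w, h, mips[i + 1]) > threshold)
            return false;  // levels diverge (e.g. colored-mip test pattern)
        w /= 2; h /= 2;
    }
    return true;
}
[/code]

Again, the mechanism is pure guesswork on my part; the point is only that a check like this is cheap to run once at texture upload, which is what makes a per-texture automatic decision practical where a global manual switch is not.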
People are, I think, not seeing the issue clearly because they are thinking of it in terms of either running a game with the opts off or running a game with them on (like vsync or FSAA or AF, etc.). But the case of trilinear optimizations is different by nature, because the goal is for them to be indistinguishable in IQ from standard trilinear filtering while providing additional performance. Since practically all 3d games contain scenes in which trilinear optimization is appropriate and may be used without IQ degradation, and scenes where this is not the case and standard trilinear is required to maintain IQ, the only method of implementing trilinear opts *that is of value to the end user* is the method ATi has employed: the automatic, self-switching on-off trilinear optimization. So there's a very persuasive practical reason why ATi's doing what it's doing, imo.
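To make "indistinguishable in IQ while providing additional performance" concrete, here's a minimal sketch of the blend-weight difference between full trilinear and a reduced ("brilinear"-style) scheme. The function names and the width of the blend band are my own illustrative assumptions, not any vendor's actual numbers:

[code]
#include <cstdio>

// 'lodFrac' is the fractional part of the level-of-detail value:
// 0 means "exactly on mip N", 1 means "exactly on mip N+1".

// Full trilinear: blend the two bilinear samples across the whole range.
float trilinearWeight(float lodFrac) {
    return lodFrac;  // linear blend over the entire [0,1] interval
}

// Reduced trilinear: blend only inside a narrow window around the mip
// transition (here +/- 'band' around 0.5); elsewhere take the nearest
// level's bilinear sample outright. A wider window looks closer to true
// trilinear; a zero-width window degenerates to bilinear with visible
// mipmap boundaries.
float brilinearWeight(float lodFrac, float band = 0.15f) {
    float lo = 0.5f - band, hi = 0.5f + band;
    if (lodFrac <= lo) return 0.0f;     // pure mip N
    if (lodFrac >= hi) return 1.0f;     // pure mip N+1
    return (lodFrac - lo) / (hi - lo);  // short blend ramp
}

int main() {
    for (float f = 0.0f; f <= 1.001f; f += 0.25f)
        std::printf("lodFrac %.2f  trilinear %.2f  brilinear %.2f\n",
                    f, trilinearWeight(f), brilinearWeight(f));
}
[/code]

The performance win comes from the flat regions of the reduced curve: wherever the weight is exactly 0 or 1, the hardware can skip sampling the second mip level entirely. The narrower the blend band, the more texels take the cheaper path--and the more visible the mip boundaries become if the band is too narrow for the scene, which is exactly why a one-size-fits-all setting doesn't cut it.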
By comparison, what's nVidia doing here? When nV first began using trilinear opts, they built them into their drivers last year with *no* manual turn-off switch (never mind that nVidia never told a soul about it in advance). For a while, in UT2K3 exclusively, there was *no way* to get trilinear filtering on detail textures at all--either from the nVidia control panel *or* through the game itself--you always got the optimization in the case of UT2K3.
Later, nVidia expanded this trilinear optimization to all 3d games, again sans any OFF switch anywhere. Later still, nVidia built an OFF switch for the optimizations into its control panel, but if you'll remember the first stab or so at doing that, nV reported the OFF switch was broken and simply didn't function. That indicated to me that it wasn't as easy for nVidia to actually turn off these optimizations in its drivers as the on/off cpanel check box implies it should be (evidently it was easier for nVidia to put the tick box into the cpanel GUI than it was to make it functional)--else the OFF switch would have worked in all of its drivers from the start, and nVidia would have had an OFF switch built in from day one. The thing to remember is that originally nV's tri-opts driver implementation was never meant to be turned off, either by manual control from the cpanel or from within a game itself.
Presumably, now, all of that has been fixed at last, and the tri-opts OFF switch in the nV cpanel is functional. Even if that's true, and the cpanel switch does indeed turn all of them off in every 3d game, it still does not alleviate the practical problems with nV's manual on-off approach that I described above.
Last, I want to reiterate that my complaint about Carmack's presentation is not just that I think he completely mischaracterized what ATi's doing; my primary complaint about his remarks is that he didn't even *mention* nVidia's "filtering fudging" (as he puts it)--not even to the extent of saying something like this:
"nVidia's implementing trilinear filter fudging, too, in its drivers by default. But nVidia provides a manual tri-opts OFF switch via its control panel and, while not as eloquent perhaps as ATi's intelligent switching algorithm, it does at least allow the user to turn them all off. In the current D3 build on which this test is based, with the nVidia drivers I've tested, I have personally verified that the OFF switch works in D3 and the nV drivers do not do trilinear optimizing in D3,
so long as they are turned off via the control panel before running the game."
Instead of saying anything remotely like this, Carmack is as silent as a tomb concerning even a *mention* of nV's "filter fudging." Doesn't mention it at all. I found that remarkable, and of course it certainly cannot be because JC doesn't *know* nV's drivers do trilinear optimization by *default*--certainly not...:D He knows it--of that I have absolutely no doubt. So why the silent treatment--especially when he did bother to mention ATi's trilinear opts approach and bothered to characterize it as "fudging"?
Heh...;) To know the answer to that I'd have to be a fly on the wall of JC's mind, and I certainly am not that...;) But I do have a charitable theory as to why he was as silent as the grave on the subject of nVidia's trilinear opts, which is:
[Charitable Theory /ON]
He did personally test the nV control panel's trilinear-opts OFF switch for D3 during these tests, and found that it did not disable all of the trilinear optimizations the nV drivers force for D3. He may well have found that it disabled some, but not all of them. I would imagine that next he would have asked nVidia about it, and that nVidia would have given him the predictable answer that "it's a bug that we'll address in upcoming drivers," and that Carmack simply chose to let it go and say *nothing* about nVidia's trilinear optimizations at all, simply to avoid a public confrontation with nVidia over the issue, since he could not be 100% sure himself that a "bug" was not indeed responsible for the behavior he observed.
[Charitable Theory /OFF]
Still, even this theory leaves unanswered the central question as far as I'm concerned: why mention ATi's trilinear opts in any capacity whatever if you're not going to mention nV's trilinear optimizations in any capacity? Which brings me to an uncharitable theory:
[Uncharitable Theory /ON]
This was an nV-sponsored, TWIMTBP exercise that was never meant to do anything except appear to be an objective test (much like the [H] 2003 Doom 3 Preview); its actual purpose was to serve as a marketing platform for nV 3d cards, and it was a foregone conclusion that nV would either win, or be made to win, the benchmarks. This would certainly explain why Carmack would mention ATi's trilinear-filtering optimizations in a negative light (as "fudging") while not mentioning nV's fudging at all, wouldn't it?
I mean, for Carmack to state he didn't see any "egregious cheating" going on certainly doesn't rule out that he detected some non-egregious cheating--but then we'd have to understand what JC means by "egregious cheating" in the first place, which he doesn't define for us. And I have to question how much stock we might put in such a definition, seeing as nV's drivers by default engage in the same general kind of "filter fudging" as ATi's drivers, yet Carmack only notices the ATi optimizations in his remarks pertinent to this event.
[Uncharitable Theory /OFF]
People will have to decide for themselves, certainly.
It is well known that I cannot err--and so, if you should happen across an error in anything I have written you can be absolutely sure that *I* did not write it!...;)