This is a copy/paste of a post from the Rage3D boards by WaltC regarding nVidia's response. I'm posting it here so others can see Walt's excellent retorts to nVidia's release without directly linking the thread, which would require others to register there in order to read it.
>>>>> I want to take a crack at this statement, but first I thought it was interesting that while the article at GamersDepot is "signed" by Derek Perez, supposedly, the exact same statement here: http://www.hardocp.com/article.html?art=NTIw
is "signed" by Brian Burke. So we don't even know who wrote it--it could have been anybody.
Tackling the statement itself:
"Over the last 24 hours, there has been quite a bit of controversy over comments made by Gabe Newell of Valve at ATI's Shader Day.
During the entire development of Half Life 2, NVIDIA has had close technical contact with Valve regarding the game. However, Valve has not made us aware of the issues Gabe discussed."
I hate to say it, but this sounds incredible to me. There's frankly no way Valve could have written an optimized code path for nV3x without close and frequent contact with nVidia--so naturally nVidia owns up to that, as it would be impossible to deny. But I find it an incredible statement that nVidia would allege that during all that time Valve never discussed the issue of DX9 API compliance. That's absurd on its face, since the entire reason Valve needed a special code path for nV3x in the first place was nVidia's non-DX9-compliant hardware.
"We're confused as to why Valve chose to use Release. 45 (Rel. 45) - because up to two weeks prior to the Shader Day we had been working closely with Valve to ensure that Release 50 (Rel. 50) provides the best experience possible on NVIDIA hardware."
How stupid can nVidia be? Apparently there's no limit. Valve tested on *released* nVidia drivers, for crying out loud. The 50s are unreleased betas, which Newell doesn't like for the reasons he stated--reasons nVidia seems to deliberately ignore. Sorry, nVidia, but Valve's not going to play your silly "cheat the benchmark" games with "special" unreleased drivers--bzzzt, you lose.
You know, this is very similar to the "confusion" nVidia expressed when it was scratching its corporate head and declaring that "FutureMark is out to get us and we don't know why." These guys are slow in the brains department, no doubt about it. Newell's presentation was very precise, to the point that an idiot could understand the specifics. Not nVidia, though..."confused" is right--but certainly not about this...
"Regarding the Half Life2 performance numbers that were published on the web, we believe these performance numbers are invalid because they do not use our Rel. 50 drivers. Engineering efforts on our Rel. 45 drivers stopped months ago in anticipation of Rel. 50. NVIDIA's optimizations for Half Life 2 and other new games are included in our Rel.50 drivers - which reviewers currently have a beta version of today. Rel. 50 is the best driver we've ever built - it includes significant optimizations for the highly-programmable GeForce FX architecture and includes feature and performance benefits for over 100 million NVIDIA GPU customers."
Hey, dummies...according to people trying your beta, unreleased 50 Dets, they don't even fix the 5900U flicker that some other unavailable beta you've floated around recently apparently does! Knock-knock--anybody home? You guys *are* incredibly confused about many things, it seems. Sheesh. Could you possibly say something as moronic as this?
100 million nVidia customers? In your dreams, you morons. (I'm sorry, their hyperbole is too much to stomach.) You may have sold 100M graphics chips in the last DECADE, but surely you know you didn't sell them to 100M different people [edit: I've bought 5 nVidia cards myself in the last decade]--and why don't we talk about the 99 million graphics chips out of the 100 million you sold that *can't run* HL2 because they aren't *DX9*-capable, you morons? (I swear, this statement is so stupid I'm having trouble believing someone at nVidia actually wrote it.)
"Pending detailed information from Valve, we are only aware of one bug with Rel. 50 and the version of Half Life 2 that we currently have - this is the fog issue that Gabe referred to in his presentation."
Uh, dummies...what about the screen shot issue which you so coyly ignore? What about that?
"It is not a cheat or an over optimization. Our current drop of Half Life 2 is more than 2 weeks old. NVIDIA's Rel. 50 driver will be public before the game is available. Since we know that obtaining the best pixel shader performance from the GeForce FX GPUs currently requires some specialized work, our developer technology team works very closely with game developers. Part of this is understanding that in many cases promoting PS 1.4 (DirectX 8) to PS 2.0 (DirectX 9) provides no image quality benefit. Sometimes this involves converting 32-bit floating point precision shader operations into 16-bit floating point precision shaders in order to obtain the performance benefit of this mode with no image quality degradation. Our goal is to provide our consumers the best experience possible, and that means games must both look and run great."
Yes, dummies..."specialized work" as in workarounds for partial DX9 code, since your hardware isn't DX9-compliant. And your "goal" obviously is to sell as many of your faux-DX9 chips as possible first and worry about little things like DX9 hardware support later on. Hey, good plan--it means you have no responsibility--you can blame it all on driver bugs (as though you didn't write your own driver code) and on software developers (as though they don't have to work around your hardware). Must be nice to rake in the dough from your "DX9" GPUs without having to actually support DX9 in hardware. What a great scam. (Assuming, of course, you can get your yields up and then find enough suckers who'll swallow this song & dance.)
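[A quick aside for readers wondering why that FP32-to-FP16 demotion nVidia mentions is even controversial: a half-precision float only stores a 10-bit mantissa versus 23 bits for single precision, so values get visibly coarser--the "no image quality degradation" claim depends entirely on which shader you do it to. A minimal Python sketch, using nothing but the standard struct module's IEEE 754 half-precision format to round-trip values, shows the precision loss; the specific numbers are just illustrative:]

```python
import struct

def to_fp16(x: float) -> float:
    """Round-trip a value through IEEE 754 half precision (16 bits)."""
    return struct.unpack('<e', struct.pack('<e', x))[0]

# FP16 keeps roughly 3 decimal digits; FP32 keeps roughly 7.
val = 0.1234567
print(val)            # the FP32-ish value
print(to_fp16(val))   # the same value after an FP16 round trip

# Above 2048, FP16 can't even represent every integer -- consecutive
# representable values are 2 apart, so 2049.0 rounds away entirely.
print(to_fp16(2049.0))  # → 2048.0
```

[That last line is the kind of thing that bites shaders doing math on large texture coordinates or long accumulations--fine in some shaders, ugly banding or crawling artifacts in others.]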
"The optimal code path for ATI and NVIDIA GPUs is different - so trying to test them with the same code path will always disadvantage one or the other. The default settings for each game have been chosen by both the developers and NVIDIA in order to produce the best results for our consumers."
Right...according to Valve and everybody else doing DX9 games, the optimal code path for ATi is the DX9 API and the optimal code path for nV3x is DX8.1. I think that's been proven indisputably over the last year. What Valve was complaining about was your insistence that they try to teach your DX8.1 dog some DX9 tricks. But I guess you couldn't figure that out from his presentation, being "confused" and all...
"In addition to the developer efforts, our driver team has developed a next-generation automatic shader optimizer that vastly improves GeForce FX pixel shader performance across the board. The fruits of these efforts will be seen in our Rel.50 driver release. Many other improvements have also been included in Rel.50, and these were all created either in response to, or in anticipation of the first wave of shipping DirectX 9 titles, such as Half Life 2."
OK, so does this mean your *current*-generation automatic shader optimizer is a complete bork-up? Wow. Color me impressed with your neato, Star Trek-ish terminology [not].
Basically, you guys at nVidia are saying your nV3x hardware and drivers can be expected to run like crap until *after* a game comes out and you get a few weeks to run your "next-gen auto shader optimizer"--which I assume will "automagically" do all the workarounds for your hardware so that developers can just take it easy... <<<<<<