Again, to clarify the situation:
The idea behind standards like DirectX and the Pixel Shader specs is to provide two things:
1) A standard set of programming functions and interfaces for programmers to use
2) A standard set of features that all hardware must implement completely if it wants to claim that it "meets the spec". Ideally, these features should all act / look the same - though performance will probably differ between products.
The idea behind Microsoft having these standards is that it can use its market position, money, and influence to coerce hardware-makers into conforming to the spec a LOT better than trying to get them to do so voluntarily. OpenGL is based around committees and consortiums of manufacturers & developers. As such, it takes longer for revisions to the spec to be approved; and it is up to each individual vendor to implement and "conform to" the spec (and look at how many years it's been, and how many chips & drivers STILL don't implement all of the older OpenGL features properly). Having a monopolistic power like MS drive development may not be the ideal solution - but it gets results; and for gamers it's a "win".
According to what we've seen, nVidia's card does NOT natively support the FP24 (24-bit floating-point) precision that the PS 2.0 spec requires as a minimum - its hardware offers 16-bit and 32-bit modes instead. Therefore, a true apples-to-apples comparison of DirectX 9 / PS 2.0 features leaves nVidia behind, because it must run its slower full 32-bit mode to meet/exceed the spec'ed 24-bit minimum. ATi runs FP24 natively, and doesn't suffer any performance penalty as a result.
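To give a rough feel for what those precision modes mean, here's a toy sketch. This is my own illustration, not any GPU's actual arithmetic: it just rounds a value to a given mantissa width, using the commonly cited figures of 10 mantissa bits for FP16, 16 bits for ATi's FP24, and 23 bits for FP32.

```python
import math

def quantize(value, mantissa_bits):
    """Round value to the nearest float with the given mantissa width.

    A crude stand-in for storing the value in a narrower float format;
    real GPU formats also differ in exponent range, which this ignores.
    """
    if value == 0.0:
        return 0.0
    exp = math.floor(math.log2(abs(value)))      # exponent of the value
    scale = 2.0 ** (exp - mantissa_bits)         # size of one mantissa step
    return round(value / scale) * scale

# A value whose fractional detail needs ~12 mantissa bits:
x = 1.0 + 1.0 / 4096.0

print(quantize(x, 10))   # FP16-like: 1.0 (the detail is lost)
print(quantize(x, 16))   # FP24-like: 1.000244140625 (detail preserved)
print(quantize(x, 23))   # FP32-like: 1.000244140625 (detail preserved)
```

The point: small color/coordinate differences that survive at 24-bit precision simply vanish at 16-bit, which is why the spec sets a minimum.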
So using the STANDARD as a measuring-tool, ATi steps out ahead of nVidia. Can nVidia beat ATi? Certainly, under certain circumstances: when not using DX9, PS 2.0 shaders, or anything higher than 16-bit precision for effects, nVidia can still outperform ATi.
The thing is, as games get more complex, developers are going to have less and less time to devote to "hand coding" alternatives and special code-paths. The "standards" are becoming more popular and more important all the time; and increases in game-complexity will only drive this further along.
Look at how much automation and specialization is being built into art & architecture tools for games these days... Soon, some of those automated / "assistance" aspects are going to need to be incorporated into programming environments, to enable games to be developed on a reasonable time-scale. Engine-licensing was a "first step" in this direction - automation via the use of other people's existing code. However, this can be awkward; and the learning time for someone else's engine eats into the time-benefits of licensing it in the first place. Furthermore, you are constrained by all the limits and compromises someone else made; and they usually made them without ANY idea of what you want to do with their code.
I foresee eventually having integrated development environments that largely automate the basic processes of coding a graphics engine; and the increasing adoption of & conformity to "standards" will help this as well - as the compilers and development tools will be MUCH more feasible if they can create "vanilla" code (automatically or with minimal direction) that works for everyone (and since it's defined by you, it doesn't suffer the problems of being "someone else's code" like engine licensing).

The dominant paradigm will then shift to programming as a means to "define the rules" of the game (physics, interaction, movement, victory conditions, etc.); and you will see a "split" in the game-programmer role... Part of these people will end up working on the "development environment" and "tools" - pushing the latest and greatest visuals & physics routines, and optimizing the automated aspects of the development tools. The others will become more like "game designers who code", in the sense that their programming will have much more to do with gameplay, and less with hardware access, memory-management, etc.

Art will still be the biggest bottleneck in games development. The tools for that are getting better - but increases in detail/fidelity are still outpacing the development of most art/architecture tools.
Wow, talk about a rambling topic-diversion!
Take care,
--Noel "HB" Wade