He is not forcing ATI's to run at higher precision; I believe it just can NOT do lower precision (someone correct me if I am wrong?). Or, if it does go lower, it is under 16-bit, maybe 12 or 8 bit precision, which does have a noticeable difference for what he is doing.
Sorry, what I meant is that they are not forcing it, but from what he said about the precision not being needed, it sounds like leaving it at the default high quality is unnecessary for this game. Would you run a game at 12x AA if 6x AA looked no different?
I understand the reason for the special nvidia path to make it run, but if it has the same visual quality as the default DX9 path the Radeon uses while using lower precision, and possibly outperforms the Radeon, it makes it look like the GFX beats the Radeon, even though the Radeon might beat it if it had its own special low-precision path.
I just hope no one bases their purchase choice on Doom benchmarks with the GFX winning on the low path.