Not to mention how poorly nVidia cards performed at one time because of Valve's insistence on using 24-bit color mode and not 32-bit (the only game since the original Baldur's Gate to not support 32-bit color).
Umm, I think you have your facts a bit screwed up. That whole 24-bit / 32-bit deal has to do with internal shader precision PER CHANNEL (and there are four channels), and has nothing to do with 24-bit/32-bit color output. ATI's cards process shaders with 96-bit precision (24+24+24+24 for red, green, blue, and alpha), whereas nVidia's cards use 128-bit precision (32+32+32+32 for RGBA). In the end there's hardly a noticeable difference between the two, but for obvious reasons ATI's cards have an advantage: 96-bit precision is faster than 128-bit precision.
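If you want a rough feel for how little the per-channel difference matters, here's a little C sketch that fakes FP24 by chopping a 32-bit float's mantissa from 23 bits down to the ~16 bits ATI's format keeps (the exponent range differs too, which this toy ignores). The truncate_to_fp24 helper is just something I made up for illustration, not anything from either driver:

#include <stdio.h>
#include <stdint.h>
#include <string.h>

/* Approximate an FP24 value (1 sign, 7 exponent, 16 mantissa bits)
 * by zeroing the low 7 mantissa bits of an IEEE-754 single.
 * Hypothetical helper for illustration only. */
static float truncate_to_fp24(float x) {
    uint32_t bits;
    memcpy(&bits, &x, sizeof bits);
    bits &= 0xFFFFFF80u;   /* drop mantissa bits 0-6 (23 -> 16 bits) */
    memcpy(&x, &bits, sizeof x);
    return x;
}

int main(void) {
    /* One "pixel": four channels, i.e. a 128-bit (4 x 32-bit)
     * shader value vs. its 96-bit (4 x 24-bit) counterpart. */
    float rgba32[4] = { 0.7314159f, 0.1234567f, 0.9876543f, 1.0f };
    float rgba24[4];

    for (int i = 0; i < 4; i++)
        rgba24[i] = truncate_to_fp24(rgba32[i]);

    for (int i = 0; i < 4; i++)
        printf("ch %d: fp32 %.7f  ~fp24 %.7f  diff %.7f\n",
               i, rgba32[i], rgba24[i], rgba32[i] - rgba24[i]);
    return 0;
}

Run it and the per-channel error comes out around 1e-5, way below the 1/255 (~0.004) step of 8-bit-per-channel output, which is why the visual difference is so hard to spot.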