The ATI Radeon R3x0 chips support only 24-bit (FP24) precision for PS2.0 shaders.
The Nvidia FX NV3x chips support FX12 (12-bit fixed point), FP16 (16-bit) and FP32 (32-bit) precision for PS2.0 shaders.
The DX9.0 spec states that PS2.0 support must be at least 24-bit, so for the NV3x to meet the spec it must run in FP32 mode. So NV telling developers to use FP16 and FX12 in DX9.0 games is effectively cheating the spec again...but getting a third party to do it for them!
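To see why the precision floor matters, here's a rough sketch comparing the smallest step each format can represent. The mantissa bit counts are the commonly cited layouts (FP16 = s10e5, FP24 = s16e7, FP32 = s23e8); the FX12 layout is an assumption, modelled here as 12-bit fixed point over the roughly [-2, 2) range NV's combiners used.

```python
# Illustrative comparison of shader precision formats (a sketch, not vendor docs).
# FX12 is assumed to be 12-bit fixed point spanning [-2, 2), i.e. 10 fraction bits.

def float_epsilon(mantissa_bits):
    """Relative step between adjacent floating-point values near 1.0."""
    return 2.0 ** -mantissa_bits

def fixed_step(total_bits, range_span=4.0):
    """Absolute step of a fixed-point format spanning range_span."""
    return range_span / (2 ** total_bits)

formats = {
    "FX12 (fixed)": fixed_step(12),   # constant absolute step everywhere
    "FP16": float_epsilon(10),        # 10 mantissa bits
    "FP24": float_epsilon(16),        # 16 mantissa bits (ATI R3x0)
    "FP32": float_epsilon(23),        # 23 mantissa bits (IEEE single)
}

for name, step in formats.items():
    print(f"{name:>12}: step ~ {step:.3e}")
```

On these numbers FP16's step near 1.0 is 64x coarser than FP24's, which is the image-quality gap the spec's 24-bit floor is there to prevent.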
When it comes to OpenGL, however, it's really down to the developers to decide what they want to do. In Carmack's case he's obviously looked at the problem, decided that a drop to FX12 precision for his shaders doesn't produce a hugely inferior image, and coded an NV-specific path. Any NV3x owner should be grateful for this, as you'll get higher FPS than an ATI card at virtually no IQ loss.
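The kind of vendor-specific path selection described above can be sketched as an extension-string check. The extension names (GL_NV_fragment_program, GL_ARB_fragment_program) are real OpenGL extensions, but the helper function and sample strings below are purely illustrative; a real engine would query glGetString(GL_EXTENSIONS) at startup.

```python
# Illustrative sketch of vendor-specific render-path selection, in the spirit
# of Doom 3's NV30/ARB2 paths.  In a real engine the extensions string comes
# from glGetString(GL_EXTENSIONS); here it's a hard-coded sample.

def pick_render_path(extensions: str) -> str:
    ext = set(extensions.split())
    # Prefer the NV path on NV3x: it allows lower-precision (FX12/FP16) shaders.
    if "GL_NV_fragment_program" in ext:
        return "NV30"            # NV-specific, lower precision, faster on NV3x
    if "GL_ARB_fragment_program" in ext:
        return "ARB2"            # vendor-neutral full-precision path
    return "fixed-function"      # no fragment programs available

nv3x_sample = "GL_ARB_fragment_program GL_NV_fragment_program"
r3x0_sample = "GL_ARB_fragment_program GL_ATI_fragment_shader"

print(pick_render_path(nv3x_sample))  # NV30
print(pick_render_path(r3x0_sample))  # ARB2
```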
Sadly, once I saw my £400 NV card being trounced in DX9.0 by a £150 ATI card, the last gasps of my NV loyalty evaporated, and I am now the proud owner of an ATI Radeon 9800 Pro 256...here's to HL2!
Finally, big thanks to Valve for confirming all the "NV3x has piss-poor PS2.0 performance" rumours - someone big had to step up and publicly humiliate NV, otherwise the charade would have continued.