Prez wrote on Dec 11, 2012, 16:58:
It's not a matter of letting someone else choose your settings for you. It's about not having to take the time to find the settings 'sweet spot' for your system that will give the best possible visuals while keeping the game playable. I really don't enjoy the tedious process of turning features on or off one at a time and then seeing what effect it has on my framerate. I'd rather, you know, play the game.
If you have a beast of a machine that laughs at all the latest games on ultra graphics settings then sure, this is a waste. For some of us who build budget rigs because it's all we can afford, however, having a program that tells you what kind of FSAA and anisotropic filtering your machine can handle (if at all) would be a godsend.
That's fair enough, but even so, by what standards is Nvidia going to determine this? What do they consider optimum? What do they consider playable? Is a 10% image quality increase worth a 5% performance drop? Or vice versa? Would you rather have 10 extra fps and live with a few jaggies?
I've always had a pretty damn good graphics card, so I typically just turn everything to ultra, play it for a bit, and then if it's too slow, I'll turn off the obvious performance hogs (SSAO and FSAA). Most of the time that makes it more than playable for me, and I'm fine as long as it doesn't dip under 30 fps.
All these things are really personal preference, which Nvidia isn't going to know. Though I suppose they could eventually program the thing to learn from your preferences.
Has it occurred to all of you hardware gurus that not everyone bothers learning what all of these graphics terms mean or how they work? When people start talking graphics terms my eyes glaze over. I have ZERO interest in any of it.
I... what... but...
You... YOU HEATHEN! YOU FILTHY, UNWASHED... XBOX PLAYER! OUT! OUT I SAY!
Creston