I call bullshit on anyone giving a percentage of the system's "power" that a game uses.
I'm not sure where your skepticism is coming from. As a software developer, I regularly use profiling tools that report in real time the total % of CPU utilization, the amount of memory allocated, the number of running threads, and so on, giving me a pretty good picture of how efficiently my app is using the available resources (i.e. the percentage of the total "power" being used). Why is it so hard to believe that a professional development house would be using the same tools? It's common practice in software development.
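For the curious, here's a minimal sketch of how that kind of real-time CPU readout can be computed, using only Python's standard library (the function names are mine for illustration, not from any particular profiler): you compare how much CPU time the process accumulated against how much wall-clock time elapsed.

```python
import time

def cpu_utilization_percent(work):
    """Estimate this process's CPU utilization while running `work`,
    as a percentage of one core's available time -- a rough version
    of what a profiler's live readout shows."""
    wall_start = time.perf_counter()
    cpu_start = time.process_time()
    work()
    wall = time.perf_counter() - wall_start
    cpu = time.process_time() - cpu_start
    return 100.0 * cpu / wall if wall > 0 else 0.0

def busy():
    # Spin for ~0.1 s of pure computation.
    end = time.perf_counter() + 0.1
    while time.perf_counter() < end:
        pass

def idle():
    # Sleeping consumes almost no CPU time.
    time.sleep(0.1)

print(f"busy loop: ~{cpu_utilization_percent(busy):.0f}% of one core")
print(f"sleep:     ~{cpu_utilization_percent(idle):.0f}% of one core")
```

A busy loop should report close to 100%, while a sleeping process reports near 0% — exactly the kind of gap a dev team watches when they claim a game is "only using X% of the system."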
In other words, the game should look about twice as impressive on the graphical side of things in the sequel?
As graphics are only one part of the equation, it doesn't necessarily follow that all of the additional resources will be directed at better visuals. The extra processing power can be spent on things such as more advanced AI routines or more realistic physics.
This comment was edited on Dec 11, 2008, 18:10.