Oh good. I escaped Zephalephelah's wrath once again. Go me!
Was that rambling mess his "wrath"? I can't even tell what point he's trying to make...
The problem with all these theories is that they are all mind games. 16:9 in a 22” monitor is WAY WAY SMALLER than 4:3 in a 22” monitor.
Yeah, duder... no shit. The surface area of the screen is less, but that's not at all relevant. Take, for example, my laptop. It's a 15" widescreen; if it were a 15" square it would have greater surface area, but it would be totally fucking impractical. The benefits go beyond mobility, though, and are rooted in function. Since English text runs left to right, the screen becomes more practical when it's longer in that dimension.
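If you actually want the numbers instead of hand-waving: here's a quick sketch (plain Python, names are mine) that computes a screen's dimensions from its diagonal and aspect ratio. It confirms the one thing nobody disputes, that a 22" 16:9 panel is roughly 11% smaller in area than a 22" 4:3 panel.

```python
import math

def screen_dims(diagonal, aspect_w, aspect_h):
    """Width, height, and area of a screen from its diagonal and aspect ratio.

    From w = r*h and d^2 = w^2 + h^2 (r = aspect_w/aspect_h):
    h = d / sqrt(1 + r^2), w = r*h, area = w*h.
    """
    r = aspect_w / aspect_h
    h = diagonal / math.sqrt(1 + r * r)
    w = r * h
    return w, h, w * h

# 22" monitors: 4:3 vs 16:9
_, _, area_43 = screen_dims(22, 4, 3)    # ~232.3 sq in
_, _, area_169 = screen_dims(22, 16, 9)  # ~206.8 sq in, about 11% less
```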
Never mind all that, though. The meat of your post is an irrational rant about the evils of LCDs, swearing up and down that they're the lesser cousins of the godly CRT. First of all, this is bullshit. For one, the image on an LCD doesn't flicker the way a CRT's does when displaying a static image. That means you can actually sit in front of your monitor and read through thousands of lines of code without going blind... a substantial benefit when one is forced to do so.
Form factor is another obvious benefit. As is power consumption.
At any rate, your entire post is a rant about the fact that a 17" widescreen has less surface area than a 17" "normal" monitor. No one disputes this, and no one is unaware of it; it's obvious the second you see a widescreen TV next to a normal one. People choose widescreen monitors because they're often more practical: horizontal screen space is more valuable than vertical (the same reason 4:3 monitors aren't 1:1).
They made the game for 4:3 because they made the game on CRTs that were 4:3.
I hope you know that "made the game" 4:3 merely means they set their renderer's viewport to that aspect. It would have been a simple computation to properly adjust the viewing frustum to 16:9 based on the selected resolution, both for cutscene rendering and in-game rendering.
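For the curious, here's roughly how simple that computation is. One common scheme (so-called "Hor+"; the function name is mine, not from any particular engine) fixes the vertical field of view and derives the horizontal FOV from the selected resolution's aspect ratio, so a 16:9 player simply sees more to the sides:

```python
import math

def horizontal_fov(vertical_fov_deg, width, height):
    """Derive the horizontal FOV for a perspective frustum, keeping
    the vertical FOV fixed and widening with the aspect ratio (Hor+).
    """
    aspect = width / height
    v = math.radians(vertical_fov_deg)
    return math.degrees(2 * math.atan(math.tan(v / 2) * aspect))

# With a 60-degree vertical FOV:
# 640x480 (4:3)   -> ~75.2 degrees horizontal
# 1280x720 (16:9) -> ~91.5 degrees horizontal
```

Feed the resulting angles and aspect into the projection matrix and both cutscenes and gameplay render correctly at either ratio, no stretching required.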
Let me tell you, when the next real operating system comes out that truly begins to harness the power of 64bit & you have to go buy another system that can handle 64Gigs of memory, then a thousand gigs, just like I had to with Megabytes; then you'll find yourself upgrading every 6 months to keep up with the technology because technology never ever stops.
You have no fucking idea what you're talking about. Memory is not the bottleneck it was 10 years ago, and neither is bus/instruction width. Games will only marginally benefit from a 64-bit instruction set, as that level of precision is not at all necessary for realtime computation. Yes, there are some hacks that can improve speed, but they amount to parallelizing 32-bit instructions, which would be better left to a multicore CPU or a dedicated vector coprocessor.
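To be concrete about the kind of hack I mean: the classic SWAR trick, packing two independent 32-bit additions into a single 64-bit add. Sketched in Python for clarity (real code would do this in a register); note this naive version assumes neither 32-bit lane overflows, since it does nothing to stop a carry from the low lane bleeding into the high one:

```python
MASK32 = (1 << 32) - 1
MASK64 = (1 << 64) - 1

def packed_add(a_hi, a_lo, b_hi, b_lo):
    """Compute (a_hi + b_hi, a_lo + b_lo) with one 64-bit addition,
    assuming the low-lane sum fits in 32 bits (no carry suppression).
    """
    a = (a_hi << 32) | a_lo           # pack two 32-bit lanes into 64 bits
    b = (b_hi << 32) | b_lo
    s = (a + b) & MASK64              # the single "wide" add
    return (s >> 32) & MASK32, s & MASK32
```

It works, but it's exactly the sort of thing SIMD units and extra cores already do better, which is the point.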
Furthermore, driving/refreshing "64 gigs" of memory would add significant capacitance, which would begin to affect read and write speeds on the address bus.
I can think of no practical benefit to rewriting an OS kernel with 64-bit instructions "in mind"; the work it's doing is not high precision, it's high volume. An OS designed with multiple processors in mind (like BeOS, not that you'd know what that was) would be of some benefit, but the benefit to gamers would still be marginal.
Why am I even posting this...
I eat pasta!