12. Re: Evening Tech Bits Jan 15, 2013, 08:50 Krizzen
NKD wrote on Jan 15, 2013, 03:23:
"64 bit gaming" on the other hand had nothing materialize for it because the phrase "64 bit gaming" is itself a nice nugget of ignorance created by dimwitted gamers and the gaming press when AMD and Intel started releasing the relevant hardware.

I've always felt the same way, but recently I learned the reality of the different "bit" eras of gaming.

WARNING: Massive technobabble below. Skip to near the bottom, or just ignore this and go about your day. Hope this doesn't piss everyone off by being so off-topic.

Indeed, the most relevant example is that a 64-bit CPU can in principle address up to 2^64 bytes, a little more than 18 exabytes (real chips expose less than that, but the ceiling is astronomically higher than 32-bit). By the same token, a 64-bit register can hold an integer up to about 18 quintillion and do arithmetic on two such numbers. Contrast that with a 32-bit architecture: a memory limit of ~4.3 gigabytes, and unsigned integers no larger than ~4.3 billion.
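Those limits fall straight out of the arithmetic, which a few lines of Python can sketch:

```python
# The addressing and value limits quoted above, computed directly.

bits32_max = 2**32 - 1  # largest unsigned 32-bit value
bits64_max = 2**64 - 1  # largest unsigned 64-bit value

print(f"32-bit max value: {bits32_max:,}")            # ~4.3 billion
print(f"64-bit max value: {bits64_max:,}")            # ~18.4 quintillion
print(f"32-bit address space: {2**32 / 10**9:.1f} GB")   # ~4.3 GB
print(f"64-bit address space: {2**64 / 10**18:.1f} EB")  # ~18.4 EB
```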

The real issue with this "64-bit gaming" thing is that we technically haven't transitioned out of 32-bit. Sure, everyday CPUs can operate on several values packed into 128-bit registers at the same time via the SSE extensions, but most compilers treat such code as nonstandard: the optimizations must be explicitly enabled by the programmer, AND the game's code must be set up in a special way to take advantage of the extensions. It's not terribly difficult, but a lot of things don't support CPU extensions like SSE, 3DNow!, or AMD-V out of the box. In particular, the libraries a game depends on, such as DirectX, PhysX, OpenGL, OpenAL, FMOD, and Scaleform (Flash), may not support all CPU optimizations. (I'm not saying these DON'T support extensions, but I'd wager none of them guarantee extension support for all their features.)
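Real SIMD can't be demonstrated from Python, but the idea of packing values into a 128-bit register can be sketched with the standard `struct` module (the 16-byte width here mirrors an XMM register; the packing is just an illustration, not actual SSE execution):

```python
import struct

# One SSE (XMM) register is 128 bits = 16 bytes, so a single SSE
# instruction can process four 32-bit floats at once.
lane = struct.pack("4f", 1.0, 2.0, 3.0, 4.0)
print(len(lane))  # 16 bytes: the width of one 128-bit register

# Scalar code handles one float per instruction; SIMD-aware code
# (what compiler flags like -msse enable) handles all four at once.
```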

[Quick Rant: In my opinion, lack of support for CPU extensions is the LARGEST offender when it comes to games that are just slow and "unoptimized". This is most apparent in nearly ANY game that uses a scripting language. Classically, a script is parsed at runtime, meaning that while the game is running the engine reads each line of script to figure out what it does, then translates it into something it can execute. Since the speed of that translation is VERY important (fast iteration is the main selling point of a scripting engine), the generated code is usually sloppy and doesn't use fancy SSE instructions, which can give LITERALLY (I swear, I'm not kidding) 400% boosts to 3D math; matrix operations, specifically. Examples of technologies whose games have poor performance almost all the time: Flash, using ActionScript; games using Lua scripts; games with lots of XML; game GUIs built on a web or Flash renderer (usually Scaleform or WebKit); JavaScript games; Java games; C# games.

Anyway, those things usually don't even take advantage of SSE1, which came out around 1999. This is relevant because those processor extensions more-or-less paved the way for 64-bit desktop CPUs.]

Oh yeah, another big thing about still being in "32-bit": GPUs operate almost exclusively on 32-bit floats, and older ones physically can't use 64-bit values without software workarounds that would most definitely wreck performance. Recently a LOT of new cards have added 64-bit (double-precision) support, which is good.
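The 32-bit-float limitation is easy to see by round-tripping an integer through single and double precision with Python's `struct` module: a 32-bit float has only 24 bits of significand, so integers past 2^24 can't all be represented.

```python
import struct

def roundtrip(fmt, x):
    """Store x in a float of the given width, then read it back."""
    return struct.unpack(fmt, struct.pack(fmt, x))[0]

n = 2**24 + 1  # 16,777,217: one past float32's exact-integer range
print(roundtrip("f", n))  # 16777216.0 -- the +1 is silently lost
print(roundtrip("d", n))  # 16777217.0 -- a 64-bit double keeps it
```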

So, to put this all in simple terms (and to go a step further):

CPUs have limits imposed by being primarily 32-bit: how much memory you can have in your system, and how big or small numbers can be. Fewer bits mean less precision. For example, in a space game I'm making, I ran into a problem where, when you reached the outside of the galaxy, the ship would start to deform and wobble in strange ways. This is because a 32-bit number just doesn't have the precision to represent the positions of all the vertices of the 3D model that far from the origin. A 64-bit number would have no problem whatsoever.
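That wobble can be reproduced without a game engine. In single precision the gap between adjacent representable numbers grows with magnitude, so fine vertex offsets vanish far from the origin. A sketch, treating coordinates as 32-bit floats via `struct` (the ten-million-unit distance is an arbitrary stand-in for "edge of the galaxy"):

```python
import struct

def f32(x):
    """Round a Python float to the nearest 32-bit float."""
    return struct.unpack("f", struct.pack("f", x))[0]

offset = 0.25  # a small vertex detail, a quarter of a unit

near_origin = f32(0.0 + offset)
far_away    = f32(10_000_000.0 + offset)  # ten million units out

print(near_origin)  # 0.25 -- the detail is preserved
print(far_away)     # 10000000.0 -- the detail rounds away entirely,
                    # so vertices snap to a coarse grid and "wobble"
```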

In the end, true 64-bit gaming would mean video cards with more than ~4.3 GB of memory and, very likely, larger and more detailed game worlds. It would also eliminate the flickering at extreme distances you sometimes see in games. The final, and largest, difference would be in performance, since most graphics cards (except the very latest) still work exclusively on 32-bit numbers.