
Mailbag
August 24, 1998

From: Karl
Subject: MailBag Format Feedback

I find the mailbag section of your page interesting and entertaining, but the two-column format is distracting when I'm trying to read the letters.

From: Seumas McNally
Subject: FPS in the MailBag

What seems to be missing from the discussion about frames per second is motion blur. Motion pictures look realistic at 24 FPS primarily because each frame incorporates motion blur. A film frame isn't usually an infinitely short snapshot of time, but a relatively long exposure covering nearly the entire 24th of a second it represents. Since things are moving during that time, they blur.

There definitely IS an advantage to having a graphics card that can run a game at 100-200 FPS. If your monitor can display at that speed, the scene will look more realistic because your eyes (with their comparatively slow response) will blend the extremely fast updates into blur. Another option is to use the extra horsepower of the graphics card and/or CPU to combine multiple scene updates in the graphics card's buffer into a motion-blurred image before displaying it on the screen. The ultimate for a 60 Hz monitor would be a system that could theoretically compute 240 frames per second (four sub-frames per displayed frame), as it could then display a fully motion-blurred frame 60 times per second, for a viewing experience that would rival a live TV image for realism.
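
To make that accumulation idea concrete, here is a minimal sketch in C: several rendered sub-frames are averaged into the one frame that gets displayed, so a sharp moving dot becomes a smear. The buffer sizes, the grayscale format, and the 4-sub-frame count are all illustrative assumptions, not code from any actual engine.

    /* Motion blur by frame accumulation: average SUBFRAMES rendered
       sub-frames into the single frame that gets displayed.
       All sizes here are made up for illustration. */
    #include <stdio.h>

    #define SUBFRAMES 4            /* e.g. 240 rendered fps on a 60 Hz display */
    #define WIDTH  8
    #define HEIGHT 2
    #define PIXELS (WIDTH * HEIGHT)

    /* Average the sub-frames pixel by pixel (8-bit grayscale). */
    void accumulate(const unsigned char sub[SUBFRAMES][PIXELS],
                    unsigned char out[PIXELS])
    {
        for (int p = 0; p < PIXELS; p++) {
            unsigned int sum = 0;
            for (int f = 0; f < SUBFRAMES; f++)
                sum += sub[f][p];
            out[p] = (unsigned char)(sum / SUBFRAMES);
        }
    }

    int main(void)
    {
        unsigned char sub[SUBFRAMES][PIXELS] = {0};
        unsigned char out[PIXELS];

        /* A bright dot moving one pixel right per sub-frame. */
        for (int f = 0; f < SUBFRAMES; f++)
            sub[f][f] = 255;

        accumulate(sub, out);

        /* The dot is smeared across four pixels: 63 63 63 63 0 0 0 0 */
        for (int p = 0; p < WIDTH; p++)
            printf("%3d ", out[p]);
        printf("\n");
        return 0;
    }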

Knowing game developers, however, it will probably be a long while before we see that sort of smoothness of motion, as they seem keen on using as much of the available graphical power as possible for more detail per frame and higher rendering quality.

From: Jim Collier
Subject: FPS Rebuttal

OK, I must give my rebuttal to the arguments against my FPS stance (namely, that higher FPS *IS* better).

Response to Adam: I never mentioned deathmatch modes or complex areas, so there was nothing there for me to rebut! Of course deathmatch and complexity will slow the FPS down. In fact, when I deathmatch, I cap FPS at 60 to reduce traffic.

Response to Czar: Claims such as "IT IS PROVEN BIOLOGY" don't impress me, nor should they convince anybody. Specific references to such "proof" that we can check out for ourselves might be persuasive. Pseudo-science talk doesn't impress me either. This is not a flame on you, but on your weak argument. Big words and apparent self-inclusion in the scientific community do not a good argument make. You need substance--references to controlled, statistically significant studies. Your argument got confusing after the midway point, and it is also riddled with logical fallacies; for example, the falsely dichotomous question posed at the end.

And you still did not address my "experiment". In fact, I performed another one. I went further this time and made it "DOUBLE-blind" (even *I* didn't know which FPS cap was being tested [until later], nor did I observe the experiment in action). I also increased the sample size from 1 to 5...still nowhere near enough to offer statistical significance, but enough to form the basis of a good, albeit not bulletproof, argument.

Using the "cl_maxfps" setting, each person (none of whom had played Quake II before) tested four modes: 15 fps, 30 fps, 60 fps, and 120 (in random order). Test machine was an overclocked P300, SLI Voodoo2, 1024x768 at 120Hz. Subjects first got acquainted to Quake II in software mode. Then they instructed to notice playability and overall visual quality in 3dfx mode--they were told nothing else, and no mention was made of the concept of FPS. However most of the subjects quickly figured out what was being manipulated, if not able to put it into exact words.

In the 120, 60, and sometimes even 30 fps modes, the actual FPS would occasionally drop below the capped limit (and I seriously doubt it ever came near 120). I'm sure of this for the same reason I'm sure that higher FPS = better: the quantitative and qualitative comments of the test subjects, plus my own observations of the testing environment the subjects were exposed to.

The results? All modes were clearly distinguished and universally ranked in order of preference: 15, 30, 60, 120. To me, the 120 mode didn't show as big an improvement over 60 as the other increments did (and I know this machine never came anywhere close to a true 120 FPS), but it was still quite noticeable. 15 FPS predominantly drew comments such as "annoying" and "disorienting". Almost everyone was amazed by the 120 mode and had no idea computer games were up to that level of realism.

Please explain these results to me, Czar. Tell me how your facts of biology can accommodate (or refute) these results. I didn't take your argument personally, I took it as a challenge to my own convictions. Now they are reinforced, but I'll always be open to better evidence and better arguments to the contrary. Can you provide them?

I must reiterate that I have no argument against the idea that the human eye can only see so much. It makes sense to me. But I disagree that this can be compared to an FPS measurement, although I have nothing to refute such a claim. In fact, given how little the brain is understood, I don't see how anyone could state exactly HOW human vision is limited in that area (although it surely is, somehow). I am not suggesting that my experiment refutes the notion that human eyes max out at some FPS. I'm only suggesting that VIDEO CARD FPS *CAN* be noticed past 90 FPS, and DOES make a difference in the quality of the gaming experience. If there is in fact a maximum "eye FPS", then there must be some other factor involved, such as "brain" motion-blurring (as a hypothetical example).

On Alaric G. Weigle's argument: Thanks for the support, even though it wasn't meant as a favor. I disagree, though, with your statement that games can be enjoyable at a solid, consistent 15 FPS: my experiment suggests this isn't so. My own reaction, trying Quake II with a 15 fps cap (a solid, consistent 15 FPS), was "ugh", even though I HAVE played many games for long stretches at 15 FPS or less.

I agree that ultra-high-FPS movies seem much more real and can induce motion sickness: I watched (many times) a short "demo" movie at an experimental theatre (set up for a short time--strangely enough--in two Showbiz Pizza Places; one in Dallas, one in LA). I think the FPS was 60 (and I think they used regular 35mm film stock). It was absolutely stunning! Unlike anything I'd seen before or have seen since (including IMAX [regular or 3D]). The film rolls were gigantic. They mentioned something about wanting to go higher, but they were at the limits of conventional technology (this was about 10 years ago). I'm convinced the ultimate movie would be shot at 120 FPS, in 3D, and at a digital resolution equivalent to 70mm+ film stock. With DVD and HDTV, the movie industry is eventually going to have to come out of the stone age of 24 FPS [thanks Blue], 35mm analog film movies.

I also agree with Weigle's argument that higher FPS = deathmatch advantage. When I play on the Internet, I manually cap at 60 FPS. When I play on a LAN, I manually cap at 90. BIG difference! Whether a controlled study would reveal more frags or not, I couldn't guess, but I definitely feel much more in control. That confidence alone makes a huge difference.

I know this isn't the end of the issue. But hopefully I've injected a hint of rationality into the topic. I welcome any personal arguments/comments/flames at jimcollier@earthlink.net. Thanks.

From: Mitch Withrow
Subject: Reality in Computer Games

Having read innumerable posts on web boards about which game is the most realistic, I can tell you the answer: none of them.

You only use two senses to play a computer game. In reality you use all your senses.

No computer game is even close to reality, especially those set on other planets. Unreal is a good name for a computer game because it is unreal.

Another issue that torques my jaw is which game has the most realistic AI. The answer is: none of them. After decades of the most intense research by top scientists using the most advanced equipment ever made, we still do not have artificial intelligence that can even compare with human intelligence.

Most of the people engaged in these debates have completely unrealistic expectations of computer games. Developers cannot now, and probably will not for years, be able to deliver true reality or realistic AI. It just isn't possible, now or in the foreseeable future.

Don't get me wrong, I LOVE computer games and I like to see things rendered realistically, but complaining about not having an unlimited number of bullet holes showing in a level is downright stupid. Take a breather from this pointless debate and just go in and kill things, which is the point of the games we love so much.

From: David Chase
Subject: Mailbag

Ok, I couldn't resist here. Alaric's got some good points about FPS rates -- you can notice the difference between 30 FPS and 60 FPS when doing a spinning-180-degree-rocket-jump-backflip in Q2. But he ends his message with the statement: 'The fps argument is very similar to the old (now laughed at) argument of "Why go higher then 16.7 million colors, the human eye can't distinguish the difference anyway"'. Uhm, it cannot. 32-bit color is 24-bit color with an 8-bit transparency (alpha) channel. You still have a maximum of 16.7 million colors.
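
The arithmetic is easy to check for yourself: 24 bits of color is 2^24 = 16,777,216 values, no matter what the extra alpha byte is doing. A quick C sketch (the pixel value is a made-up example):

    /* A 32-bit ARGB pixel: 8 bits of alpha on top of 24 bits of color. */
    #include <stdio.h>

    int main(void)
    {
        unsigned int argb = 0xFF123456;            /* arbitrary example pixel */
        unsigned int a    = (argb >> 24) & 0xFF;   /* alpha channel */
        unsigned int rgb  = argb & 0xFFFFFF;       /* the actual color */

        printf("alpha=%u color=%06X\n", a, rgb);
        printf("distinct colors = %lu\n", 1UL << 24);  /* 16,777,216 */
        return 0;
    }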

From: Pete Ness
Subject: Response to "Ping Plotter - Bad Bad!"

I just checked out the mailbag... Quote from Grendel:

"While this sounds like a good idea, it is really a very poor tool. "

There are no suggestions as to what a BETTER tool might be - or what exactly the "problem" with using a traceroute is. Network administrators have been using traceroute to troubleshoot network problems since the beginning of networking time (OK, maybe not *quite* that long) - and it's a very effective tool.

If Grendel thinks the problem is the TRAFFIC caused by the traceroute, then what does he think about the traffic caused when I play Quake? The traffic generated by programs like Ping Plotter is HUGELY less than Quake's. If I do a quick traceroute before I hop on a server, I'm not affecting network throughput nearly as much with Ping Plotter as I am once I connect to the Quake server.

If Grendel thinks the problem with traceroute software is that it sometimes reports packet loss - because a router under heavy load decides to prioritize out the ICMP (traceroute) packets - then this is also information that I want to know. I'm not sure I want to trust my Quake game to an overloaded router (especially if this particular match is important).

Also, if a router is dropping packets that have just "expired" (because the traceroute software is trying to measure performance from that router), then only that hop in the route should be affected - so you see packet loss on that one hop, but none farther down the line. This is recognizable and offers you information about that route - you KNOW there's a router under heavy load, but you're not losing packets to the final destination. This is excellent information - if load gets higher on that router, there's some chance that other packets will be lost as well.

Technical note: Traceroutes do NOT send ICMP echo requests to every host in the path. A traceroute sends some number of requests addressed to the final destination host - but it changes the TTL (time to live) on each request so that one request "expires" at each hop along the way. The difference here may not sound particularly great, but it is. If you directly PING a host that's a router, you often get no response and widely varying performance. Using an incrementing TTL destined for the final host gives much more reliable numbers - although some routers prioritize out expired packets as well under heavy load.
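
For the curious, here is a bare-bones C sketch of that incrementing-TTL trick (Linux-flavored, needs root for the raw ICMP receive socket, and the default target address is just a placeholder). A real traceroute adds retries, multiple probes per hop, and RTT timing; this only shows the mechanism described above.

    /* Minimal UDP traceroute sketch: every probe is addressed to the
       final destination, but each one's TTL makes it expire at a
       different hop, which then sends back an ICMP error. */
    #include <stdio.h>
    #include <string.h>
    #include <unistd.h>
    #include <arpa/inet.h>
    #include <netinet/ip.h>
    #include <netinet/ip_icmp.h>
    #include <sys/socket.h>
    #include <sys/time.h>

    int main(int argc, char **argv)
    {
        const char *dest = argc > 1 ? argv[1] : "192.0.2.1"; /* placeholder */
        int udp  = socket(AF_INET, SOCK_DGRAM, 0);
        int icmp = socket(AF_INET, SOCK_RAW, IPPROTO_ICMP);  /* needs root */
        if (udp < 0 || icmp < 0) { perror("socket"); return 1; }

        struct timeval tv = { 2, 0 };   /* give each hop 2 seconds to answer */
        setsockopt(icmp, SOL_SOCKET, SO_RCVTIMEO, &tv, sizeof tv);

        struct sockaddr_in to;
        memset(&to, 0, sizeof to);
        to.sin_family = AF_INET;
        to.sin_port   = htons(33434);   /* traditional traceroute port */
        inet_pton(AF_INET, dest, &to.sin_addr);

        for (int ttl = 1; ttl <= 30; ttl++) {
            setsockopt(udp, IPPROTO_IP, IP_TTL, &ttl, sizeof ttl);
            sendto(udp, "probe", 5, 0, (struct sockaddr *)&to, sizeof to);

            char buf[512];
            struct sockaddr_in from;
            socklen_t len = sizeof from;
            ssize_t n = recvfrom(icmp, buf, sizeof buf, 0,
                                 (struct sockaddr *)&from, &len);
            if (n < 0) { printf("%2d  * (no reply)\n", ttl); continue; }

            /* Whoever sent the ICMP error is hop number ttl. */
            printf("%2d  %s\n", ttl, inet_ntoa(from.sin_addr));

            struct iphdr   *ip = (struct iphdr *)buf;
            struct icmphdr *ic = (struct icmphdr *)(buf + ip->ihl * 4);
            if (ic->type == ICMP_DEST_UNREACH)
                break;   /* Port Unreachable: the probe reached the end */
        }
        return 0;
    }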

Obviously, as the author of Ping Plotter, I can't help but respond to something titled: "Ping Plotter - Bad Bad!".

I wrote Ping Plotter specifically because my ISP would have really crappy performance in the evenings - but it would be OK during the day. I effectively used it to prove to my ISP that there was a network problem - after which they upgraded their connection to the internet. (Maybe it wasn't TOTALLY me that caused them to do this :), but my daily bitch mails with exact information probably helped).

I mean no disrespect to Grendel here. It may be that he has knowledge that I don't.

From: Danate[WCS]
Subject: Games, HTML, And Dr. Pepper???? what! man Get some Dew it's the drink of Champions. GIMME MY CAFFEINE!!!!!!!!!!@#!$#%#$%#$^%

Title says it all......

From: Justin Williams
Subject: YES! Dr.Pepper! Rock on man!!!

>=)