Eric S. Raymond, the Open Source guru,
comments
on Slashdot (thanks Jacek Fedoryński) on the controversy, described in
this Slashdot article, over the potential for cheating raised by the
release of the Quake source code (
story). The new article discusses cheating in computer games in
general and, in particular, one of John Carmack's proposed ways of
addressing it (
story): a closed-source launcher for this now open-source game. Here's a bit:
Carmack's argument seems watertight. What's wrong with this picture? Are
we really looking at a demonstration that closed source is necessary for
security? And if not, what can we learn about securing our systems from the
Quake case?
I think one major lesson is simple. It's this: if you want a really secure
system, you can't trade away security to get performance. Quake makes this trade
by sending anticipatory information for the client to cache in order to lower
its update rate. Carmack read this essay in draft and commented "With a
sub-100 msec ping and extremely steady latency, it would be possible to force a
synchronous update with no extra information at all, but in the world of 200-400
msec latency [and] low bandwidth modems, it just plain wouldn't work." So
it may have been a necessary choice under the constraints for which Quake was
designed, but it violates the first rule of good security design: minimum
disclosure.
When you do that, you should expect to get cracked, whether your client is open
or closed -- and, indeed, Carmack himself points out that the see-around-corners
cheat can be implemented by a scanner proxy sitting between a closed client and
the server and filtering communications from server to client.
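The "minimum disclosure" fix Raymond is arguing for can be sketched in a few lines: instead of sending anticipatory state for the client to cache, the server runs a visibility test and sends each client only the entities it could legitimately see. This is only an illustration of the principle, not Quake's actual netcode; the function names and the crude grid-based line-of-sight test are invented for the example.

```python
# Hypothetical sketch of server-side "minimum disclosure": filter entity
# updates by visibility before they ever leave the server, so a hacked
# client (or a proxy between client and server) never receives data
# about opponents hidden behind walls.

def visible(viewer, target, walls):
    """Crude 2D line-of-sight test: sample points along the segment
    from viewer to target and fail if any lands in a wall cell."""
    steps = 20
    for i in range(steps + 1):
        t = i / steps
        x = viewer[0] + t * (target[0] - viewer[0])
        y = viewer[1] + t * (target[1] - viewer[1])
        if (int(x), int(y)) in walls:
            return False
    return True

def build_update(viewer, entities, walls):
    """Build the per-client update containing only visible entities."""
    return [e for e in entities if visible(viewer, e["pos"], walls)]

walls = {(5, y) for y in range(10)}            # a wall along x == 5
entities = [
    {"name": "ally",  "pos": (3.0, 3.0)},      # same side of the wall
    {"name": "enemy", "pos": (8.0, 3.0)},      # hidden behind the wall
]
update = build_update((1.0, 3.0), entities, walls)  # only "ally" survives
```

The trade-off Carmack describes is exactly why Quake doesn't do this: a strictly synchronous, need-to-know update stream works at sub-100 ms ping but falls apart on 200-400 ms modem links, where the client must predict from cached state.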
Here's
how the article concludes:
To recap, the real lessons of the Quake cheats
are (a) never trust a client program to be honest, (b) you can't have real
security if you trade it away to get performance, (c) real security comes not
from obscurity but from minimum disclosure, and most importantly (d) only open
source can force designers to use provably secure methods.
So, far from being a telling strike against open source, the case of the Quake
cheats actually highlights the kinds of biases and subtle design errors that
creep into software when it's designed for closed-source distribution and
performance at the expense of security. These may be something we can live with
in a shoot-em-up, but they're not tolerable in the running gears of the
information economy. Avoiding them is, in fact, a good reason for software
consumers to demand open source for anything more mission-critical than
a Quake game.
Finally (for the moment), the
preliminary
QuakeWorld Forever proposal describes plans to use "a Netrek type
system of blessed 'binaries'" to create QuakeWorld security based on
concepts found in this
mostly
unedited
log of a chat on #qwforever on GamesNET (209.1.245.35:6667).
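The "blessed binaries" idea can be illustrated with a simplified challenge/response: the server sends a random nonce and the client answers with a digest computed over the nonce plus its own executable, so a canned answer can't be replayed. This is only a sketch of the concept (Netrek's real verification is based on per-client RSA keys, not a bare hash), and all names here are invented for the example. Note that Raymond's lesson (a) still applies: a modified client can compute the same response from a pristine copy of the blessed binary, so schemes like this raise the bar rather than provide real security.

```python
# Simplified, hypothetical illustration of Netrek-style "blessed binary"
# checking via a hash challenge/response. NOT Netrek's actual protocol.
import hashlib
import os

def client_response(binary_bytes, challenge):
    """Blessed client hashes a server nonce together with its own
    executable image; the nonce prevents replaying an old answer."""
    return hashlib.sha256(challenge + binary_bytes).hexdigest()

def server_verify(blessed_bytes, challenge, response):
    """Server recomputes the expected digest from its own copy of the
    blessed build and compares."""
    expected = hashlib.sha256(challenge + blessed_bytes).hexdigest()
    return response == expected

blessed = b"official client build (stand-in bytes for the example)"
challenge = os.urandom(16)                      # fresh nonce per login
resp = client_response(blessed, challenge)
ok = server_verify(blessed, challenge, resp)    # honest client passes
```

A tampered binary produces a different digest and fails the check; the weakness is that nothing forces the responding program to hash the binary it is actually running.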
Here
is a copy of the email they sent along.