Beamer wrote on Mar 11, 2013, 12:25:
There's almost definitely no reason why they couldn't handle it.
But coding something for a server and coding something for a home system tends to be very, very different. It isn't as simple as just flipping a switch.
If you never intended to turn off server-side calculations, why have someone put a few weeks into putting that ability in?
How, specifically, is it different?
Are you talking about the network traffic layer? Because if that's it, then the solution is simple - the game "server" runs as a service on your local desktop, sending and receiving traffic over the same network layer it already uses, just bound to localhost.
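To make that concrete, here's a toy sketch of what "same netcode, just pointed at localhost" means - my own throwaway Python, not anything from the actual game. The serve/accept loop is identical whether it runs in a datacenter or as a background process on the player's desktop; the only thing that changes is the address the client dials.

    import socket
    import threading

    HOST, PORT = "127.0.0.1", 5000  # loopback; a hosted deployment would bind a public address instead

    # The "server" side: bind and listen up front so the client below can't race it.
    srv = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
    srv.setsockopt(socket.SOL_SOCKET, socket.SO_REUSEADDR, 1)
    srv.bind((HOST, PORT))
    srv.listen()

    def serve_one():
        # Same accept/recv/send loop whether it lives in a datacenter
        # or runs as a background service on the player's machine.
        conn, _ = srv.accept()
        with conn:
            request = conn.recv(1024)                      # e.g. a simulation tick request
            conn.sendall(b"computed result for " + request)

    threading.Thread(target=serve_one).start()

    # The client-side netcode is untouched; only the address it connects to changes.
    with socket.socket(socket.AF_INET, socket.SOCK_STREAM) as cli:
        cli.connect((HOST, PORT))
        cli.sendall(b"tick 42")
        print(cli.recv(1024))
    srv.close()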
Are you saying that there is something different in how it is compiled? Modern servers aren't mystical mainframes with a different architecture than a desktop. Both are Von Neumann machines using very similar CPU and memory architectures (often the same). Modern code is compiled for the processor, not the machine class (desktop, server). If it runs in one place, it will run in the other.
Are you saying that the code itself couldn't run as a single instance, only as many? A system coded to scale across multiple instances must, by definition, also work as a single instance. How else would the service spool up for the first player?
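Again, purely to illustrate the logic (hypothetical names, nothing to do with their actual architecture): any design that spools instances up on demand gets the single-instance case for free, because the first player exercises exactly the same code path as the ten-thousandth.

    # Toy on-demand instance manager - invented names, not the game's real code.
    class SimInstance:
        def __init__(self, city_id: str):
            self.city_id = city_id

        def tick(self) -> str:
            return f"simulated one step of {self.city_id}"

    class InstanceManager:
        def __init__(self):
            self._instances = {}

        def get_or_spawn(self, city_id: str) -> SimInstance:
            # Scaling to N players means spawning N of these; the very first
            # player hits exactly the same code path with N = 1.
            if city_id not in self._instances:
                self._instances[city_id] = SimInstance(city_id)
            return self._instances[city_id]

    manager = InstanceManager()
    print(manager.get_or_spawn("my_city").tick())   # the single-player case, for free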
Maybe the server code is written for a *nix platform? That would throw a wrinkle in things.
Maybe there are server-side-only assets? Bundle them up and ship them out. Not terribly difficult.
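And "bundle them up" is about as dull as it sounds - something along these lines, run once at build time, with obviously made-up paths:

    import zipfile
    from pathlib import Path

    # Hypothetical layout: whatever data currently lives only on the server
    # gets zipped into the client install package at build time.
    ASSET_DIR = Path("server_only_assets")              # assumed folder of server-side data files
    BUNDLE = Path("client_install/server_assets.zip")

    BUNDLE.parent.mkdir(parents=True, exist_ok=True)
    with zipfile.ZipFile(BUNDLE, "w", compression=zipfile.ZIP_DEFLATED) as bundle:
        for asset in ASSET_DIR.rglob("*"):
            if asset.is_file():
                bundle.write(asset, asset.relative_to(ASSET_DIR))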
About the only other thing I can see is that there might be a minimum CPU or RAM spec to run the server "service", but as I've said in other posts, that simply cannot be the case here. There is no financial model that would support near-monopolizing a modern server's CPU and RAM for a single $60 purchase.