Maxis tells its players that they’ve moved some of the intensive computations for the new SimCity off your computer and onto their servers. Isn’t that nice of them? Of course, that means you can’t play the game without an internet connection, and that’s caused long delays as players queue up for a spot on Maxis’ servers to play an ostensibly offline game. Then rumors surface that the online portion doesn’t actually do anything. Then it’s proven, and the gaming world blows up in a rage.
The thing I don’t get is why anyone fell for this in the first place. You’re telling me you have a super server that can receive data from thousands of simultaneous players all across the world, process it, and send it back over a potentially slow internet connection faster than those individual computers could just do the processing themselves? Even for the simplest computation, transfer speeds alone are going to be a prohibitive bottleneck dragging down frame rates, regardless of how fast the server-side processing is.
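To put some rough numbers on that bottleneck, here’s a back-of-envelope sketch. Every figure in it (payload size, bandwidth, round-trip time, compute times) is an illustrative assumption, not a measurement of SimCity:

```python
# Back-of-envelope: when is offloading a computation to a server faster?
# All numbers here are illustrative assumptions, not measurements.

def offload_time(payload_bytes, bandwidth_bps, rtt_s, server_compute_s):
    """Time to ship data to a server, compute there, and get results back."""
    transfer = 2 * payload_bytes * 8 / bandwidth_bps  # upload + download
    return rtt_s + transfer + server_compute_s

def local_time(local_compute_s):
    """Time to just do the work on the player's own machine."""
    return local_compute_s

# Suppose a simulation tick involves 1 MB of state, shipped over a
# 10 Mbps link with 50 ms round-trip latency, and the server is ten
# times faster than the local machine (5 ms vs. 50 ms per tick).
remote = offload_time(1_000_000, 10_000_000, 0.050, 0.005)
local = local_time(0.050)

print(f"offloaded: {remote * 1000:.0f} ms per tick")  # 1655 ms
print(f"local:     {local * 1000:.0f} ms per tick")   # 50 ms
```

Even with a server ten times faster than the player’s machine, the transfer time swamps the compute time by an order of magnitude, which is the whole point: for frame-rate-sensitive work, the wire is the bottleneck.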
I worked briefly for Sun in the late 90s. Back then they had a system where you didn’t have an individual computer; you had a dumb terminal, and all your files and applications lived on the server. It was an interesting idea that only really worked when no one was asking much of the system. And that was all on a local network. Earlier systems were even worse, with tales of a single user hitting compile sending everyone else off on a break.
But this isn’t then, it’s now, and now we have virtual machines and linked multi-core processors sharing the load. Surely we’re past the days of the end user feeling server load, right? Of course not. The moment the number of requested computations exceeds the number of processors available, there has to be some queuing and waiting going on in the system, which, by the way, occupies at least part of another processor to manage. (The solution becomes part of the problem!)
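That queuing effect is easy to see with a toy model. This sketch assumes made-up numbers and a deliberately simplified FIFO scheme, where each processor runs one fixed-length job at a time:

```python
# Toy model of server-side queuing: once requests outnumber processors,
# latecomers wait for a free slot. Numbers are illustrative assumptions.

def completion_times(num_requests, num_processors, job_seconds):
    """FIFO queue: each processor runs one fixed-length job at a time.
    Returns when each request finishes, in seconds from the start."""
    finish = []
    for i in range(num_requests):
        wave = i // num_processors  # full batches that run before this job's slot
        finish.append((wave + 1) * job_seconds)
    return finish

# 8 processors, 1-second jobs:
print(completion_times(8, 8, 1.0)[-1])   # 1.0 -- last of 8 finishes on time
print(completion_times(80, 8, 1.0)[-1])  # 10.0 -- last of 80 waits 9 extra seconds
```

The moment demand exceeds capacity, latency for the unlucky players at the back of the queue grows linearly with the overload, which is exactly what the launch-day login queues looked like from the outside.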
The one place I’ve seen this sort of offloading of processing work, partially, is voice recognition services like Siri or Android’s RecognitionListener (and its many applications). In these cases the computers in people’s hands literally do not have the power to do the necessary processing, and the second or two it takes to transfer, process, and send a response back is totally acceptable and completely functional. Other than that, the only good application of the cloud in general I’ve seen is data backup, but there has to be local storage to match or forget it. The last thing I want is for my internet connection to go down and to lose access to my data.
What this boils down to is DRM, and DRM sucks. Either they want the user’s computer to check in and verify its legitimacy, or they’re paranoid that if they release their program someone will reverse engineer it and find out how it works. Granted, this is a perfect scheme for keeping your trade secrets, particularly if they’re not all that impressive computationally, and it’s employed by many SaaS applications. But it sucks from an end user perspective. And there’s the rub. As companies try harder and harder to tighten their grip, we’ll see this sort of paranoid option driving the decision makers. Of course, this means the solution is the same as in other cases: don’t buy it. Turn your back on the marketing magnet, make the sacrifice, don’t buy or play games that force an always-on internet connection, seek out alternatives to SaaS applications and use them, and let these bad decisions fester and die.