Is cloud gaming ecological?

I’ve recently stumbled upon a couple of new companies like OnLive or Gaikai (demo) whose primary business model is to stream video games hosted in huge server farms (the “clouds”) over broadband networks to everyone’s low-powered home computer. And this business model makes me think, not least because today’s video platforms like YouTube already take a huge slice of global bandwidth (somebody calculated this for YouTube last year, before they started high-quality video streaming, and estimated that they stream about 126 petabytes a month).
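
Just to get a feeling for what that estimate means, here is a tiny back-of-the-envelope conversion (the 126 petabytes per month is the figure quoted above, not my own measurement; the calculation assumes a 30-day month):

```python
# Convert the quoted 126 PB/month estimate into a sustained data rate.

PETABYTE = 10**15                  # bytes, decimal prefix
SECONDS_PER_MONTH = 30 * 24 * 3600

monthly_bytes = 126 * PETABYTE
avg_bytes_per_second = monthly_bytes / SECONDS_PER_MONTH

print(f"average throughput: {avg_bytes_per_second / 10**9:.1f} GB/s")
print(f"                  = {avg_bytes_per_second * 8 / 10**9:.0f} Gbit/s")
```

That comes out at roughly 49 GB/s, or close to 390 Gbit/s, around the clock.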

No, it also makes me think about ecological issues. Let us compare the possible energy consumption of a “traditional” gamer with that of a (possible) future online gamer who uses one of these services. I won’t and can’t give you detailed numbers here, but you can probably see where I am heading if you have ever read Saul Griffith’s Game Plan – it’s all about getting the full picture of things.

Let’s start with the traditional gamer, who has a stationary PC or laptop with a built-in 3D graphics card, processor and sound system. When he plays video games, all of his components are very busy: the CPU is calculating the AI and game logic, the graphics card is processing the pixel and vertex shaders, rendering billions of pixels, vertices and polygons every second into a digital image stream, which is then sent to the user’s monitor at the highest possible frame rate. A sound system outputs the game’s voices, music and sound effects with the help of the computer’s built-in sound card. As I said, I can’t give you a final number here – every setup is a little different – but you can probably get an idea how much power even an average gamer’s setup draws: several hundred watts.
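
To make “several hundred watts” a bit more tangible, here is a minimal sketch of such a setup’s power draw; every wattage in it is an assumption for an average rig, not a measurement:

```python
# Rough sketch of a traditional gamer's power draw per hour of play.
# All component wattages below are assumptions, not measurements.

components_watts = {
    "CPU under load": 90,
    "graphics card under load": 150,
    "mainboard, RAM, drives": 60,
    "monitor": 50,
    "speakers / sound": 20,
}

total_watts = sum(components_watts.values())   # 370 W with these numbers
kwh_per_hour = total_watts / 1000              # energy used per hour of gaming

print(f"total draw: {total_watts} W")
print(f"energy per hour of play: {kwh_per_hour:.2f} kWh")
```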

How does the online gamer compare to that? Well, at first glance it looks good. The only things this gamer’s computer has to process are video and sound, and the video merely has to be decoded from a regular encoded digital format. Most PCs, even those with lower clock rates, can accomplish this task. The sound will, by today’s standards, probably be simple stereo, so no need for a dedicated sound processor or a big sound setup either. I’d guess the usual consumption for this setup would be less than one hundred watts. Sounds great? Maybe, but maybe not.

The thing is that the video signal itself has to be generated first – on a high-end machine or “cloud” of computers. This means that the required graphics and CPU power consumption has moved from the “client” – the gamer’s PC – to a “server” component; it did not simply vanish. It is no longer a single computer consuming energy to let the user play, but possibly a whole server park. And the parts of that park which process the game’s content need extra power. I don’t know how much, but I bet it won’t be little.
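
A small sketch of the “it did not simply vanish” argument; the per-session share of the server’s power draw is a pure assumption here, I have no real figure for it:

```python
# Sketch: the rendering work does not disappear, it moves to the server.
# All numbers are assumptions purely for illustration.

local_rig_watts = 370        # assumed full-load draw of a traditional gaming PC
thin_client_watts = 80       # assumed draw of a PC that only decodes video

server_share_watts = 300     # assumed share of server power attributable
                             # to rendering ONE user's game session

cloud_total = thin_client_watts + server_share_watts

print(f"traditional gamer: {local_rig_watts} W")
print(f"cloud gamer:       {cloud_total} W "
      f"({thin_client_watts} W at home + {server_share_watts} W in the data centre)")
```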

OK, you might say that server farms are better suited for these kinds of tasks, because virtualizing these computing-intensive tasks would let you run several server instances in parallel and therefore use their power consumption more ecologically… But wait, what gets virtualized here is not a simple web server that idles most of the time – we’re speaking of game virtualization. Remember how the single user’s PC was under full load while computing the game’s content? And how much can the program code of a game that was written to run on a single PC really be virtualized and parallelized? Does every one of these online gaming clients need dedicated hardware in the end…?
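
To illustrate why I doubt the usual consolidation benefits apply, here is a toy comparison; the load figures are assumptions purely for illustration:

```python
# Why game streaming consolidates worse than ordinary web hosting:
# an idling web app shares a server easily, a running game does not.
# Load percentages are illustrative assumptions only.

server_capacity_pct = 100
web_app_avg_load_pct = 5        # an idling web app: ~5% of a server on average
game_session_load_pct = 90      # a running game: close to full load

web_apps_per_server = server_capacity_pct // web_app_avg_load_pct        # 20
game_sessions_per_server = server_capacity_pct // game_session_load_pct  # 1

print(f"web apps per server:      {web_apps_per_server}")
print(f"game sessions per server: {game_sessions_per_server}")
```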

Now, let’s assume the services somehow managed to work around these problems smartly – the online gamer’s power consumption footprint has of course already grown, because we learned that his video signal had to be created somewhere else first, which might have cost a lot of power. But we’re still not there – the signal is still in the “cloud” – and it’s huge! Uncompressed true-colour video, even at the – by today’s standards – lower resolution of 1024 by 768 pixels, takes about 75 megabytes per second for a smooth experience! Hell, if I get a 1 MB/s download rate today I’m already happy…
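
For anyone who wants to check that number, here is the arithmetic behind it, assuming 24-bit colour and a frame rate of roughly 32 frames per second:

```python
# Where the "~75 MB/s uncompressed" figure comes from:
# raw 24-bit frames at 1024x768 and an assumed "smooth" frame rate.

width, height = 1024, 768
bytes_per_pixel = 3            # true colour, 24 bit, no alpha
fps = 32                       # assumed frame rate for a smooth experience

bytes_per_frame = width * height * bytes_per_pixel   # ~2.36 MB per frame
bytes_per_second = bytes_per_frame * fps

print(f"per frame:  {bytes_per_frame / 10**6:.2f} MB")
print(f"per second: {bytes_per_second / 10**6:.1f} MB/s "
      f"(~{bytes_per_second * 8 / 10**6:.0f} Mbit/s)")
```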

So, of course, the video signal needs to be compressed. While the later decompression is not that costly, the compression is – especially for real-time video – and it takes lots of processing power and a very good codec like H.264. Special, dedicated hardware might do this task faster than the average Joe’s PC components, but this hardware still needs extra power, which we have to take into account.
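
Just to show what the encoder is up against: a quick sketch of the required compression ratio, where the 5 Mbit/s target is an assumed bitrate for an H.264-style stream at this resolution, not a figure from any of these services:

```python
# Required compression ratio for a streamable game video signal.
# The target bitrate is an assumption, not a vendor specification.

uncompressed_mbit_s = 604     # raw 1024x768 @ ~32 fps, from the sketch above
stream_mbit_s = 5             # assumed bitrate of the compressed game stream

ratio = uncompressed_mbit_s / stream_mbit_s
print(f"required compression ratio: roughly {ratio:.0f}:1")
```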

Are we done with the online gamer? Well, almost. The video signal is created, compressed and ready for transport, but it hasn’t been transported yet. We need the good old internet for this, to send the constant, huge stream of packets over dozens of hops and myriads of cables to him. Every extra piece of hardware that is needed for this extra network load again needs hundreds, if not thousands of watts. Not exclusively for this one stream, of course, but the share of bandwidth and power these components spend is surely different depending on whether you browse a website, listen to online radio or stream a full-screen video.
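
A very rough sketch of the transmission side; both the network’s energy intensity (kWh per gigabyte) and the session length are assumptions, and published estimates for the former vary by an order of magnitude:

```python
# Very rough sketch of the network transmission cost of a play session.
# Bitrate, session length and kWh-per-GB figure are all assumptions.

stream_mbit_s = 5          # assumed compressed stream bitrate
hours_of_play = 2          # assumed length of one gaming session
kwh_per_gb = 0.2           # assumed network energy intensity (illustrative)

gb_transferred = stream_mbit_s / 8 * 3600 * hours_of_play / 1000
network_kwh = gb_transferred * kwh_per_gb

print(f"data transferred: {gb_transferred:.1f} GB")
print(f"network energy:   {network_kwh:.2f} kWh for {hours_of_play} h of play")
```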

As I said multiple times, I can’t give you any detailed numbers, but I really, really have a bad feeling that the whole idea of game virtualization is just a big, fat waste of resources – especially energy.

One thought on “Is cloud gaming ecological?”

  1. I have just come across your article while considering the pros and cons of current GPU wattage, and the future of cloud computing.

    The fastest graphics card on the planet, the Nvidia GTX 295, reaches 289 watts at peak usage, which places the total wattage of the PC at around 500 watts when gaming at home. Add to this the fact that, on occasion, you will be participating in online multiplayer sessions, and the total energy consumption is immense.

    Now consider how much this same home desktop computer consumes when idle. Unfortunately, most GPUs do not scale down very well and achieve only moderate energy savings when performing regular browsing and office duties. Compared to an Energy Star-rated business laptop, they consume far more energy. My X61 draws around 15 watts and is therefore incomparably more economical than my desktop performing light-weight tasks (~250 watts), as long as I do not replace it before it dies, following present consumerist trends.

    If I could ditch my desktop (which is for gaming, web dev, and video) in favour of diminished performance in video but with the ability to offload gaming hardware requirements, then the benefit to me personally would be great, given say a futuristic 500Mb/s connection (the maximum available to me is 20Mb/s on a fibre-optic connection to my flat).

    I would like to know how much energy is consumed in data transmission when process-intensive tasks are offloaded to the cloud, but I suspect the benefits will still be quite significant, given that my gaming desktop will no longer be used. When you consider that gaming hardware and energy requirements (now ~500 W at the top end) will only increase, surely it is best to offload data processing to server farms in your city (like Gaikai in LA) or others located close by.