Tech Focus: Optimising for the Cloud

Digital Foundry on understanding the challenges of cloud gaming development

OnLive almost certainly suggests a certain spec level for developers to adhere to, and the key here would be to optimise for that platform and get as close to 60Hz performance as possible. While OnLive beams a 60Hz 720p video stream to its customers, it's fair to say that only a few games consistently hit the target 60FPS, resulting in judder in motion and variance in response. In our testing, Assassin's Creed 2 was a particular offender: we saw response between 150ms and 216ms - and that's without factoring in additional lag from flat-panel displays, which can be anything from 16ms to over 50ms.
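
To put those figures in context, the stream's measured latency and the display's own processing lag simply stack. A quick back-of-the-envelope sum - using only the ranges quoted above - shows the response the player actually perceives:

    # Rough end-to-end sum using the ranges measured above (all in ms).
    stream_best, stream_worst = 150, 216   # OnLive response, best to worst
    panel_best, panel_worst = 16, 50       # typical flat-panel display lag

    print(f"Perceived response: {stream_best + panel_best}ms "
          f"to {stream_worst + panel_worst}ms")
    # -> Perceived response: 166ms to 266ms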

Moving to a consistent 60Hz will undoubtedly help reduce lag, but there are plenty of programming techniques that can cut internal latency too - our recent Battle Against Latency feature describes how lag can be measured, and looks at how Criterion Games reduced lag in Need for Speed: Hot Pursuit (running at 30FPS) to within one frame of the controller response of the 60FPS Burnout Paradise.
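
The feature doesn't spell out every technique here, but one common approach to trimming internal latency is to reorder the frame so that the controller is sampled as late as possible, after all the input-independent work is done. A minimal sketch of the idea - every function name below is illustrative, not taken from any real engine:

    import time

    FRAME_TIME = 1.0 / 30.0  # a 30FPS frame budget, as with Hot Pursuit

    def game_loop(update_background, poll_input, simulate, render, present):
        while True:
            frame_start = time.monotonic()
            update_background()       # input-independent work first:
                                      # audio, particles, world streaming
            state = poll_input()      # sample the pad as late as possible...
            world = simulate(state)   # ...so the sim acts on fresh input
            present(render(world))    # hand the frame off for display
            # sleep away only whatever remains of the frame budget
            time.sleep(max(0.0, FRAME_TIME - (time.monotonic() - frame_start)))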

At the end of the day, the exercise here is to recognise that the cloud adds an additional overhead to what's already in the game, so the lower the internal latency, the less noticeable it will be once it's plumbed into the cloud architecture.

Programmer-side optimisations can definitely make a difference. The PC version of Need for Speed: Hot Pursuit could easily hit a sustained 60Hz on what we believe is the target OnLive platform, and it offers a phenomenal response of just 50ms - pretty much a best-case scenario for adding cloud infrastructure on top. By contrast, with Epic's Bulletstorm we measured controller latency at 83ms (on a Core i7/GTX 580 system) up against 133ms on Xbox 360. That's a difference of three frames, or 50ms, and cloud architecture can get a hell of a lot done in that window: the frame could be fully encoded and well on its way to the client in that timescale. It's by making the most of this difference that the latency gap between the home and cloud experiences can be plugged.
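
In frame terms, the arithmetic is worth making explicit as a quick sanity-check of the figures above:

    FRAME_MS = 1000 / 60                 # ~16.7ms per frame at 60Hz

    pc_latency_ms = 83                   # Bulletstorm, Core i7/GTX 580
    xbox360_latency_ms = 133             # Bulletstorm, Xbox 360

    headroom_ms = xbox360_latency_ms - pc_latency_ms
    print(f"{headroom_ms}ms of headroom = "
          f"{headroom_ms / FRAME_MS:.1f} frames at 60Hz")
    # -> 50ms of headroom = 3.0 frames at 60Hz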

At this point, the question is: is all the extra effort really worth it? Will the target customer notice or even care? From a visual perspective, there are literally hundreds of millions - perhaps billions - of people who would happily watch a movie on YouTube rather than the same film on DVD or Blu-ray. They are happy to exchange fidelity for convenience, and in a world where even standard-def satellite MPEG-2 TV imagery is awash with macroblocks, it may well be that non-gamers don't really care so much about the quality of the visuals so long as the experience is fun.

Similarly, an OnLive latency of 150ms-200ms may well have zero relevance to this audience as long as the game "feels" playable - and, predictably, company boss Steve Perlman has advanced this line of argument in recent comments. However, it is easy to dismiss casual gamers as non-discerning players, and I feel there is a real danger in doing so.

Our input latency measurements from the US launch of OnLive. In a best case scenario we get 150ms. Not great by local standards, but pretty much a bona fide miracle from a technological standpoint. However, the variations on a title by title basis can be alarming.

Let's talk about latency first. It's all very well to describe a game as being "playable", but this does a disservice to a lot of the nuances that designers put into their all-important control systems. If latency isn't an issue, why did Criterion put so much effort into making the most responsive racing game of the modern age? After all, Need for Speed is a brand with a great deal of penetration into the mainstream audience that OnLive must surely covet. Why did Guerrilla Games cut down lag so much between Killzone 2 and Killzone 3? The answer in both cases is really straightforward: because it matters, because it made the games more fun.

The reality is that there's a gulf between a game being functionally playable and feeling "right". In my recent OnLive test session on Eurogamer's 100Mbps line, Red Faction: Armageddon simply wasn't as fun as it was on 360 and PS3 because aiming - the most basic game mechanic - was compromised to a certain extent by the lag, and it's difficult to imagine how Volition could have improved it. I'm reminded of how Guerrilla Games tried and ultimately failed to reduce Killzone 2's controller lag (which at 150ms was identical to the best I've personally seen from OnLive), only resolving it via extensive rewrites for the sequel. This raises an interesting question: if a game is off, if it doesn't feel right, is the user more likely to blame the game and the developer rather than the infrastructure?

In the here and now, developers and publishers face interesting choices: does the current strategy of porting to OnLive and, to an extent, hoping for the best actually work? Can image quality and performance be improved along some of the lines suggested, or should it simply be accepted that some games work rather well on the current delivery system while others produce a sub-optimal experience, and should perhaps remain on conventional platforms only?

Going forward, if the future is a cloud-based system, it will be interesting to see how games built from the ground up for this delivery system benefit from bespoke optimisation. OnLive essentially works by interfacing a GPU with a hardware encoder - but what if the developer were in charge of the video encoding directly? Longer term, it'll also be interesting to see whether compressed video is the best transmission system, or whether server-side power can be used in tandem with a client-side renderer, which would at least help to tackle the picture quality issue.
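
To make that architectural point concrete, here's a heavily simplified sketch of the two approaches - every function below is a stand-in for illustration, not OnLive's actual interface:

    # Naive pipeline: the stages run back to back, so their costs
    # add straight onto latency.
    def stream_frame_serial(render_frame, encode, send):
        frame = render_frame()      # GPU finishes the whole frame first
        send(encode(frame))         # only then does encoding begin

    # Developer-controlled encoding: finished slices of the frame are
    # compressed and sent while the GPU is still rendering the rest,
    # hiding much of the encode cost inside the render time.
    def stream_frame_sliced(render_slices, encode, send):
        for frame_slice in render_slices():   # yields slices as completed
            send(encode(frame_slice))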

Right now, OnLive deserves credit for coming up with a system that "works". In a sense the company is trailblazing a whole new way of delivering gameplay. But is it the right way, and will being first matter? It's early days for the tech, and I'd be hugely intrigued to see the sort of approach to the cloud the major console platform holders choose to adopt in the future…
