Tech Focus: MotorStorm Apocalypse

From first impressions of the PS3 to stereoscopic 3D and 1080p

Digital Foundry: MotorStorm has traditionally been quite remarkable in terms of the number of vehicles you run, along with their diversity. How do you render so many vehicles at any one time? Are dynamic LODs (levels of detail) in play?
Andy Seymour

Yes. We use LODs based on average polygon size projected to screen area. We give the artists bias controls to fine-tune the results. We also use shader LODs to help further, rejecting shader code that isn't required at distance.

Oli Wright

Drawing lots of tiny polygons is really inefficient. Our LOD algorithm looks at the average polygon size in a mesh, and then looks at how big that average polygon would be when projected onto the screen. It then chooses an appropriate LOD based on that. It does a pretty good job overall and largely automates the process of determining when LOD switching should take place. It doesn't always work though, so the artists always have the option to override it.
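
As a rough illustration of that heuristic - a minimal C++ sketch with made-up names and thresholds, not Evolution's actual code - the selection might look like this: estimate how big the mesh's average triangle would be on screen at the current distance, apply a per-mesh artist bias, and pick the first LOD whose threshold is still met.

    #include <cmath>
    #include <cstddef>

    struct LodMesh
    {
        // Switch to this LOD when the projected average triangle is at least this many pixels^2.
        float minAvgTriangleAreaPx;
    };

    // Hypothetical sketch: choose an LOD index from the average triangle area of the
    // highest-detail mesh, projected into screen space, plus an artist-tweakable bias.
    std::size_t SelectLod(const LodMesh* lods, std::size_t lodCount,
                          float avgTriangleAreaWorld,         // average triangle area of LOD0, in world units^2
                          float distanceToCamera,             // distance from camera to mesh
                          float pixelsPerMetreAtUnitDistance, // derived from FOV and resolution
                          float artistBias)                   // >1 favours higher detail, <1 favours lower
    {
        // Linear size scales with 1/distance, so area scales with 1/distance^2.
        const float pxPerMetre = pixelsPerMetreAtUnitDistance / std::fmax(distanceToCamera, 0.001f);
        const float avgTriangleAreaPx = avgTriangleAreaWorld * pxPerMetre * pxPerMetre * artistBias;

        // LODs are ordered from most to least detailed; take the first one whose
        // threshold the projected triangle size still satisfies.
        for (std::size_t i = 0; i + 1 < lodCount; ++i)
        {
            if (avgTriangleAreaPx >= lods[i].minAvgTriangleAreaPx)
                return i;
        }
        return lodCount - 1; // fall back to the coarsest LOD
    }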

Neil Massam

Our vehicles have to adhere to very strict polygon, texture and memory budgets which are set at different levels for each vehicle class. The vehicles must also pass stringent validation checks prior to export to ensure that they are free from any technical defects and are compliant with the game and its enforced budgets.

LOD meshes are manually created for all vehicles and used in conjunction with material LODs, which we can dynamically adjust to balance the frame rate in-game depending on distance or vehicle usage.

LOD bias values are also assigned to various part groups of the vehicles to force them to switch LODs earlier or later depending on their relative size - technical components, bodywork and so on. The vehicles also use a shared pool of textures and a library of preset materials, which help us to manage and reduce the overall memory footprint of a typical vehicle.

We had to dig much deeper on MotorStorm Apocalypse due to the performance demands of the game, and were forced to find solutions to help improve frame rate, especially on some of the tracks heavy with animated events and VFX.

We decided to introduce occlusion shapes on the vehicles for the first time, which were used to occlude components within the vehicle that would normally be hidden from view but were still being rendered - the engine, suspension components and so on. Vehicle shadow proxy models (lower-poly meshes) were also created to reduce the cost of rendering real-time shadows.
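
As a rough illustration of those two savings (hypothetical structures and names, not the studio's actual code), the vehicle submission might skip internals flagged as occluded and send only the cheap proxy mesh to the shadow pass. In practice the occlusion test would depend on the camera and the occluder shapes; a static flag keeps the sketch short.

    #include <vector>

    struct Mesh { /* vertex/index buffers etc. */ };

    struct VehiclePart
    {
        Mesh detailMesh;
        bool occludedByShell; // true for internals (engine, suspension) hidden by an occlusion shape
    };

    struct Vehicle
    {
        std::vector<VehiclePart> parts;
        Mesh shadowProxy;     // single lower-poly mesh used for real-time shadow rendering
    };

    void DrawMesh(const Mesh&) { /* submit to the GPU */ }

    // Hypothetical sketch: internals are skipped in the main view, and shadow maps
    // only ever see the low-poly proxy.
    void SubmitVehicle(const Vehicle& v, bool shadowPass)
    {
        if (shadowPass)
        {
            DrawMesh(v.shadowProxy);
            return;
        }
        for (const VehiclePart& part : v.parts)
        {
            if (part.occludedByShell)
                continue; // normally hidden from view, so don't pay to render it
            DrawMesh(part.detailMesh);
        }
    }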

Finally, we had in-game GUI performance statistics to help us quickly assess the GPU and CPU cost of vehicles or worlds that were proving expensive to render, which helped us identify and fix any problems.

Digital Foundry: Weather effects are a new addition to MotorStorm. What did you set out to achieve with this, how is gameplay affected by changing conditions, and what were the technical challenges involved?
Matt Southern

We had a lot of requests to add weather effects, and decided to focus them on a couple of tracks to emphasise their impact. Gameplay-wise they add a force to the vehicles, albeit a very subtle one (we found it could be really frustrating), and they also mean the engine continually cools - which in gameplay terms means you almost constantly have the ability to boost. As a result the tracks are driven at notable speed, which increases the sense of mania.
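
A toy version of that cooling rule - purely illustrative numbers and names, not Evolution's actual tuning - shows why boost becomes almost constantly available in a storm: rain adds a steady cooling term to the heat update, so the engine rarely approaches its overheat threshold.

    // Illustrative boost-heat model: values and names are made up for the example.
    struct BoostState
    {
        float heat = 0.0f; // 0 = cold, 1 = engine blows up
    };

    void UpdateBoost(BoostState& s, bool boosting, bool raining, float dt)
    {
        const float heatRate    = 0.35f; // heat gained per second while boosting
        const float baseCooling = 0.10f; // passive cooling per second
        const float rainCooling = 0.30f; // extra cooling per second in a storm

        float delta = (boosting ? heatRate : 0.0f) - baseCooling - (raining ? rainCooling : 0.0f);
        s.heat += delta * dt;
        if (s.heat < 0.0f) s.heat = 0.0f;
        // With rain cooling active, net heat gain while boosting is small,
        // so the player can keep the boost button held almost constantly.
    }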

Andy Seymour

Our weather effects are highly visual and mood-affecting. We set out to achieve ferocity in our weather to truly immerse the gamer. Technically this was challenging because we were already maxing out the GPU, so we had to work on shader tricks to give the impression of storms.

Digital Foundry: Let's talk for a moment about physics - we understand you worked with Havok pretty closely in moving physics across to the SPUs with Pacific Rift. Did this tech remain constant or was it improved still further for Apocalypse?
Dave Kirk

Yes, we've worked closely with Havok since the first MotorStorm. On Pacific Rift there were significant performance gains over previous versions, thanks mainly to increased SPU usage, so by the time we were working on Apocalypse the big performance gains had already been made. However, the lessons we learned from the early products enabled us to really streamline our code. This meant we could squeeze even more tech in, such as our groups of NPCs, as well as significantly ramping up the number of dynamic objects and the amount of destruction.

Digital Foundry: There have been several different approaches to HDR lighting, ranging from LogLuv to RGBM - do you operate with HDR in Apocalypse, and if so, what format did you go with?
Oli Wright

The exposure value for the frame is pushed forwards from the results of the previous frame into our material rendering pass, so the 0 to 1 range of values that we get out of that pass is 'post exposure'. Our material pass outputs two LDR render targets though. The first is the regular 0 to 1 range. The second is the same output divided by 32. So the second render target covers a much wider range of light intensities. This buffer isn't seen directly, but it's used as the key for our bloom, and it's used to calculate the exposure value for the next frame.


This approach has the same bandwidth requirements as using a single FP16 render target when we're laying down the buffer. This isn't generally a bottleneck for us, so that's not a problem. The benefits of not being FP16 come when we use the buffers as textures. The post processing doesn't care about things outside the 0 to 1 luminosity range, so it reads from the first buffer. The bloom and exposure metering don't care about high detail dynamic range, but they want a wide dynamic range - so they read from the second buffer. So both users of the rendered image get the information they need, but at 32-bit LDR bandwidth cost.
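
As a sketch of the arithmetic (written as plain C++ for clarity - in the game this would sit in the material pass pixel shader, and the names here are ours, not the engine's), the post-exposure colour is written out twice, once clamped to 0-1 and once divided by 32, so the second 8-bit target spans roughly 32x the intensity range for bloom and exposure metering to read back.

    #include <algorithm>

    struct Rgb { float r, g, b; };

    static float Saturate(float x) { return std::min(std::max(x, 0.0f), 1.0f); }

    // Hypothetical sketch of the material pass outputs. 'postExposure' is the lit
    // colour already multiplied by the previous frame's exposure value.
    void WriteMaterialOutputs(const Rgb& postExposure, Rgb& target0, Rgb& target1)
    {
        // Target 0: the regular 0-1 image, consumed by post-processing.
        target0 = { Saturate(postExposure.r), Saturate(postExposure.g), Saturate(postExposure.b) };

        // Target 1: the same colour divided by 32, so an 8-bit channel now covers a
        // much wider range of intensities. Bloom and next frame's exposure metering
        // read this buffer and multiply by 32 to recover scene intensity.
        const float kRange = 1.0f / 32.0f;
        target1 = { Saturate(postExposure.r * kRange),
                    Saturate(postExposure.g * kRange),
                    Saturate(postExposure.b * kRange) };
    }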

Digital Foundry: Not much is known about the basic set-up of your renderer - are you using a traditional forward renderer or have you adopted a deferred approach, as seen in the likes of Uncharted 2 and the Killzone games?
Oli Wright

MotorStorm and Pacific Rift were traditional forward renderers. Apocalypse is a semi-deferred light pre-pass renderer. We first render the normals, then we accumulate lighting into an FP16 buffer, then we do a final 'material' pass to produce the image that then goes off for post processing.
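
In outline, a light pre-pass frame along those lines could be structured like this - the function names are placeholders for the purposes of the sketch, not the engine's API:

    // Hypothetical outline of a light pre-pass ('semi-deferred') frame.
    // Each step is a stub here; the comments describe what the real pass would do.
    void RenderNormalsAndDepth()  { /* geometry pass: write per-pixel normals and depth */ }
    void AccumulateLightingFP16() { /* per light, accumulate lighting into an FP16 buffer using normals/depth */ }
    void RenderMaterials()        { /* draw geometry again, combining accumulated lighting with material textures */ }
    void PostProcess()            { /* bloom, exposure and the rest of the post chain */ }

    void RenderFrame()
    {
        RenderNormalsAndDepth();   // 1. normals (and depth) first
        AccumulateLightingFP16();  // 2. lighting accumulated into FP16, independent of materials
        RenderMaterials();         // 3. final 'material' pass produces the image for post-processing
        PostProcess();
    }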

It's a bit difficult to show 3D on a 2D website, but here's some '2D plus depth' - left eye view to the left and an anaglyph conversion of the full 3D screenshot on the right.
Digital Foundry: Evolution Studios has been at the forefront of supporting stereoscopic 3D, and the team has often talked about the best results being attained by building the engine up around 3D capabilities - at the most basic technical level, how did you go about integrating 3D into MotorStorm Apocalypse?
Oli Wright

Stereoscopic 3D renderers can be broadly classified as either 'reprojection' or 'draw everything twice'. In Apocalypse we 'draw everything twice'. Nearly everything anyway. We try to share as much processing as we can between eyes. For CPU that's largely the scene processing and object culling phases. For the RSX it's essentially our shadow map rendering.
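
A simplified frame loop in that style - hypothetical structure and names, not the actual engine - shares the culling results and the shadow maps, then issues the remaining rendering once per eye:

    #include <vector>

    struct Camera   { /* view/projection for one eye (or a combined view) */ };
    struct DrawItem { /* mesh + material reference */ };

    // Stubs for the per-frame work; the comments describe what each would do.
    std::vector<DrawItem> CullScene(const Camera&) { return {}; } // shared: scene processing + culling
    void RenderShadowMaps(const std::vector<DrawItem>&) {}        // shared: shadows are eye-independent
    void RenderEye(const Camera&, const std::vector<DrawItem>&) {} // per-eye geometry and post

    void RenderStereoFrame(const Camera& centreCam, const Camera& leftEye, const Camera& rightEye)
    {
        // CPU work shared between eyes: one cull against a camera covering both views.
        std::vector<DrawItem> visible = CullScene(centreCam);

        // GPU work shared between eyes: shadow maps don't depend on the eye position.
        RenderShadowMaps(visible);

        // 'Draw everything twice': the remaining passes are issued once per eye.
        RenderEye(leftEye, visible);
        RenderEye(rightEye, visible);
    }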

One area that we try to be very careful with is dynamically adjusting the interaxial to prevent window violations. Also you won't find any frame tearing when Apocalypse is running in 3D. Those two things are incredibly important for having a 3D experience that is comfortable and easy to view.
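
The interaxial adjustment can be thought of as a clamp driven by the nearest visible depth: as something approaches the camera, the eye separation is reduced so that near geometry doesn't get cut off by the screen edges. A toy version, with our own names and numbers rather than the game's:

    #include <algorithm>

    // Illustrative only: shrink the eye separation as the nearest visible object
    // approaches the camera, so near geometry never pops far enough out of the
    // screen to collide with the frame edges (a 'window violation').
    float ComputeInteraxial(float nearestVisibleDepth, // distance to the closest thing on screen
                            float maxInteraxial,       // comfortable separation for distant scenes
                            float safeDepth)           // depth at/above which full separation is fine
    {
        float t = std::min(std::max(nearestVisibleDepth / safeDepth, 0.0f), 1.0f);
        return maxInteraxial * t; // fades towards zero separation as objects get very close
    }

In practice a value like this would likely also be smoothed over time, so the depth effect doesn't visibly pop from one frame to the next.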

Digital Foundry: Feedback on the 3D experience in Apocalypse has been universally positive - what did you want the player to get out of the game?
Matt Southern

We know how common it is to accuse 3D of being a gimmick, and when you make games you have the golden opportunity to disprove this by affecting gameplay, not just visuals, which of course are all a movie or broadcast can offer. Aside from the obvious fact that our concept proposition lends itself beautifully to forward-rushing spectacle, we tried to make the 3D contribute to the sense of vertigo, and to the subconscious ability to judge the player vehicle in relation to the AI, tracks and obstacles - essentially to make judging a racing line more instinctive and have 3D contribute to the vital sensation of 'flow'.

Digital Foundry: There's a lot of discussion about the technological issues in incorporating 3D, but art direction has just as much of a part to play - how did you approach this aspect of MotorStorm Apocalypse?
Simon O'Brien

As we implemented the 3D, we began to maximise the elements that we discovered added most impact to the stereoscopic experience. For instance, small particles such as fire embers and airborne debris were suddenly the heroes of the scene, especially when seen in some density, allowing them to be perceived with a new volume and depth beyond the 2D version.

We also needed to consider what methods we would employ to avoid the pitfalls of 3D, such as negative parallax and frame violations, whilst taking the 3D effect to its limit. Mostly this came down to tuning camera parameters and working out how best to frame the action to sidestep these issues. As a result of taking time over every vehicle camera, scene camera and even the frontend GUI, we feel that it really is one of the strongest examples of stereoscopic presentation available today.

Richard Leadbetter: Rich has been a games journalist since the days of 16-bit and specialises in technical analysis. He's commonly known around Eurogamer as the Blacksmith of the Future.