The Alpha to Beta Push – Condemned 2
Senior Producer Dave Hasle, Monolith Productions
I work with a Primary Lead Artist, Matthew Allen, who is pretty much a hardcore fire-fighter for anything content-related. When an issue pops up - technical, pipeline, aesthetic, whatever - Matt will jump on it and put it out quickly and efficiently. As a Producer, I don't close out games at Monolith unless I've got Matt Allen helping keep it all under control and fitting in memory.
One of our issues was the production schedule for cinematics. We had a certain quality bar we wanted to hit, but the schedule wasn't allowing for it. In stepped Matt to establish the pipeline and help render out the final frames. Of course, our cinematics could not have been done without the dedication of Rocky Newton, our cinematic director, and Nick Kondo, our cinematic guru.
Matthew Allen, Primary Lead Artist: My job has covered a whole range of things, but my major focus over the last three months of the project has been getting the final cut-scenes into the game.
One of the things we identified as needing a lot of work at the end of the first Condemned game was the cut-scenes. We had spent a good amount of time developing a new process to get them into the game with consistent movement and camera animation, all controlled by the animators. However, as it was our first real console title, we ran into a couple of issues, the biggest two being memory and frame rate. Because we were working with a limited amount of memory, and the cut-scenes needed to be rendered in real time in the same streaming region as the rest of the game, we were forced to cut back on both texture resolution and animation fidelity. This caused some of the camera cuts and positions, which the animators chose based on how good the models looked in Maya, to end up looking very pixelated and 'last gen'. Frame rate also became an issue. Since all of our lighting and shadows are dynamic, generated in real time, our frame rate is greatly affected by the number of lights in a scene, so we were severely limited by that factor as well.
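To make that constraint concrete: with fully dynamic lighting, each shadow-casting light adds its own rendering work every frame, so per-frame cost grows roughly linearly with the light count. Here is a back-of-envelope sketch in C++ - every millisecond cost in it is an invented placeholder, not a measured number from our engine - showing how quickly that eats a 30 fps budget:

    // Back-of-envelope sketch: why dynamic lights bound the frame rate.
    // Every cost below is an invented placeholder; the point is only that
    // fully dynamic lighting makes frame cost grow with the light count.
    #include <cstdio>

    int main() {
        const double budgetMs   = 33.3; // one frame at 30 fps
        const double basePassMs = 14.0; // hypothetical geometry + post cost
        const double perLightMs = 1.5;  // hypothetical shadow + lighting cost

        for (int lights = 4; lights <= 40; lights += 12) {
            double frameMs = basePassMs + lights * perLightMs;
            std::printf("%2d lights -> %5.1f ms (%s)\n", lights, frameMs,
                        frameMs <= budgetMs ? "fits 30 fps" : "over budget");
        }
        return 0;
    }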
Early in production on Condemned 2, our cinematic director, Rocky Newton, and I sat down and brainstormed possible solutions to these issues. For the greatest flexibility from an animation standpoint, we decided that we were going to pre-render all of our cut-scenes in Maya using Mental Ray. This solved a number of our original issues but created a few more - plus, for a number of folks it was disappointing. We all know that our engine can do some pretty amazing things, so there was some push for us to figure out a way to get the cinematics back in game.
Regardless of the final "how" of the rendering, Rocky Newton was able to move forward with storyboards and animatics. He used a very early draft of the script, along with pickup voice acting from the Yeti team and various Monolith employees. This was the first time we had ever done full animatics and storyboards for our game cinematics, and Rocky's unique drawing style was perfectly suited for the task. Very early in production the team was able to see a number of very high-quality timed animations, and the excitement level was high. At the same time, we began exploring Mental Ray and the pipeline for moving things back and forth between the game and Mental Ray. While we did get some excellent results, it was a very cumbersome process, and both the Mental Ray rendering time and the turnaround time for game assets became problematic.
Around the same time, a number of folks seemed to have the same "Eureka" moment: why not use the engine to render the cinematics offline? That way we wouldn't have to worry about frame rate, so we could add as many lights and as much shadow-casting geometry as we wanted, and since we were rendering on a beefy PC, we didn't have to worry about memory either. The extra memory allowed us to raise the resolution on all of the textures, so even close-ups on characters' eyes could look good. Our Lead Engineer, Brian Legge, worked with John O'Rorke, our Principal Software Engineer and Engine Architect, to implement this for us very quickly, and the results were very strong.
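The mechanics of rendering offline with the engine are simple, and that is much of the appeal. The sketch below - with a hypothetical renderFrame() standing in for the engine's actual scene render, and PPM output standing in for a real image or movie writer - shows the core idea: simulation time advances by a fixed step per frame, completely decoupled from how long each frame takes to render, so every quality setting can be maxed out:

    // Minimal sketch of an offline "render to image sequence" loop. The key
    // idea: simulation time advances by a fixed step per frame regardless of
    // how long each frame takes, so quality settings are unbounded.
    #include <cstddef>
    #include <cstdint>
    #include <cstdio>
    #include <vector>

    const int kWidth  = 1280;
    const int kHeight = 720;
    const double kFps = 30.0;  // playback rate of the final movie

    // Stand-in for the engine render: fills the buffer with a time-varying
    // gradient. A real implementation would evaluate animation, lights, and
    // shadows at time t with no per-frame time budget.
    void renderFrame(double t, std::vector<uint8_t>& rgb) {
        for (int y = 0; y < kHeight; ++y)
            for (int x = 0; x < kWidth; ++x) {
                std::size_t i = 3 * (std::size_t(y) * kWidth + x);
                rgb[i + 0] = uint8_t(x * 255 / kWidth);
                rgb[i + 1] = uint8_t(y * 255 / kHeight);
                rgb[i + 2] = uint8_t(int(t * 60.0) % 256);
            }
    }

    // Writes one frame as a binary PPM; a real pipeline would hand frames
    // to a movie encoder or write TGA/PNG for compositing.
    void writePPM(int frame, const std::vector<uint8_t>& rgb) {
        char name[64];
        std::snprintf(name, sizeof(name), "cine_%05d.ppm", frame);
        FILE* f = std::fopen(name, "wb");
        if (!f) return;
        std::fprintf(f, "P6\n%d %d\n255\n", kWidth, kHeight);
        std::fwrite(rgb.data(), 1, rgb.size(), f);
        std::fclose(f);
    }

    int main() {
        const int totalFrames = int(kFps * 10.0);  // a ten-second cut-scene
        std::vector<uint8_t> rgb(std::size_t(kWidth) * kHeight * 3);
        for (int frame = 0; frame < totalFrames; ++frame) {
            double t = frame / kFps;  // fixed step, decoupled from wall clock
            renderFrame(t, rgb);
            writePPM(frame, rgb);
        }
        return 0;
    }

Because nothing in the loop depends on wall-clock time, a frame is free to take seconds or even minutes, which is exactly what lets the light counts and effects described below go far past real-time limits.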
Since we were now using the engine to render the cinematics, I could use the latest in-game art assets for rendering, including effects. This freed up a lot of time in our schedule, which allowed us to focus on some of the cooler animation stuff. Most of my time could now be spent on lighting and tweaking effects rather than moving things into Mental Ray and trying to replicate our in-game shaders. We could also do a bunch of things that the engine can handle but that modern consoles can't - like turning up the real-time, velocity-based motion blur and running with 30 to 40 lights per scene. We can also use a ton of our in-game special effects together - things like depth of field, full-screen color mapping, film grain, sharpen, and a whole host of others - that the consoles aren't powerful enough to run together in real time.
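One way to picture that stacking is as an ordered list of full-screen passes run over each finished frame. This is only an illustrative sketch - the pass names mirror the effects mentioned above, but the bodies are trivial placeholders, not our actual shaders:

    // Illustrative post-process chain: each pass reads and rewrites the
    // whole frame. Offline, every pass can be enabled at once; in real time
    // the chain must be trimmed until the frame fits its time budget.
    #include <cstddef>
    #include <cstdint>
    #include <functional>
    #include <vector>

    struct Image { int w, h; std::vector<uint8_t> rgb; };

    // Placeholder pass bodies; real ones would be full-screen shader passes.
    void colorMap(Image& img)  { for (auto& c : img.rgb) c = uint8_t(255 - c); }
    void filmGrain(Image& img) {
        for (std::size_t i = 0; i < img.rgb.size(); i += 97) img.rgb[i] ^= 8;
    }

    int main() {
        Image frame{1280, 720, std::vector<uint8_t>(1280u * 720u * 3u, 128)};

        // Offline the chain can hold everything at once: depth of field,
        // color mapping, film grain, sharpen, and so on.
        std::vector<std::function<void(Image&)>> chain = { colorMap, filmGrain };
        for (auto& pass : chain) pass(frame);
        return 0;
    }

Running the passes in a fixed order over the finished frame is the standard pattern; offline, the only price for a longer chain is more render time per frame.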
There are still a number of workflow issues that we will need to solve for next time, but for the most part our experiment has been a resounding success, and it has allowed us to produce some of the highest-quality in-game cut-scenes I have personally ever seen. It has been very exciting to watch this whole process unfold and to bring to life all of Rocky's initial storyboards, while highlighting the tremendous talent of both our artists and our engineers here at Monolith.