Light, not bytes: The evolution of the pipeline
Development infrastructure is changing. And it starts with pushing less data, not more, suggests Escape Technology
Over in the consumer space, the notion of stuffing a hard drive with files has become rather antiquated.
By default, music, movies and games are streamed in many homes, while everyday desktop software is gradually being superseded by web-based offerings like Google Docs.
And yet, at many games developers that confidently stake a flag in the cutting edge, development pipelines are still housed in on-premise hardware, stitched together from middleware, software and other technologies. They thunder away, handling files and assets while pushing data to workstations, at a time when even the once-futuristic gigabyte is seen as a trivial volume of data.
"For the incoming generation of pipelines, we are seeing a shift from moving bytes to moving light," says Lee Danskin, Chief Technology Officer at Escape Technology, a reseller that, among other things, helps studios develop and build pipeline infrastructure.
"We measure data in gigabytes and not megabytes these days, and have done for some time," Danskin continues. "If you look at networking over the last 30 years, every ten years we've moved from one [megabyte] to ten to 100, and then to one gig, and now we're at the end of the one gig cycle."
That brings us to a fork in the path of game technology's evolution, where developers must decide which way to turn when it comes to handling data in pipelines. Where once creative memory management - squeezing all one could from a single kilobyte - dominated studios' attention, today two approaches to the data challenge co-exist.
"We're now at a point where we can go with 10 gig as the standard, but there'll need to be a lot of performance there, and a lot of work updating pipeline infrastructure for a building itself to accept it," Danskin offers. "Equally, the performance of individual machines is starting to far exceed what you can do across most networks. With SSD and NVMe drives now, the core performance you can get from a single machine is almost as fast as you really want it to be, at least within a local machine.
"We've got machines that outperform the networks connecting them. It's like game developers have Ferraris under their desks, with a bit of string coming out the back"
"So we've got machines that outperform the networks connecting them. It's like game developers have Ferraris under their desks, with a bit of string coming out the back. That 'string' is a Cat5 or Cat6 cable, and it's just not enough these days."
The alternative option is forgoing building pipelines that can handle contemporary data volumes and computing power, and instead looking at the challenge from an entirely distinct perspective.
"Here we all started talking about how little we can move through a network, rather than how much a pipeline can handle," reveals Danskin, highlighting a logical inversion of the history of pushing pipeline capacity. "That means moving pixels to screens, instead moving of bytes, simply put."
That is increasingly possible thanks to the rise and rise of optically-based fibre connections. But how would such a networked pipeline look in a hypothetical large-scale, single-location game studio?
"Using a data centre, or a rack or machine room on premise, you'll essentially have desks with just a screen, keyboard and mouse, with pixels - so to speak - being sent to them from that hub. And the logical extension to that is that what is possible from a data centre is possible from the cloud."
That puts high-end, powerful networked pipelines within the reach of small, scattered and satellite teams, as well as the world's largest outfits. And it's already happening on a near daily basis, says Danskin.
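For a rough sense of how little data needs to travel when pixels rather than assets are sent to a desk - again illustrative figures, not ones quoted by Danskin - compare an uncompressed desktop stream with what remote-display protocols typically need:

```python
# Rough, illustrative figures for streaming a 1080p desktop at 60 frames per second.
width, height, bytes_per_pixel, fps = 1920, 1080, 3, 60

uncompressed = width * height * bytes_per_pixel * fps        # bytes per second
print(f"Uncompressed: {uncompressed * 8 / 1e9:.1f} Gbit/s")  # roughly 3 Gbit/s

# Remote-display protocols compress this heavily; a few tens of megabits per second
# is a common working assumption, which fits comfortably within modern fibre links,
# whatever the size of the scene data sitting in the data centre behind it.
assumed_compressed_mbits = 30
print(f"Compressed stream (assumed): ~{assumed_compressed_mbits} Mbit/s")
```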
As this new generation of pipeline emerges into game development's technological mainstream, the CTO is convinced that the most significant factor in whether devs go with the cloud or a local rack will be the simple matter of how different studios feel about data security and ownership. Cost will be a factor for some too, of course, as with almost any emerging technology.
Significantly, part of the appeal of such pipelines is that it is almost irrelevant where the networked solutions moving light to screens are actually housed, so the on-premise or off-site decision is left entirely to studios' preferences. There are strengths and weaknesses to the different options, but it almost doesn't matter whether you are connecting to a rack down the corridor or remotely to a data centre.
Danskin's vision for a new standard in pipelines isn't limited to the way they handle data, however. Over time, he sees that what pipelines actually are - and what they handle - will morph and adapt.
"The computing world is almost split in two. There are the cloud guys, and the traditional IT mindset. That division in thinking has got to change"
"The whole file-based workflow may become less important in time," the CTO suggests. "How we think about block and object storage might be set to change in that way. At the moment the computing world is almost split in two. There are the cloud guys with their big data sets and everything else, and then there is the traditional IT mindset of working file-based on premise. That distinction or division in thinking has got to change."
Danskin predicts a rise of software that is object and block aware in terms of storage, as well as working intelligently directly within the cloud.
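What 'object aware' might look like in practice is software that fetches assets straight from object storage rather than from a mounted file server. A minimal sketch, using the AWS SDK for Python with a hypothetical bucket and asset key rather than anything specific to Escape Technology:

```python
import boto3

# Minimal sketch: fetch a texture directly from object storage rather than
# opening a path on a mounted file server. Bucket and key names are hypothetical.
s3 = boto3.client("s3")

response = s3.get_object(Bucket="studio-assets", Key="textures/hero_diffuse.exr")
texture_bytes = response["Body"].read()

# The same tool working file-based on premise would instead do something like:
#   with open("/mnt/fileserver/textures/hero_diffuse.exr", "rb") as f:
#       texture_bytes = f.read()
```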
"That's the future, and we'll get there," Danskin asserts.
As with any advancing or emerging game development technology, only one thing really matters. If new pipeline technology doesn't make a difference to the way and speed with which games are created, then for studios the value of an infrastructural rebuild is somewhat moot. There may, in time, be a cost-saving advantage to switching from bytes to pixels, but the economies of scale involved have yet to make such solutions an obvious choice for thinly stretched budgets.
"There's all kinds of advantages that this approach brings about, though," Danskin states. "You can start a new studio without having to buy lots of equipment to get up and running. You can run things straight out of the cloud too. For people working from home, or working remotely, dialling up compute if you're ever baking textures - or compiling multiple versions of the same game - spinning up nodes in the cloud just gives you more options. Render farms are more available to people doing any sort of animation work now, and that allows them to have a dynamically scalable system. Suddenly, studios aren't limited by the physical equipment they have."
Of course, developers of every kind are already enjoying the benefits of collaborative creativity through cloud-based offerings. From the various elements of Microsoft's integrated development environment Visual Studio to the Amazon Web Services cloud-computing offering, there are many ways to harness that potential.
A true, complete pipeline in the cloud or on a single rack, however, is emerging as a quietly revolutionary advance, at least in terms of the way developers work, and how studios are structured.
"The days of developing at a desktop PC exclusively are soon to be long gone. People want to be able to move between laptops and tablets and even hybrid 'phablet' phones"
"The days of developing at a desktop PC exclusively are soon to be long gone," Danskin predicts. "People want to be able to move between laptops and tablets and even hybrid 'phablet' phones. You could even say the laptop has started to lose out already. But equally, today developers want compute power, and they want the storage and the data. Still, they don't always want to carry a PC everywhere they go."
If that vision of true, complete, powerful pipelines that can be accessed through any combination of screen and inputs becomes a reality, the idea of satellite studio structuring may only be the beginning of games development's unshackling from physical infrastructure, location and workstation power.
But Danskin accepts there are challenges to the concept. Some critics voice concerns about internet dropouts impeding workflow, but there the CTO suggests that, for any serious studio, a robust internet connection is today akin to a reliable electricity supply. Routers do, inevitably, go down more often than mains power, but criticising a technology for the failings of a base technology that is still improving is perhaps pragmatic, yet hardly the most progressively minded approach.
"The biggest Achilles' heel, though, is the existing issue of data security around these pipelines"
"The biggest Achilles' heel with it, though, is the existing issue of data security around these pipelines," Danskin recognises. "Some developers will always think about the trust value of their data being in somebody else's hands, or in somebody else's control. That's the biggest element of risk or worry or whatever it might be. Not having all your eggs in one basket isn't for everyone."
Many, though, are likely to find that the benefits outweigh that issue of trust. The approach is flexible too: vital if it is to meet the needs of the wild diversity of developers that exist today. An on-premises arrangement will most probably be the best option for smaller or prototyping teams, while a hybrid of local servers and the cloud will better serve larger teams.
"And we have what we call the 'born in the cloud' teams who really don't want any hardware, and want their work to exist in the cloud in all its glory," Danskin confirms.
Pipelines of that 'born in the cloud' type are starting to emerge as a complete solution for some teams already, though in the immediate future Danskin and his colleagues are more likely to develop and implement on-premises pipelines of the kind that put light before bytes.
"These kind of pipelines are becoming everyday for us at Escape Technology's already. This isn't something coming in the future. It's today, and it's becoming the norm, and what is possible here is accelerating all the time."
It might be that you are already working through an on-premises pipeline of this kind. If you are, that is looking to be but a taster of the future.
And for those still sat at a desk beside hardware with its own computing power, today's desk spaces may soon be something of a nostalgic concept.
Pipelines aren't going anywhere for a long time yet, but the form they take is undergoing a tantalising change.