Tech Focus: The Legacy of id Software
Digital Foundry on how the Rage creators shaped today's gaming technologies
"You can choose to design a game around the specs of a high-end PC and make console versions that fail to hit the design point, or design around the specs of the consoles and have a high-end PC provide incremental quality improvements," says id software's John Carmack. "We chose the latter."
Opting for console as the target platform makes perfect commercial sense in the current climate, but it is something of a shame that the heyday of id software in driving adoption of new gaming technology has now finally, conclusively, come to a close.
id's heyday as the driving force of both games hardware and 3D rendering is over, but its legacy lives on
With Wolfenstein 3D and Doom, id drove the PC platform as the home for new advances in software rendering, but it was the development of Quake that truly changed the world of gaming from a hardware perspective. The concept of games machines based on a CPU and GPU working together arguably only took off owing to the work of John Carmack, who recognised the inherent potential of 3D hardware acceleration and exploited it beautifully in custom versions of Quake.
The most popular PC game of its era, Quake initially shipped with a standard software renderer with no inherent 3D acceleration at all. However, months later as 1996 drew to a close, Carmack and id released VQuake - a version of the game enhanced for Rendition's Vérité chipset, made popular via Creative Labs' 3D Blaster PCI product.
This allowed Quake to run at higher resolutions with 16-bit colour, anti-aliasing, per-polygon mip-mapping and bilinear filtering, massively enhancing the look of the game. Carmack also added new dynamic lighting effects, significantly boosting its visual appeal. The age of the CPU/GPU combination had dawned.
However, while Carmack's enthusiasm for 3D acceleration continued to rise, it's believed he grew exasperated with Rendition's proprietary API, concentrating his efforts instead on a new accelerated off-shoot - GLQuake - which utilised the power of the emerging OpenGL standard. A special driver was produced which allowed the subset of OpenGL used by the game to work with the new range of 3Dfx Voodoo Graphics cards - and the rest is history.
id's pioneering work shapes hardware design right up to the present day: the combination of CPU and GPU is now standard in all computers and consoles (from the PS2 era onwards), and pretty much every attempt to deviate from this basic design has met with failure. Intel's Larrabee architecture came to nothing, while Sony's initial scheme for Cell to cover both CPU and GPU roles in PS3 also proved fruitless, resulting in a hasty deal that saw an off-the-shelf NVIDIA part repurposed for inclusion in the console. Even AMD's APUs - all-in-one CPU/GPU packages - still follow the basic archetype established so long ago thanks to the quality of software from forward-looking developers, with id at the spearhead.
While it can be argued that the emergence of the graphics card was all but inevitable, it required a catalyst for mass adoption - and only the most popular PC game of the time could provide the required momentum. The die had been cast, and the new paradigm for gaming hardware design had arrived.
Having defined the modern FPS with Wolfenstein and Doom, id used Quake to introduce a number of new technologies that persist to this day. Support for TCP/IP networking allowed multiplayer gameplay over the internet based on the client/server model, but it was the release of QuakeWorld that changed everything. Internet play with Quake was great for those with high-speed connections, but back in 1996 barely anyone had anything that could be considered equivalent to a modern broadband connection. The client/server architecture meant that every input from the player was beamed to the host, only registering when the server sent back its response. With Quake operating at anything from 300ms to 500ms of latency over a dial-up connection, the impact on gameplay was substantial.
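It's worth quantifying what that round-trip wait costs. In a naive client that confirms every input with the server before showing the result, the latency figures above translate directly into frames of input lag - a rough sketch, assuming a 60fps display (not id's actual code):

```python
FRAME_TIME = 1.0 / 60.0  # seconds per frame at 60fps

def frames_of_lag(round_trip_ms):
    """Frames that elapse before a server-confirmed input appears on screen."""
    round_trip_s = round_trip_ms / 1000.0
    return round(round_trip_s / FRAME_TIME)

# A LAN-ish connection vs. the dial-up figures quoted in the text:
for rtt in (50, 300, 500):
    print(f"{rtt}ms round trip -> ~{frames_of_lag(rtt)} frames of input lag")
```

At 300ms, a keypress takes roughly 18 frames to register on screen; at 500ms, around 30 - which is why dial-up Quake felt so sluggish compared with local play.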
Carmack's next great gift to the gaming world was client-side prediction - a concept used in virtually every multiplayer action game today. The idea is that the client - the player-side code, essentially - doesn't need to wait for feedback from the server in order to move, shoot, or animate bullets and rockets. Instead the code operates semi-autonomously: once the client receives data on a projectile's location, speed and trajectory, it doesn't need constant updates from the server on where it's heading - its position can be calculated mathematically by the client, giving a real-time response independent of the host.
The accuracy of client-side prediction increases as latency to the host decreases, and since the widespread take-up of ADSL (which, in the UK at least, became a mainstream proposition three years post-Quake in 1999), the technology has helped facilitate near-seamless performance for services like Xbox LIVE and PlayStation Network. The pioneering work of id software is acknowledged by many, not least its competitor Epic Games in its current Unreal Engine development kit.