Tech Focus: MLAA heads for 360 and PC
How morphological anti-aliasing is transitioning across from PS3 to Xbox 360 and PC
It's difficult to say given that we are running on different platforms and configurations. On PC, we're quite fast. In fact, we're almost free on the mid-to-high GPU range (around 0.4ms on a GeForce GTX 295) and, as far as we know, the fastest approach given the maximum line length we are able to handle. On the Xbox 360 we run at 2.47ms, with plenty of possible optimisations still to try.
We can't speak for all the MLAA(-like) implementations out there, but we think our current version 1.6 (the one used for these comparisons) has raised the quality bar considerably. In our tests, it produces results on par with (when not superior to) CPU MLAA. One of our best features is that we are very conservative with the image: we only process where we are sure there is a perceptible edge, and version 1.6 does a pretty good job of searching for perceptible edges. This preserves maximum sharpness while still processing all the relevant jaggies.
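To illustrate the idea, here is a minimal sketch of what a conservative, luma-based edge-detection pass might look like. This is not the team's actual shader code; the Rec. 709 luma weights and the threshold value are assumptions chosen purely for illustration:

```python
import numpy as np

# Hypothetical perceptual threshold: luma deltas below this are assumed
# imperceptible and the pixel is left untouched (value is illustrative).
LUMA_THRESHOLD = 0.1

def luma(rgb):
    """Approximate perceived brightness from linear RGB (Rec. 709 weights)."""
    return rgb @ np.array([0.2126, 0.7152, 0.0722])

def detect_edges(image):
    """Return boolean masks of perceptible left and top edges per pixel.

    Only pixels whose luma differs from a neighbour by more than the
    threshold are flagged, so flat and subtly shaded regions keep their
    original sharpness.
    """
    l = luma(image)
    left = np.zeros_like(l, dtype=bool)
    top = np.zeros_like(l, dtype=bool)
    left[:, 1:] = np.abs(l[:, 1:] - l[:, :-1]) > LUMA_THRESHOLD
    top[1:, :] = np.abs(l[1:, :] - l[:-1, :]) > LUMA_THRESHOLD
    return left, top
```

In a real GPU implementation a pass like this would write the masks to a render target for the subsequent pattern-search and blending passes to consume; the point of the threshold is simply that everything else is left untouched.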
As with all anti-aliasing filters out there, if you are working at final display resolution (1x), pixel-popping is going to happen sooner or later. You can try attenuating or eliminating "spurious" pixels, but we think this is not the optimum solution, since it doesn't tackle the root of the problem: sub-sampling. As we said before, we're really conservative with the image, so we avoid introducing additional steps that don't always work and can negatively affect temporal coherence.
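To see why sub-sampling is the root of the problem, consider a feature thinner than a pixel drifting across the screen: with one point sample per pixel, its coverage can only ever be 0 or 1, so it pops in and out between frames. The sketch below (our illustration, with hypothetical feature sizes) reproduces the effect numerically:

```python
import numpy as np

def point_sample_coverage(feature_start, feature_width=0.5, num_pixels=8):
    """Point-sample a thin feature at pixel centres (x + 0.5).

    Returns 1 where a pixel centre happens to land inside the feature
    and 0 elsewhere; with a single sample there is no in-between value.
    """
    centres = np.arange(num_pixels) + 0.5
    inside = (centres >= feature_start) & (centres < feature_start + feature_width)
    return inside.astype(float)

# Move a half-pixel-wide feature in quarter-pixel steps: its on-screen
# footprint pops in and out of existence from one frame to the next.
for offset in np.arange(0.0, 2.0, 0.25):
    print(f"offset {offset:.2f}: {point_sample_coverage(offset)}")
```

No post-process filter can recover the coverage information that was never sampled, which is why a purely filter-based approach cannot fully fix this.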
We think the next step involves hybrid approaches combining MSAA (with low sample counts) with filter-based anti-aliasing techniques. This would offer a good trade-off between sub-pixel features, smooth gradients and low processing time.
This is a really interesting question. It's not that higher resolutions hide MLAA artifacts; rather, they hide aliasing itself, thanks to a higher sampling of the scene. At a sufficient resolution, anti-aliasing would not be required: your eyes would do it for you. Instead of discerning each pixel separately, the pixels would be so small that the visual system would average groups of them, yielding the same result as if anti-aliasing had been applied.
To demonstrate that, since we cannot make the pixels of our monitors smaller, what we can do instead is walk away from the screen. Take a look at this image: on the left you have the perfectly anti-aliased image, and on the right the same image at increased resolution, but without anti-aliasing. If you view the images from a distance you will not be able to tell one from the other; this is the same as making the pixels so small that you no longer discern them individually.
In the image on the left, the averaging is done by the computer; on the right, by your own eyes. This simple example also explains the usual anti-aliasing process; in the end, we just mimic nature. But note how far you have to walk away from the monitor: the resolution required to eliminate aliasing would need to be so high that it would not be a practical solution.
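The same experiment can be reproduced numerically: averaging an aliased, higher-resolution rendering down to display resolution recovers (approximately) the anti-aliased image, just as the eye does from a distance. A minimal sketch, using a hard-edged disc as a stand-in for a real renderer:

```python
import numpy as np

def render_binary(resolution):
    """'Render' a disc with hard 0/1 coverage: one point sample per pixel."""
    y, x = np.mgrid[0:resolution, 0:resolution]
    # Normalise pixel centres to [-0.5, 0.5) and test against an implicit circle.
    cx = (x + 0.5) / resolution - 0.5
    cy = (y + 0.5) / resolution - 0.5
    return (cx * cx + cy * cy < 0.16).astype(float)  # radius 0.4

def box_downsample(image, factor):
    """Average factor x factor blocks: what the eye does at a distance."""
    h, w = image.shape
    return image.reshape(h // factor, factor, w // factor, factor).mean(axis=(1, 3))

aliased_highres = render_binary(256)           # jaggy, 4x display resolution
averaged = box_downsample(aliased_highres, 4)  # ~anti-aliased 64x64 image
fractional = np.logical_and(averaged > 0, averaged < 1).sum()
print(f"{fractional} edge pixels now take fractional (averaged) values")
```

Rendering at 4x and box-averaging down is exactly supersampling, which is why it is too expensive to be practical, just as the interview notes for the walk-away experiment.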
We think the extensive usage of deferred shading seen in AAA games will ensure the continued evolution of filter-based anti-aliasing approaches. In fact, in the past year we have seen the birth of a whole lineup of techniques, which will be covered in our SIGGRAPH 2011 course, "Filtering Approaches for Real-Time Anti-Aliasing", and hopefully that will motivate further research in this direction. We believe the evolution will be towards combining the best ideas of each technique, maximising the pros while minimising the cons.
Furthermore, the more realistic a computer-generated image is, the more important it is to have near-perfect anti-aliasing. When you looked at a low-poly character five years ago, the graphics looked rather synthetic. You didn't care about aliasing, as there were bigger graphical problems to worry about. However, with current rendering advances, when you look at photorealistic game content, aliasing may reveal that the image is synthetic and not real. So, as the realism of graphics continues to evolve, high-quality anti-aliasing will only grow in importance. We won't want jaggies to destroy the illusion created by a perfectly animated and rendered character, revealing that it is, in truth, just a bunch of vertices smartly put together.