Updated: CryEngine 3 to Lack Anti-Aliasing? Maybe
So there I was, spending my peaceful evening wandering around the internet, when I came across an interview with Carl Jones from Crytek about the upcoming CryENGINE 3, which will power the next generation of Crysis games on PC, PlayStation 3 and Xbox 360. Like any other PC enthusiast, I am very excited about the prospect of a new Crysis game with even better graphics than the first, one that will bring current and future generations of hardware to their knees. So I promptly jumped over to PC Play, and while reading through the interview I came across a very interesting answer.
When asked about what new technologies were implemented in CryENGINE 3, Mr Jones answered:
We have implemented more multi-core support, streaming, a new rendering technique: Deferred Lighting, which allows for a huge number of dynamic lights in a scene, without requiring a high amount of processing power, we have real-time global illumination in the engine, plus a host of smaller, yet important improvements to all our technologies and features.
Pay close attention to the phrase “Deferred Lighting” above. As soon as I read that, BAAM, it hit me. I thought to myself: wasn’t this “Deferred Lighting or Shading” the reason behind the lack of Anti-Aliasing (something almost expected from every PC game these days) in S.T.A.L.K.E.R.: Shadow of Chernobyl and Dead Space? Now, I’m no game coder, so I don’t know what this technique is or how it works. So I did what a normal end-user does when things don’t work as expected: I first looked up the technique on Wikipedia, and then searched Google for something concrete, ideally from a game developer, on why Anti-Aliasing is not supported with “Deferred Lighting”.
Here is what Wikipedia has to say:
In computer graphics, deferred shading is a three dimensional shading technique in which the result of a shading algorithm is calculated by dividing it into smaller parts that are written to intermediate buffer storage to be combined later, instead of immediately writing the shader result to the color framebuffer. Implementations on modern hardware tend to use multiple render targets (MRT) to avoid redundant vertex transformations. Usually once all the needed buffers are built they are then read (usually as input textures) from a shading algorithm (for example a lighting equation) and combined to produce the final result. In this way the computation and memory bandwidth required to shade a scene is reduced to those visible portions, thereby reducing the shaded depth complexity.
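To make that a little more concrete, here is a minimal, purely illustrative CPU-side sketch in C++ of the two-pass idea the quote describes: a geometry pass that only records surface attributes into an intermediate “G-buffer”, and a separate lighting pass that reads those attributes back and shades each visible pixel. None of the names here come from CryENGINE or any real engine; real renderers do this on the GPU using multiple render targets.

    // Minimal sketch of deferred shading: pass 1 records surface attributes,
    // pass 2 evaluates lighting per visible pixel from the stored attributes.
    #include <array>
    #include <cmath>
    #include <cstdio>

    struct Vec3 { float x, y, z; };

    static float dot(Vec3 a, Vec3 b) { return a.x * b.x + a.y * b.y + a.z * b.z; }

    // One texel of the intermediate buffer (real engines pack this into MRTs).
    struct GBufferTexel {
        Vec3  albedo;   // diffuse colour
        Vec3  normal;   // surface normal
        float depth;    // distance from the camera
    };

    constexpr int W = 4, H = 4;                // tiny "screen" for illustration
    std::array<GBufferTexel, W * H> gbuffer;   // intermediate storage
    std::array<Vec3, W * H> framebuffer;       // final colour

    // Pass 1: rasterise geometry, but only record attributes (no lighting yet).
    void geometryPass() {
        for (auto& t : gbuffer)
            t = { {0.8f, 0.4f, 0.2f}, {0.0f, 0.0f, 1.0f}, 1.0f };
    }

    // Pass 2: evaluate the lighting equation using only what the G-buffer
    // stored. Cost scales with visible pixels and lights, not scene geometry.
    void lightingPass(Vec3 lightDir) {
        for (int i = 0; i < W * H; ++i) {
            const GBufferTexel& t = gbuffer[i];
            float ndotl = std::fmax(0.0f, dot(t.normal, lightDir));
            framebuffer[i] = { t.albedo.x * ndotl, t.albedo.y * ndotl, t.albedo.z * ndotl };
        }
    }

    int main() {
        geometryPass();
        lightingPass({0.0f, 0.0f, 1.0f});
        std::printf("pixel 0 = %.2f %.2f %.2f\n",
                    framebuffer[0].x, framebuffer[0].y, framebuffer[0].z);
    }

The important part for this story is that lighting is applied per screen pixel long after the original triangles are gone, which is exactly why the hardware’s usual Anti-Aliasing step runs into trouble, as the next quote explains.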
Don’t worry if you are overwhelmed by all the technicality above; the quote below is what confirmed my suspicion. Wikipedia states:
Another rather important disadvantage is that, due to separating the lighting stage from the geometric stage, hardware anti alias does not produce correct results any more: although the first pass used when rendering the basic properties (diffuse, normal etc.) can use anti alias, it’s not until full lighting has been applied that anti alias is needed. One of the usual techniques to overcome this limitation is using edge detection on the final image and then applying blur over the edges.
This means that when “Deferred Shading” is used, traditional hardware-assisted Anti-Aliasing does not work. As a workaround, a blur is applied to the jagged edges in the image.
The next piece of evidence comes from a book published by Nvidia on their developer site, titled “GPU Gems 2: Programming Techniques for High-Performance Graphics and General-Purpose Computation”. In “Chapter 9. Deferred Shading in S.T.A.L.K.E.R.”, Oles Shishkovtsov from GSC Game World explains:
A deferred renderer is just incompatible with current hardware-assisted antialiasing, unfortunately (Hargreaves and Harris 2004). Thus, antialiasing becomes solely the responsibility of the application and the shader; we cannot rely on the GPU alone. Because aliasing itself arises from the mismatched frequencies of the source signal and of the destination discrete representation, a good approximation of an antialiasing filter is just a low-pass filter, which is simple blurring.
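To illustrate the workaround both of these quotes describe, here is a small, hedged C++ sketch of the edge-detect-and-blur idea. It is not the S.T.A.L.K.E.R. or Dead Space implementation: real games run this on the GPU as a post-process shader and usually detect edges from depth and normal discontinuities, whereas this toy version uses simple luminance contrast just to keep it short.

    // Hedged sketch of "edge AA": find pixels whose neighbours differ sharply
    // (an edge), then average those pixels with their neighbours (a blur).
    #include <cmath>
    #include <cstdio>
    #include <vector>

    struct Image {
        int w, h;
        std::vector<float> lum; // greyscale values in [0,1], row-major
        float at(int x, int y) const { return lum[y * w + x]; }
    };

    // Blur only where an edge is detected, leaving flat areas untouched.
    Image edgeBlur(const Image& src, float edgeThreshold = 0.1f) {
        Image dst = src;
        for (int y = 1; y < src.h - 1; ++y) {
            for (int x = 1; x < src.w - 1; ++x) {
                // Crude edge metric: horizontal plus vertical contrast.
                float gx = std::fabs(src.at(x + 1, y) - src.at(x - 1, y));
                float gy = std::fabs(src.at(x, y + 1) - src.at(x, y - 1));
                if (gx + gy > edgeThreshold) {
                    // A 3x3 box blur over the edge pixel softens the jaggies,
                    // which is why the result looks slightly smeared.
                    float sum = 0.0f;
                    for (int dy = -1; dy <= 1; ++dy)
                        for (int dx = -1; dx <= 1; ++dx)
                            sum += src.at(x + dx, y + dy);
                    dst.lum[y * src.w + x] = sum / 9.0f;
                }
            }
        }
        return dst;
    }

    int main() {
        // Tiny 8x8 test image: left half dark, right half bright, i.e. one hard edge.
        Image img{8, 8, std::vector<float>(64, 0.0f)};
        for (int y = 0; y < 8; ++y)
            for (int x = 4; x < 8; ++x)
                img.lum[y * 8 + x] = 1.0f;
        Image out = edgeBlur(img);
        std::printf("edge pixel before: %.2f, after: %.2f\n", img.at(4, 4), out.at(4, 4));
    }

Blurring only the detected edges softens the jaggies, but it also softens any legitimate detail sitting on those edges, which is exactly the smearing visible in the comparison shots below.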
Along with S.T.A.L.K.E.R., Dead Space is another game that uses “Deferred Shading”. So I loaded up Dead Space to see how much blur is added to the edges, and sure enough, the difference was right in front of me. The Anti-Aliasing option in Dead Space’s advanced visual settings turns on this edge-blur (“edge AA”) method of Anti-Aliasing. Below are 100% crop PNG images from the screenshots. The game was running at 1920×1200 with the maximum possible settings.
Images with in-game Anti-Aliasing option enabled are on the left.
Notice how blurry the face looks. Also notice the blurry text “Schofield Tools” in the background.
The floor looks blurred in the above comparison, although everything else looks almost the same.
Notice the blurry text in the background.
Although the difference in gameplay does not jump out at you, it is kind of blasphemous to think that the next Crysis, of all games, might not support something as widely expected as Anti-Aliasing. I know how inflamed PC gamers get over the word “port”, but Mr Jones explains:
With CryENGINE® 3 – you don’t really port to each supported platform – you build your game to run on CryENGINE® 3 and it will run on PS®3, Xbox 360™ and PC. Your lead platform is now simply a matter of design – not a requirement to favour one development approach over another due to technical limitations.
Hopefully, PC gamers will get a game tailored to take advantage of all the latest hardware from Intel, AMD and Nvidia at the time of release, even if there might be a certain blurring around the edges.
Update: After digging around some more, it appears that Anti-Aliasing can be enabled with “Deferred Shading” in DX10 mode. Nvidia published a tech paper titled “DirectX 10: The Next Generation Graphics API” which explains:
Access to Multisampling Subsamples
The current method of resolving multisample antialiasing (MSAA) is done at a very late stage in the graphics pipeline called scan out. At this stage, the subsamples cannot be recorded in memory for subsequent use. This limitation means MSAA cannot be used in deferred shading graphics engines. In DirectX 10 it is possible to bind an MSAA render target as a texture, where each subsample can be accessed individually. This gives the programmer much better control over how subsamples are used. Deferred rendering can now benefit from MSAA. Custom resolves can also be computed.
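As a rough illustration of what the paper is describing, here is a hedged Direct3D 10 sketch in C++ that creates a 4x MSAA render target with the shader-resource bind flag set, so a deferred lighting pass could later read the individual subsamples and perform a custom resolve. The function name and structure are mine, purely for illustration; error handling, the rest of the G-buffer, and the shader side are all omitted.

    // Sketch (assuming Direct3D 10 on Windows, linking d3d10.lib) of the
    // capability the paper describes: an MSAA surface that is both a render
    // target and a shader resource, so a later pass can read its subsamples.
    #include <d3d10.h>

    bool createMsaaGBufferTarget(ID3D10Device* device, UINT width, UINT height,
                                 ID3D10Texture2D** tex,
                                 ID3D10RenderTargetView** rtv,
                                 ID3D10ShaderResourceView** srv) {
        D3D10_TEXTURE2D_DESC desc = {};
        desc.Width            = width;
        desc.Height           = height;
        desc.MipLevels        = 1;
        desc.ArraySize        = 1;
        desc.Format           = DXGI_FORMAT_R8G8B8A8_UNORM;
        desc.SampleDesc.Count = 4;   // 4x MSAA: four subsamples per pixel
        desc.Usage            = D3D10_USAGE_DEFAULT;
        // The key DX10 difference: the multisampled surface is also bindable
        // as a shader resource, which DX9 did not allow.
        desc.BindFlags        = D3D10_BIND_RENDER_TARGET | D3D10_BIND_SHADER_RESOURCE;

        if (FAILED(device->CreateTexture2D(&desc, nullptr, tex)))
            return false;
        if (FAILED(device->CreateRenderTargetView(*tex, nullptr, rtv)))
            return false;
        // Viewed as a multisampled texture, the lighting shader can fetch each
        // subsample individually and do its own (custom) resolve after lighting.
        if (FAILED(device->CreateShaderResourceView(*tex, nullptr, srv)))
            return false;
        return true;
    }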
So you would think that all is fine and dandy now, and that the upcoming Crysis will feature Anti-Aliasing. Well, I thought so too, until I came across this:
The above slide comes directly from Crytek and talks about the design goals of CryENGINE 3. So it looks like they might be using EdgeAA (the blurry method of removing jaggies) in DirectX 9, and they end with “….we work on it”. I can only assume that they might implement Anti-Aliasing in DX10 mode.
Supporting Evidence:
Wikipedia link to Deferred Shading
Nvidia “DirectX 10: The Next Generation Graphics API” (PDF Download)
DX10 and higher allows anti-aliasing with deferred rendering, and DX10.1/DX11 makes this process faster and more efficient.
While it’s up to the developer to do this, ATi and nVidia often implement driver-level solutions themselves. They also do this for many DX9 titles, such as UT3 based games.
nVidia generally leads with driver AA compatibility (e.g. they can AA Stalker games), which is one of the main reasons why I stick with them.
In theory ATi’s edge-detect CFAA should work with any deferred renderer because it’s a shader-based post-filter, but sadly that hasn’t been the reality thus far.
You don’t necessarily even need DX10; you just need access to the individual MSAA subsamples. You can do this on consoles since they don’t use a PC graphics API, which is why games like Killzone 2 and Uncharted 2 use MSAA with a deferred rendering approach.
CE3 will use deferred lighting, not deferred shading – read carefully, please. Deferred lighting, also called light pre-pass rendering or pre-lighting, sits somewhere between full deferred shading and normal forward rendering, and one of its major benefits is the ability to use hardware-accelerated MSAA for antialiasing.
They will simply call DoFancySimulatedAA() in the final game ;-). Seriously, I’m not worried, nor do I care if my CPU has to do some of the work. I mean, I have an i7 at 3.8GHz and its cores are usually sleeping, so if it has to do some AA or blur stuff I don’t mind. I also just got a new Nvidia GTX 470 and must say it’s awesome! And all that b.s. about it being hot and noisy etc. is just that, b.s. So I want a new game that fully uses my machine and looks the best it can. I can run Crysis with everything maxed and it doesn’t even flinch at 30 to 60fps. So hopefully this new game will bring it down to maybe 18 to 22 fps and be full of detail. I wouldn’t want lower than 10fps though…