Epic Shows Samaritan Demo Running On Next-Generation NVIDIA ‘Kepler’ GPU
[Interactive comparison images: a zoomed 4x MSAA versus FXAA 3 comparison, a full-scene 1280x720 comparison, and a third comparison of Samaritan without anti-aliasing versus with FXAA 3 anti-aliasing.]
March 7th, 2012
By Andrew Burnes
Last year at the Game Developers Conference, Epic Games unveiled the jaw-dropping Samaritan demo, a look at the next generation of videogame graphics. Set in a dark, futuristic world, the demo utilized a host of advanced rendering techniques that smoothly tessellated and morphed facial features, created realistic street scenes using point-light reflections, and replicated the work of the best movie directors through fine-tuned out-of-focus bokeh filters. The only downside was that it took three GeForce GTX 580s to run the demo in real time.
Today GDC 2012 is upon us, and once again Epic has shown the Samaritan demo, but this time with a twist: instead of three GeForce GTX 580s, the demo ran on a single next-generation NVIDIA graphics card.
At the conclusion of the demo the card in question was revealed to be our much-anticipated ‘Kepler’ GPU, the follow-up to our current-generation Fermi architecture. Although no further information on Kepler was given, the demo sent a clear message: the graphics in Samaritan, generally regarded as a glimpse of the gaming industry’s far-off future, will in fact be possible in the near future on PC systems running a single next-generation graphics card.
But it wasn't just the Kepler graphics card that made the demo possible. Just as important was the addition of Fast Approximate Anti-Aliasing (FXAA), an NVIDIA-developed anti-aliasing technique designed to improve upon Multisample Anti-Aliasing (MSAA), the form of anti-aliasing most commonly seen in today’s games. Used to smooth out jagged edges and improve visual fidelity, anti-aliasing is key to creating the incredible sights of Samaritan.
Despite MSAA's popularity, "it is relatively costly in the demo because Samaritan uses deferred shading," explains Ignacio Llamas, a Senior Research Scientist at NVIDIA who worked with Epic on the FXAA implementation. By writing pixel attributes to off-screen render targets prior to final shading, deferred shading enables complex, realistic lighting effects that would otherwise be impossible with forward rendering, the approach commonly used in many game engines. Combining deferred shading with 4x MSAA has two downsides: first, the render targets require four times the memory, since they must hold four samples per pixel; and second, the deferred shading work is also multiplied by four in areas where multiple pieces of geometry intersect a pixel.
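The two-pass structure Llamas describes can be sketched in miniature. The toy below is a CPU-side illustration only; the fragment format, attribute names, and simple Lambertian shading are assumptions for the example, not Samaritan's actual G-buffer layout:

```python
# Toy CPU-side sketch of deferred shading's two passes (illustrative
# only; the attribute layout is an assumption, not Samaritan's).

def geometry_pass(fragments, width, height):
    """Pass 1: write each fragment's surface attributes into off-screen
    per-pixel storage (the 'G-buffer'), keeping the nearest fragment."""
    gbuffer = [[None] * width for _ in range(height)]
    for f in fragments:  # f = {"x", "y", "depth", "albedo", "normal"}
        cur = gbuffer[f["y"]][f["x"]]
        if cur is None or f["depth"] < cur["depth"]:
            gbuffer[f["y"]][f["x"]] = f
    return gbuffer

def lighting_pass(gbuffer, light_dir):
    """Pass 2: shade each covered pixel exactly once from its stored
    attributes, no matter how many fragments the scene produced."""
    shade = lambda g: g["albedo"] * max(
        0.0, sum(n * l for n, l in zip(g["normal"], light_dir)))
    return [[0.0 if g is None else shade(g) for g in row] for row in gbuffer]

# Two fragments land on the same pixel; only the nearer one is shaded.
frags = [
    {"x": 0, "y": 0, "depth": 1.0, "albedo": 1.0, "normal": (0, 0, 1)},
    {"x": 0, "y": 0, "depth": 0.5, "albedo": 0.5, "normal": (0, 0, 1)},
]
image = lighting_pass(geometry_pass(frags, 1, 1), (0, 0, 1))
```

Deferring the expensive lighting math until after visibility is resolved is what makes scenes with many lights tractable, and it is also why the intermediate render targets become so memory-hungry once MSAA multiplies the per-pixel storage.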
"Without anti-aliasing, Samaritan’s lighting pass uses about 120MB of GPU memory. Enabling 4x MSAA consumes close to 500MB, or a third of what's available on the GTX 580. This increased memory pressure makes it more challenging to fit the demo’s highly detailed textures into the GPU’s available VRAM, and led to increased paging and GPU memory thrashing, which can sometimes decrease framerates.”
"FXAA is a shader-based anti-aliasing technique,” however, and as such “doesn't require additional memory so it's much more performance friendly for deferred renderers such as Samaritan." By freeing up this additional memory developers will have the option of reinvesting it in additional textures or other niceties, increasing graphical fidelity even further.
FXAA can also produce smoother gradients than 4x MSAA, as is particularly evident in the Samaritan demo, shown below in our interactive comparison images contrasting the anti-aliasing quality of MSAA and FXAA 3.
As you can see, FXAA is noticeably smoother, eliminating the pronounced jagged edges that are especially visible when placed against the scene’s background lighting.
If you’d like to learn more about Samaritan, Unreal Engine 3, or any other aspect of the demo, check out our in-depth Samaritan deep dive from May 2011.
For more info on our next-generation Kepler graphics card, stay tuned to GeForce.com.