The Gamer's Graphics & Display Settings Guide
[Page 11] Graphics Settings - Antialiasing & Anisotropic Filtering
One of the major problems of computer graphics is that any curved or angular lines exhibit a lot of jaggedness, almost like the steps on a set of stairs. These jagged lines can also shimmer slightly when animated, which makes the problem even more annoying. This jaggedness in 3D graphics is often referred to by gamers simply as 'jaggies', but the effect is formally known as Aliasing. Aliasing occurs because the image on your screen is only a pixellated sample of the original 3D information your graphics card has calculated. At higher resolutions, as more pixels are used, that sample comes closer to the original source, and hence the image is clearer and displays less aliasing. However, aside from the fact that higher resolutions can degrade performance, many monitors simply cannot display the very high resolutions needed to effectively remove the problem.
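To see why a single sample per pixel produces stair-stepping, consider this small sketch - illustrative Python only, using a toy diagonal edge, and nothing like how real graphics hardware works internally. Each pixel makes a binary on/off decision based on one sample at its centre, so a smooth diagonal edge comes out as hard steps:

```python
# Illustrative sketch: 'rasterize' the line y = x/2 on an 8x8 grid with one
# sample per pixel (the pixel centre). Each pixel is either fully on or
# fully off, so the smooth edge appears as hard stair steps (aliasing).
WIDTH, HEIGHT = 8, 8

def edge(x):
    """The 'true' edge: a line with slope 1/2."""
    return x / 2

rows = []
for y in range(HEIGHT):
    row = ""
    for x in range(WIDTH):
        # Sample only the pixel centre: a binary on/off decision.
        cx, cy = x + 0.5, y + 0.5
        row += "#" if cy < edge(cx) else "."
    rows.append(row)

for row in reversed(rows):  # print with y increasing upwards
    print(row)
```

Each '#' run ends in an abrupt two-pixel step rather than following the true edge smoothly - exactly the jaggedness described above.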
This is where a technique originally called Full Scene Anti-Aliasing (FSAA), and now simply Antialiasing (AA), can be used by your graphics card to make the jagged lines in computer graphics appear smoother without having to increase your resolution. It essentially resamples and blends jagged lines with their surroundings, and can be applied to either 2D or 3D graphics. A screenshot comparison is shown below, demonstrating how AA can work to reduce the jaggedness of lines:
You can also view an animated game screenshot comparison of Antialiasing in action by clicking this link: Antialiasing.gif (763KB).
As always, the obvious question that comes to mind is if antialiasing solves jaggedness, then why don't games come with 'built-in' antialiasing already enabled? The reason is that antialiasing is a graphically intensive task which can use up great amounts of VRAM and GPU calculations depending on just how smooth you want the image to look, and this can dramatically reduce performance. That's why virtually every game, as well as your graphics card's control panel, gives you a range of different types and levels of antialiasing you may wish to enable. Or indeed you may choose to disable it altogether for maximum performance.
The first choice you have to make if you want to enable AA is the sample rate, typically expressed as 2x, 4x, 8x etc. This tells your graphics hardware how many pixel samples to take around the area to antialias - the higher the number, the more pixel samples used to blend the jagged lines, and hence the smoother the image will appear, at the cost of greater processing power and lower performance.
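As a rough illustration of what the sample rate means, the following sketch - again illustrative Python with the same toy diagonal edge, not real GPU code - takes n x n subsamples within each pixel and averages them. Pixels straddling the edge then receive intermediate blend values rather than a hard on/off decision, and the higher the sample count, the finer the gradations:

```python
# Illustrative sketch (not a real GPU implementation): antialias a diagonal
# edge (the region below y = x/2) by taking n x n subsamples per pixel and
# averaging them. Higher sample counts give edge pixels finer blend values.
def pixel_coverage(px, py, n):
    """Fraction of the n*n subsamples that fall inside the shape."""
    inside = 0
    for sy in range(n):
        for sx in range(n):
            # Subsample positions spread evenly across the pixel.
            x = px + (sx + 0.5) / n
            y = py + (sy + 0.5) / n
            if y < x / 2:
                inside += 1
    return inside / (n * n)

# One sample per pixel (no AA) gives only 0.0 or 1.0; 4x (2x2) and
# 16x (4x4) sampling produce intermediate values on edge pixels.
for n in (1, 2, 4):
    print(f"{n * n}x:", [round(pixel_coverage(x, 0, n), 2) for x in range(4)])
```

Those intermediate coverage values are what get blended into the final pixel colour, which is why edges look smoother but each edge pixel costs several times more work.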
Importantly, since your resolution can also affect how many pixel samples are already being taken, and hence how smooth your graphics appear in general, you can experiment with altering both your resolution and AA level to see which provides the best combination of performance and image quality. For example 8x AA is typically sufficient at 1280x720, while you may only need 2x AA to achieve the same effect at 1920x1200, and your overall performance may be better as well. There is no hard and fast rule as to which performs or looks best; it depends on your particular hardware combination, monitor limitations and taste. Remember that on LCD monitors, when the display is not at its native resolution, a crude form of antialiasing with no performance impact is already in effect, because your graphics card or monitor has to digitally rescale the image and in doing so has to recalculate and blend pixels - see the end of the Resolution section of this guide for more details.
There are different methods of undertaking antialiasing, and although I can't cover them in full detail here, we can take a quick look at what they are and what they basically do.
The most common form of AA a few years ago was Supersampling which is a brute force method that pretty much just increases the resolution of the entire image within the graphics card to remove jaggedness, then rescales it back down to display at your chosen resolution. This method reduces performance a great deal, but works with almost any game.
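The supersampling approach can be sketched in a few lines - once more illustrative Python with a hypothetical one-line 'scene' function standing in for a real renderer: draw the whole frame at a multiple of the target resolution, then average each block of high-resolution samples down to one final pixel:

```python
# Illustrative supersampling sketch: render a toy scene at double the
# target resolution, then box-filter (average) each 2x2 block of samples
# back down to one pixel at the target size.
FACTOR = 2          # render at 2x the target resolution (4 samples/pixel)
TARGET_W, TARGET_H = 4, 4

def shade(x, y):
    """Toy 'scene': 1.0 inside a diagonal half-plane, 0.0 outside."""
    return 1.0 if y < x / 2 else 0.0

# 1. Render the whole frame at the higher internal resolution.
hi_w, hi_h = TARGET_W * FACTOR, TARGET_H * FACTOR
hi = [[shade((x + 0.5) / FACTOR, (y + 0.5) / FACTOR) for x in range(hi_w)]
      for y in range(hi_h)]

# 2. Downscale: each final pixel is the average of its FACTOR x FACTOR block.
final = [[sum(hi[y * FACTOR + dy][x * FACTOR + dx]
              for dy in range(FACTOR) for dx in range(FACTOR)) / FACTOR ** 2
          for x in range(TARGET_W)]
         for y in range(TARGET_H)]

for row in final:
    print(row)
```

Note that step 1 shades every pixel of the enlarged frame regardless of whether it sits near an edge, which is why supersampling is so expensive: the whole scene costs FACTOR-squared times more fill work, not just the jagged parts.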
Multisampling (MSAA) is a newer form of AA which came about through incorporating optimizations into graphics hardware to perform more efficient AA. It can still reduce performance, especially at higher resolutions, but less so on more recent graphics cards. MSAA provides a good compromise between image quality and performance.
Quincunx, Transparency, and Gamma Correct Antialiasing are variations of antialiasing methods used by Nvidia graphics cards and are covered on this page of my Nvidia Forceware Tweak Guide.
Temporal and Adaptive Antialiasing are variations of antialiasing methods used by ATI graphics cards and are covered on this page of my ATI Catalyst Tweak Guide.
From the GeForce 8 series onwards Nvidia introduced Coverage Sampling Anti-Aliasing (CSAA) which can produce better results than the standard Multisampling (MSAA) approach, at around the same performance as MSAA. Similarly, ATI introduced Custom Filter Anti-Aliasing (CFAA) as of the HD 2000 series which can also provide better results than standard MSAA depending on the filter used, but may blur the image slightly.
More recently, as of the HD 6000 series, ATI has introduced Morphological Anti-Aliasing (MLAA), which uses post-processing to apply a form of full-scene anti-aliasing to any game, with performance similar to edge-detect CFAA but with better results. Nvidia has also introduced Fast Approximate Anti-Aliasing (FXAA), which is similar to MLAA. In both cases, the main drawback is that they can introduce some blurring to the scene. A more recent variant of MLAA, known as Subpixel Morphological Anti-Aliasing (SMAA), performs quite well while avoiding this blurring.
Finally, the most recent development is the introduction of a form of Temporal Anti-Aliasing by Nvidia known as TXAA. TXAA provides a result that is similar to a blend of MSAA and FXAA; that is, a good reduction in jaggedness similar to MSAA, but with some blurring similar to FXAA. The key benefit of temporal AA is that it reduces the shimmering and crawling of aliased edges while a scene is in motion.
For an example of the visual and performance impact of the various basic levels of AA in a recent game, see this Crysis 3 Antialiasing page.
The key point is that on modern graphics cards, antialiasing can be achieved at a reasonable performance cost because they have been designed to incorporate these techniques into their hardware. The older the graphics card, and/or the less VRAM it has, the more likely that it will suffer a larger performance loss from enabling AA. In the end whether you enable any antialiasing is up to you, and the particular mode you use should be the result of some experimentation on your part to see what looks best in your particular circumstances.
In the Antialiasing setting, we saw that AA is a useful method for reducing the jaggedness of lines in graphics, and in particular in games. However AA will not resolve another graphical glitch common in 3D graphics - namely blurry surfaces on more distant objects. Textures are 2D images which have been placed onto the surface of 3D objects in a game to make them look realistic. They can appear at all sorts of angles and distances, depending of course on which part of the game world they've been applied to. A brick patterned texture may be applied to the walls of various buildings for example. A concrete or asphalt texture may be applied to the ground. And so on, until all 3D objects are covered in these wallpaper-like textures.
Depending on your texture detail settings in a game, these textured surfaces should appear crisp and relatively detailed close up. However unfortunately as they recede into the distance away from the viewer, they start to show a strange blurriness, sometimes with clear gradations between levels of blur. Without getting overly technical, the problem is essentially that there just aren't enough samples of the original texture image being shown on screen, leading to distortion, loss of detail and general blurriness.
This is where a technique called Texture Filtering can help to raise the level of detail of distant textures by using increasing numbers of texture samples to construct the relevant pixels on the screen. To start with there are two basic forms of texture filtering called Bilinear Filtering and Trilinear Filtering. These texture filtering methods are 'isotropic' methods, meaning they use a filtering pattern which is square and works the same in all directions. However as we've seen the common problem with texture blurriness in games occurs on textures which are receding into the distance, which means they require a non-square (typically rectangular or trapezoidal) filtering pattern. Such a pattern is referred to as an 'an-isotropic' filtering method, which logically leads us to the setting called Anisotropic Filtering (AF). A screenshot comparison is shown below, demonstrating the way AF improves distant textures:
You can also view an animated game screenshot comparison of Anisotropic Filtering by clicking this link: Anisotropic.gif (868KB).
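The difference between the two footprint shapes can be sketched as follows - toy Python, with a hypothetical one-dimensional stripe pattern standing in for a real texture. At a grazing angle, one screen pixel covers a long, thin strip of the texture; a square isotropic footprint effectively samples only near the centre of that strip, while an anisotropic filter spreads its samples (corresponding to the 2x/4x/8x/16x AF levels) along the stretched axis and averages them:

```python
# Illustrative sketch (toy code, not a real GPU filter): a floor texture of
# alternating stripes viewed at a grazing angle, so one screen pixel covers
# a long, thin strip of texels.
def texel(u):
    """Toy 1D stripe texture: alternating bright (1.0) and dark (0.0).
    Assumes non-negative u."""
    return 1.0 if int(u) % 2 == 0 else 0.0

def isotropic_sample(u_centre):
    # Square footprint: effectively a single sample at the strip's centre,
    # ignoring the rest of the stretched footprint.
    return texel(u_centre)

def anisotropic_sample(u_start, length, taps):
    # Rectangular footprint: 'taps' samples spread evenly along the
    # stretched axis (taps corresponds to the AF level), then averaged.
    return sum(texel(u_start + (i + 0.5) * length / taps)
               for i in range(taps)) / taps

# A footprint covering 8 stripes: the true average brightness is 0.5.
print(isotropic_sample(4.0))             # samples a single stripe: 1.0
print(anisotropic_sample(0.0, 8.0, 16))  # averages the whole strip: 0.5
```

The isotropic result flickers between fully bright and fully dark depending on which stripe the centre happens to land on - the shimmering blur you see on distant floors - while the anisotropic average stays stable at the true brightness of the strip.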
You may wonder why on earth game developers don't just build texture filtering into their games so that it automatically fixes up the blurry textures. The answer, as you would expect by now, is that advanced texture filtering can be graphically intensive: the higher the number of texture samples, the better the image quality but the lower the performance, especially on older graphics cards. And so, just like Antialiasing, virtually every game, as well as your graphics card's control panel, gives you a range of different types and levels of texture filtering you can enable or disable.
You can select from a variety of texture sample rates that range from 1x to 16x depending on your particular graphics hardware. The higher the sample rate, the crisper distant textures will appear, at the cost of lower performance. The performance impact varies; however, more recent graphics cards use highly optimized methods of selective AF and can apply even 8x or 16x AF with minimal impact on FPS, even at high resolutions.
For the most part there are only two modes usually available for AF - Performance and Quality. The image quality difference between the two is not significant, so Performance mode is fine for most people. Performance mode uses bilinear filtering as its basis, while Quality mode uses trilinear filtering as the basis, but in both cases anisotropic filtering is obviously applied on top, giving a combined effect which is more detailed than just using bilinear or trilinear by itself. Remember that isotropic filtering patterns don't help much for textures receding into the distance, which is why anisotropic filtering is always necessary to truly reduce such forms of blurriness.
Texture filtering, like Antialiasing, is also affected by screen resolution. Since a higher resolution provides more pixels to sample, even without texture filtering, it generally makes textures look crisper. Regardless of the resolution, some games may still exhibit noticeable texture degradation as the textures recede into the distance, so some level of AF can always help improve image quality. Or if distant blurry textures aren't a major issue for you, you can disable AF altogether to maintain the highest FPS.
That finishes our look at various graphics and display settings. The next page brings the guide to a conclusion and has a range of resources for you to read if you're interested in finding out more about these settings.