
Morphological Anti-Aliasing

AMD Radeon HD 6870 And 6850: Is Barts A Step Forward?
By Chris Angelini

Morphological anti-aliasing (AA) is an all-new option for the Radeon HD 6000-series cards. It presents a different approach to the aliasing problem in that it needs no insight into the makeup of the scene’s geometry; morphological AA is a post-process filtering technique, accelerated with DirectCompute and compatible with any application from DirectX 9 to 11 (in theory). After a frame is rendered, it is passed through the morphological AA shader that looks for high-contrast edges and patterns consistent with aliasing. It then blends the colors of adjacent pixels to approximate a smooth transition along a line instead of aliased steps. This means that the smoothing effect isn’t limited to the edges of geometry or alpha textures like CFAA; it applies to all of the pixels in the scene.
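The edge-detect-then-blend idea is simple enough to sketch in a few lines. The following is a toy, CPU-side illustration of a morphological-AA-style post-process filter, not AMD's actual DirectCompute shader: the luminance weights, the contrast threshold, and the 3x3 box blend are illustrative choices of ours, and a real implementation also matches aliasing patterns along edges rather than blending every flagged pixel uniformly.

```python
import numpy as np

def mlaa_like_filter(img, threshold=0.1):
    """Toy post-process AA: flag high-contrast luminance edges,
    then blend flagged pixels toward their 3x3 neighbourhood average.
    img is an HxWx3 float RGB array in [0, 1]."""
    # Relative luminance (Rec. 709 weights).
    luma = img @ np.array([0.2126, 0.7152, 0.0722])

    # Flag both pixels on either side of any sharp horizontal or
    # vertical luminance discontinuity -- the "high-contrast edge" test.
    edge = np.zeros(luma.shape, dtype=bool)
    d_h = np.abs(np.diff(luma, axis=1)) > threshold
    edge[:, :-1] |= d_h
    edge[:, 1:] |= d_h
    d_v = np.abs(np.diff(luma, axis=0)) > threshold
    edge[:-1, :] |= d_v
    edge[1:, :] |= d_v

    # 3x3 box average of each pixel's neighbourhood (edges replicated).
    h, w = luma.shape
    padded = np.pad(img, ((1, 1), (1, 1), (0, 0)), mode="edge")
    box = sum(padded[dy:dy + h, dx:dx + w]
              for dy in range(3) for dx in range(3)) / 9.0

    # Blend only the flagged pixels; everything else passes through,
    # which is why the filter is cheap compared to super-sampling.
    out = img.copy()
    out[edge] = 0.5 * img[edge] + 0.5 * box[edge]
    return out
```

Because the filter only ever sees final pixel colors, it has no way to distinguish a geometry edge from, say, the edge of a letter of text, which is exactly the limitation the article runs into below.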

Conceptually, this method promises results similar to super-sampling, but with performance comparable to edge-detect AA. AMD suggests that some applications will look better than others, and that the technique is not ideal for all scenes and games. So, we tried it ourselves to see what the actual result looks like.

Morphological anti-aliasing works independently of the regular anti-aliasing settings, so controlling it might seem confusing at first. Even with the “Use application settings” checkbox ticked, morphological AA still takes effect as long as its own option is enabled, and it can be combined with the regular AA modes if you want.

With morphological AA enabled, we fired up Battlefield: Bad Company 2. The post-processing filter didn’t seem to make any difference at all. Careful checking verified that, indeed, it does not seem to work with this title. So much for automatic compatibility with all DirectX 9 to 11 games.

This brings us to StarCraft 2, a title of particular interest here because Radeons are currently known for slow AA performance in this game. With morphological AA enabled, the results are visually obvious, and yet performance remains almost as fast as it is with the feature turned off (roughly twice as fast as 4x multi-sampling, and about four times as fast as 4x super-sampling). This is a much more impressive result.

Performance is only as impressive as the image quality it delivers, however. And while morphological AA does smooth out the edges, it can have some less-than-ideal effects on the appearance of the scene. I noticed a little crawling on the edges of moving objects: a very slight shimmering. This is not surprising, as the post-process effect does not have access to geometry details. It’s doing its best on a frame-by-frame basis, but there is likely no temporal information stored to smooth edges between frames. The effect becomes less noticeable as the resolution increases, as with most aliasing artifacts.

The worst-case scenario for morphological AA is low resolutions combined with text. The effect appears to work on a per-pixel basis, which makes sense. But the unfortunate side effect is that, since there are fewer pixels in a low-res scene, there can be undesirable smoothing on things that you’d prefer would stay sharp. Text is a prime example of this. The smoothing isn't as apparent at higher resolutions. Here are some examples of what I mean:

It’s important to note that because morphological AA is a post-process effect, the resulting output can’t be captured with a regular screen capture utility like FRAPS. The comparison images above were created with an application that AMD provided. It uses the same code in the driver to modify the output, just as morphological AA does when running a game.

In any case, morphological AA is an interesting addition to the existing AA tool set. Folks running high resolutions may find themselves enabling the feature on almost everything because of its low performance impact. Users with displays limited to lower resolutions might want to consider regular AA modes due to the visual artifacts on text. The good news here is that more conventional AA techniques tend to be more playable in low-resolution environments anyway.

  • jamie_macdonald, 22 October 2010 16:40 (+0)
    Will wait to see them in a real PC first... sounds half OK though. We'll see what the high-end ones perform like ^^
  • tranzz, 22 October 2010 18:11 (+0)
    Would be nice to see dynamic tessellation from the software, so visuals scale to maintain frame rates across a variety of cards.
  • nesters, 22 October 2010 18:31 (+0)
    From what I have seen, those cards scale pretty well in CrossFire, with HD6870 CF outperforming HD5870 CF in some games.
  • mi1ez, 22 October 2010 19:52 (+1)
    WHATWHATWAHT!
    Quote:
    But this sure would be a good time to introduce a card with a fully-equipped GF104 and 384 CUDA cores enabled (Ed.: I can’t comment, but I know something that you don’t, Don).

    Naming scheme aside, these look like pretty competent cards and I for one am looking forward to the high end and indeed 7-series cards these are supposedly leading up to.
  • the_krell, 22 October 2010 20:02 (+0)
    Just to correct your Shakespeare...
    Wherefore art thou, Radeon 6700? Should read Wherefore art thou, Radeon 6800?

    Wherefore means Why? For what reason?

    :) 
  • aje21, 22 October 2010 20:16 (+1)
    Quote:
    Meet AMD Accelerated Parralel Processing (APP)

    At least the image had Parallel spelt correctly ;-)
  • dizzycriminal, 22 October 2010 20:59 (+0)
    All the other review sites have found the 6850 performs better than the 460 1GB, so I'm not sure what to believe. Apart from that, now is looking like a good time to get a new GPU. A 6850 is looking like the way to go.
  • lemonadesoda, 22 October 2010 21:45 (+0)
    Read each review's "test setup" carefully. You will see that Tom's has used CURRENT and not LAST MONTH'S drivers. Read the intro. There is a slight OC on this GTX 460 BECAUSE all 68xx cards are OC'ed. Tom's used the same average OC.
  • aln135, 22 October 2010 22:08 (+0)
    These two new mid-range cards seem pretty good, and considering they are £150 for the HD6850 and £200 for the HD6870, a very good buy indeed. Can't wait for the reviews of the HD6970 and HD6990 though.
  • Anonymous, 22 October 2010 23:28 (+0)
    Why are you benching with no AA? At least use 2xAA. Unless your tests show that the HD68xx can't hack AA.
  • sam_p_lay, 23 October 2010 04:58 (+0)
    Mi1ez - we already had the Radeon 7000 series. That's where it started, before moving onto the 8500 and then the 9700 and 9500. ATi/AMD have gone right around the clock now - they may need to come up with a new name. Or, just follow nVIDIA's example and knock off a '0' and start using 3-digit model numbering.

    Did anyone else notice a few charts with the size of the bars not matching the numbers? There was a 5850 score of 23fps that I'm pretty sure was meant to read 33fps based on the fact that the 27fps GTX460 score below it had a longer bar.

    Also surprised THG didn't make a bigger deal of the 94% FPS gain from Crossfiring these new Radeons... that's even better than the average gain from two-way GF104/GF106 SLI! And if morphological AA can deliver supersampling level smoothing with negligible FPS and definition loss at decent res, that's some very attractive value add!
  • LePhuronn, 25 October 2010 02:38 (+2)
    Quote:
    sam_p_lay: Mi1ez - we already had the Radeon 7000 series.


    OK, Radeon HD 7000 series, Mr Pedantic ;-)
  • Anonymous, 25 October 2010 15:21 (-2)
    looking at the local prices here the 5850 is the same price as the gtx460 and the 5870 is more expensive than the gtx470

    Looking at that and the quality of drivers it's easy to say that NVIDIA is the pick of the moment, this might change if prices start shifting around
  • makrish, 25 October 2010 20:59 (+0)
    Quote:
    putsomethinghere: looking at the local prices here the 5850 is the same price as the gtx460 and the 5870 is more expensive than the gtx470. Looking at that and the quality of drivers it's easy to say that NVIDIA is the pick of the moment, this might change if prices start shifting around


    Not the 5870 or the 5850, the 6870. The next gen. The 6870 is about 200 pounds on ebuyer.com (UK prices), and the 5870 is 280, and the 470 is about 250. This puts the 6870 at 50 pounds cheaper, and it scales better than 5870's in CF. Realistically, this is the best card out there for its price range. I'd prefer two of these to a 5870 or even a 5970.
  • LePhuronn, 25 October 2010 21:27 (+1)
    Roll on HD 6990 vs GTX 485

    Full-fat Southern Islands vs full-fat Fermi

    XD
  • ben BOys, 26 October 2010 00:11 (+0)
    Goes to show price/performance is the outright winner against performance on its own. Nvidia need to step up their game if they want to catch their falling crown.
  • williehmmm, 26 October 2010 01:37 (+0)
    Quote:
    makrish: Not the 5870 or the 5850, the 6870. The next gen. The 6870 is about 200 pounds on ebuyer.com (UK prices), and the 5870 is 280, and the 470 is about 250. This puts the 6870 at 50 pounds cheaper, and it scales better than 5870's in CF. Realistically, this is the best card out there for its price range. I'd prefer two of these to a 5870 or even a 5970.


    Ebuyer shows the 6870 starting at £193, but the GTX 470 starting at £186.

    Hopefully this cut-throat competitive pricing remains; then we all benefit.

    I paid over £300 each for my pair of GTX 470s, 7 months ago. Depreciation of £230, but by golly, I've had a lot of 3D stereoscopic fun in that time!
  • Silmarunya, 26 October 2010 03:32 (+1)
    While their naming scheme is pretty absurd, the cards themselves deliver. Nvidia's GTX 460 enjoyed a brief moment of utter dominance at the €200 price point, while the new 6850 offers same performance, same price and lower power consumption. As such, Nvidia's Fermi series now fails to dominate a single segment of the market. Sad for a company that wore the performance crown for years...

    The 6870 isn't a clear winner in its market segment, but it's certainly worth buying. As such, I think AMD just made another winning move.

    Bring on the 6970, with some luck it's enough to hand the single GPU performance crown back to AMD. And then Nvidia has... erm... CUDA and PhysX. The first isn't useful for gamers, the second is nothing more than a gimmick. Life's good for AMD fans atm :) 
  • williehmmm, 26 October 2010 17:59 (-1)
    Quote:
    Silmarunya: And then Nvidia has... erm... CUDA and PhysX. The first isn't useful for gamers, the second is nothing more than a gimmick. Life's good for AMD fans atm


    Nvidia has 3D Vision and a backbone of support for 3D stereoscopic compliance in significantly more titles. AMD is going to be catching up for years, and even then relying on third-party software, drivers and peripherals, so they have very little influence in getting it to work right.

    Not to mention that if you've spent several hundred pounds/euros on a high end 5xxx series card, you now have to ditch that and spend out again to get 3D functionality.

    Nvidia have made cards as low-spec as the GT 220 and GT 240 3D Blu-ray compatible, just by adding that support in drivers back to those cards. That AMD can't even do that for its flagship card, the 5970, which folk have spent £400 - £500 on, is just a sin.

    AMD fans seem to be getting fleeced. Despite Nvidia's aggressive pricing (a good thing for everyone) and their PR machine kicking into gear in reaction, at least they keep adding functionality to their existing line-up of cards.

    As for Nvidia not dominating a single segment, neither do AMD. The price/performance statistics show they are even; there is no outright winner, it's a score draw. If you're an AMD fan you're getting a good deal buying their card; if you're an Nvidia fan, you're getting a good deal buying their card.
  • sam_p_lay, 27 October 2010 05:01 (+1)
    LePhuronn - Radeon HD 7000 had occurred to me, but I hope to God they don't do it! Reckon they've got their money's worth out of the Radeon name and should come up with something new... was disappointed with nVIDIA for sticking with GeForce after the 9 series, though I suppose it makes sense with such a strong brand profile.