AMD Radeon HD 3800: The Empire Strikes Back

Direct3D 10.1: Quality, practically

Aiming at quality

With Direct3D 10.1, Microsoft has focused on rendering quality more than on any other new feature, and the main focal point is antialiasing. First, support for 4x antialiasing is now mandatory for 32-bit (RGBA8) as well as 64-bit (RGBA16) buffers. Furthermore, sample positions are now specified by the API and must be configurable: without going as far as freely programmable sample positions, an application must at least be able to choose between several predefined patterns.
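To make the idea concrete, here is a small CPU-side sketch (not API code) of what "specified by the API" means: the standard 4x pattern is documented as fixed offsets from the pixel centre, expressed in 1/16-pixel units, so an application can rely on exactly where the samples fall. The offset values below are the widely documented D3D10.1/D3D11 standard 4x pattern.

```cpp
#include <array>
#include <cassert>

// The documented standard 4x sample offsets, in 1/16-pixel units
// from the pixel centre (D3D10.1/D3D11 standard pattern).
struct Offset { int x, y; };
constexpr std::array<Offset, 4> kStandard4x = {{
    {-2, -6}, {6, -2}, {-6, 2}, {2, 6},
}};

// Convert an offset to a sample position in [0,1) pixel space,
// where (0.5, 0.5) is the pixel centre.
struct Pos { float x, y; };
Pos samplePosition(Offset o) {
    return { 0.5f + o.x / 16.0f, 0.5f + o.y / 16.0f };
}
```

Because every sample offset is a multiple of 1/16, all four positions are exact binary fractions, which is what lets the API pin them down unambiguously.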

Beyond stricter specifications, Microsoft has also sought to rationalize antialiasing management by giving programmers much more control and relying less on the GPU manufacturers' homemade recipes. One has to admit that until now, users faced a set of options quite disconcerting to beginners: on top of the antialiasing levels (2x, 4x, 8x) came transparency antialiasing, which filters alpha textures in either multisampling or supersampling mode, plus each Independent Hardware Vendor's (IHV) specific features such as CSAA or CFAA. With Direct3D 10.1, programmers can finally choose between multisampling and supersampling per primitive, and they also get access to each pixel's coverage mask, which gives them control over which samples the shaders are applied to.
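The cost difference between the two modes can be sketched with a simple model (illustrative only, not API code): the coverage mask says which samples of a pixel the primitive touches; multisampling runs the pixel shader once and replicates the result to the covered samples, while supersampling runs it once per covered sample.

```cpp
#include <cassert>
#include <cstdint>

// Count the covered samples in a coverage mask (popcount).
int coveredSamples(uint32_t mask) {
    int n = 0;
    for (; mask; mask &= mask - 1) ++n;
    return n;
}

// Shader invocations needed for one pixel under each mode:
// multisampling shades once per pixel, supersampling once per
// covered sample. A pixel with no covered samples is skipped.
int shadingCost(uint32_t coverageMask, bool supersampling) {
    int covered = coveredSamples(coverageMask);
    if (covered == 0) return 0;
    return supersampling ? covered : 1;
}
```

This is why a per-primitive choice matters: a shader with high-frequency detail (alpha-tested foliage, say) can pay the supersampling cost only where it helps, while everything else stays at multisampling cost.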

Radeon HD 3800

Finally, whereas Direct3D 10 gave access to the samples of a multisampled color buffer, it is now possible to do the same with a multisampled depth buffer.
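A CPU-side sketch of why per-sample depth access is useful (the threshold and the edge test are illustrative assumptions, not part of the API): once a shader can read each depth sample of a pixel, it can for instance detect geometry edges by checking how much the samples diverge, and run a custom resolve only there.

```cpp
#include <array>
#include <cassert>
#include <cmath>

constexpr int kSamples = 4;

// One pixel of a 4x multisampled depth buffer: one depth per sample.
struct Pixel {
    std::array<float, kSamples> depth;
};

// Flag a pixel as a geometry edge when its depth samples diverge
// beyond a threshold -- a typical use of per-sample depth access
// in custom resolves (the 0.01 threshold is an arbitrary example).
bool isDepthEdge(const Pixel& p, float threshold = 0.01f) {
    float mn = p.depth[0], mx = p.depth[0];
    for (float d : p.depth) {
        mn = std::fmin(mn, d);
        mx = std::fmax(mx, d);
    }
    return (mx - mn) > threshold;
}
```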

In practice, most of these features aren't new. Each manufacturer already implemented them in its own way and exposed them through its drivers. What is really new is that Direct3D 10.1 finally opens all of this up to game programmers. From now on, driver developers will no longer be in charge of devising new antialiasing modes; game programmers will handle them according to the specific needs of their engines, much like on consoles, where programmers have access to a lower hardware level.

Microsoft therefore gives developers the best of what is currently available, while waiting for fully programmable ROPs, which would make all of this even more flexible and cleaner.

And practically?

In practice, don't hope for much in the meantime. We are still waiting for developers to master Direct3D 10 and to stop being held back by the Direct3D 9 versions of the engines they must keep upgrading, so there is little chance that they will rush towards Direct3D 10.1: the hardware is barely out, and the API won't be available until Vista's Service Pack 1 in 2008.

Nevertheless, some features should allow for interesting effects. In particular, cube map arrays could simplify dynamic reflections, even if one must not forget the impact on the rest of the pipeline. In today's games, dynamic reflections are usually applied only to the main elements, and updated far less often than the screen refresh rate, in order to save fill rate. Cube map arrays remove one restriction, on the number of simultaneous reflections, but they don't remove the others. We will therefore wait to judge the feature in real games rather than in a handful of demos tailored by AMD or Microsoft.
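To see what a cube map array buys, here is a sketch of the addressing involved (illustrative model, not API code): a cube map array packs many cube maps into one resource, so a shader can pick a reflection probe with a dynamic index instead of binding one cube map per object. Face selection from a direction vector follows the standard major-axis rule, with faces ordered +X, -X, +Y, -Y, +Z, -Z as in Direct3D.

```cpp
#include <cassert>
#include <cmath>

// Standard cube-map face selection from a direction vector:
// the axis with the largest magnitude picks the face, and its
// sign picks between the two faces on that axis. Ties resolve
// in axis order (X before Y before Z).
int cubeFace(float x, float y, float z) {
    float ax = std::fabs(x), ay = std::fabs(y), az = std::fabs(z);
    if (ax >= ay && ax >= az) return x >= 0 ? 0 : 1;  // +X / -X
    if (ay >= az)             return y >= 0 ? 2 : 3;  // +Y / -Y
    return z >= 0 ? 4 : 5;                            // +Z / -Z
}

// With a cube map array, the 2D slice actually addressed combines
// the dynamic cube index with the selected face.
int arraySlice(int cubeIndex, float x, float y, float z) {
    return cubeIndex * 6 + cubeFace(x, y, z);
}
```

The dynamic `cubeIndex` is the point: one draw call can shade objects reflecting different probes, instead of one state change (and potentially one pass) per probe.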

Independent blending modes for each buffer when using multiple render targets (MRT) should ease the development of deferred shading engines. Combined with the ability to read the antialiasing samples of color and depth buffers, those engines will no longer be forced to abandon antialiasing in favor of a vague blur of questionable value.
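A minimal sketch of the idea (a CPU model with made-up names, not the Direct3D blend-state API): each render target gets its own blend function, applied independently, so for example a light-accumulation buffer can blend additively while a G-buffer channel is simply overwritten in the same pass.

```cpp
#include <array>
#include <cassert>
#include <cstddef>
#include <functional>

// One blend function per render target. The two modes below are
// illustrative: additive accumulation and plain overwrite.
using BlendFn = std::function<float(float dst, float src)>;

const BlendFn kAdditive  = [](float d, float s) { return d + s; };
const BlendFn kOverwrite = [](float d, float s) { (void)d; return s; };

// Apply each target's own blend mode independently -- the new
// freedom D3D10.1 grants over D3D10, where one mode applied to all.
template <std::size_t N>
std::array<float, N> blendMRT(const std::array<BlendFn, N>& modes,
                              const std::array<float, N>& dst,
                              const std::array<float, N>& src) {
    std::array<float, N> out{};
    for (std::size_t i = 0; i < N; ++i) out[i] = modes[i](dst[i], src[i]);
    return out;
}
```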

The other new features bring more comfort to developers than real benefits to gamers.

Comments from the forums
  • darthpoik
    Are AMD/ATI ever going to produce a card that actually beats the geforce 8800gtx? It has been ages. I have a gtx and it is a bit of an anticlimax being able to turn everything up to full and play normally. I think I liked it better 'wishing' to be able to do so and overclocking to get closer.
  • LePhuronn
    It's anti-climactic to be able to play any game at maximum quality? Um...OK.

    Personally I'd be happy with that as it means my £400+ investment will last me a good number of years.

    If you're disappointed that there's games you can't play I have a 6800 Ultra I'll happily swap for your 8800 GTX

  • spoonboy
    512mb versions of the 3850 (yes there are some) look like a total bargain. Overclock to a 3870 with all features and video memory, for pretty small beans. Good job ATI.
  • spuddyt
    why did you buy an 8800 gtx if you didn't want to turn everything to full? anyway it's interesting about the dx10.1 bit....
  • darthpoik
    Thank you LePhuronn for the offer, but I think I will pass. I like the card but just expected something more, having never owned the top graphics card before. AMD/ATI still need to produce a gtx beater so that we can get a beefier nvidia card.
  • darkstar782
    Actually the 8800GTX can't run everything in max detail by any means.

    I have an 8800GTX SLI system and I struggle with Crysis on medium settings @2560x1600.

    I'm waiting for something faster... whether it is from ATI or nVidia.

    ATI continue to disappoint.
  • sosrandom
    I get what he's saying, the GTX is a year old and can run UT3 at 1600x1200 with AA very comfortably
  • nicolasb
    "The Empire Strikes Back"? No: 8800GTX was the Empire striking back; we were hoping RV670 would be "Return of the Jedi", but it's turned out not to be. (Possibly because of a lack of ewoks).

    On a more important note: where are the Crossfire and SLI scores? The great thing about these new cards (both the 8800GT and the 3850/3870) are the fact that you're getting what, just months ago, was enthusiast-level performance for mainstream-level pricing. This makes SLI and Crossfire immensely much more affordable than they have ever usefully been before.

    Previously it was always the case that you got better price/performance from a single high-end card than you got from two mid-range ones. Now, for the first time, that may no longer be true: 3850s in Crossfire might even outperform 8800GTX some of the time, and they're actually *cheaper* than single GTX.

    So, come on: where are the benchmarks?

    Finally, your noise level measurements are obviously flawed: you've got a 43dB noise floor, resulting from components other than the graphics card, or possibly from stuff going on outside the case. So it doesn't matter how quiet the GPU fan goes, you'll always read ~43dB. The cooler on the 3850 is rated at just 31dB, which is *miles* below the noise level you get from an 8800GT. Your figures are misleading.
  • perzy
    What about the real important issues well does it run folding@home? Huh, THG why dont you comment on that?
    At least I want to know if it's as good as the 1900 at crunching lifesaving data!
  • inthere
    you can't play Crysis with everything on maximum with an 8800 gtx, not at any resolution of 1600x1200 or over
  • bobwya
    The whole thing reminds me of when I couldn't play Doom 3 @1600x1200 at Ultra quality. Now I can with an upgrade from a 128Mb 9800Pro to a watercooled 512Mb OC X1950 Pro... and get 60FPS constantly.

    With AMD/ATI going down the toilet Nvidia is not getting enough pressure to move on the next generation (1Gb+ cards with enough horsepower to handle HD gaming). Rebranding 2xxx cards as 3xxx is pretty desperate!! That is the bottom line... Even with VERY deep pockets you will still struggle to get high quality textures running @1920x1200 (native 1080p the true resolution of BD and HDDVD disks).

    I have a feeling that AMD/ATI may not be around much longer. If a company isn't diversified (like Sun) then a failure in your core business means you are pretty screwed. If ATI didn't have products like the X1950Pro they would be in real trouble already...

    nicolasb -> Previous comment about Crossfire. THG said the driver was unstable for the new 3xxx cards in the introduction. Perhaps you have problems reading??

    Ah well, have to wait till 2009 for that monitor upgrade!!

  • RichUK
    I don't suppose this card would perform better on an AMD 770 chipset? Seeing as they're marketing it as the whole "Spider" platform and whatnot. Just interested to know if there is anything in their marketing other than the scaling.

    I look forward to seeing what kind of scaling these will produce, because that does seem to be their main selling point. As the previous guy said, you can get 2 of these cards for less than a GTX, and potentially equal performance, while still leaving room for another 2 cards =)

    The HD 2900 XT's scaling results were actually pretty impressive, SLI showing a 50% boost at best, while crossfire showing as good as 90% in some games.
  • perzy
    Well what about folding@home? Are the new cards as good at that as the 1900 series? Hello! Reality check THG!