SLI / CrossFire FAQs - page 4

  1. And here is the final answer:
    http://www.anandtech.com/video/showdoc.aspx?i=3209

    AMD is the first out of the gates with the Radeon HD 3870 X2, based on what AMD is calling its R680 GPU. Despite the codename, the product name tells the entire story: the Radeon HD 3870 X2 is made up of two 3870s on a single card.
    Reply to Maziar
  2. Let me fix the quote for you.

    Quote:
    AMD is the first out of the gates with the Radeon HD 3870 X2, based on what AMD is calling its R680 GPU. Despite the codename, the product name tells the entire story: the Radeon HD 3870 X2 is made up of two 3870 GPUs on a single card.


    The 9800GX2 is two PCBs facing each other, bleeding heat into the same heatsink, with a fan running air over it for cooling. Each PCB has a single GPU on it. The 3870x2 is a single PCB with two GPUs on it. The one in the back has an aluminum cooler, while the GPU in the front has a copper one. It does indeed have two 3870s on it, but they are facing the same direction, on the same PCB. If you count "cards" by the number of PCBs, then the 9800GX2 is a dual card, while the 3870x2 is not.
    Reply to 4745454b
  3. If what you say is true, then why does it say it's actually two HD 3870s on a single card, and why, in tests that use two HD 3870 X2s, do they call it QUAD-CrossFire?
    Reply to Maziar
  4. I do not count each PCB as a separate card.

    By that logic, QuadFX was a quad core just like Intel's (and not two dual cores with all the downfalls people mentioned... even Intel's is 2 x dual core anyway), since it was on the same board with two sockets and two memory controllers.

    The X2 and GX2 do the same thing: with one or two boards, they still have two full cards (minus the video outputs). And that's the limit, since each card has to have its own memory... imagine if they could have 1GB and both cards could use it (like how Core 2 has a shared cache; now that would be a dual-core GPU and not two cards slapped together). As it stands now, high res and high AA do affect the small 512-megabyte buffer; see the sketch below.
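
    To put rough numbers on that, here's a tiny sketch (my own illustration, not anyone's spec) of why mirrored memory means two 512MB cards still behave like one 512MB pool, while a true shared pool would not:

    ```python
    # In SLI/CrossFire each GPU keeps its own full copy of textures and buffers,
    # so memory is mirrored rather than pooled.
    def effective_vram_mb(per_gpu_mb: int, num_gpus: int, shared_pool: bool = False) -> int:
        """Usable video memory for a multi-GPU setup, in MB."""
        if shared_pool:
            # The imagined shared design (like Core 2's shared cache).
            return per_gpu_mb * num_gpus
        # Today's mirrored case: capacity doesn't add up.
        return per_gpu_mb

    print(effective_vram_mb(512, 2))                    # 512  (two 512MB GPUs today)
    print(effective_vram_mb(512, 2, shared_pool=True))  # 1024 (the imagined shared 1GB)
    ```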

    Just my 2 cents
    Reply to nukemaster
  5. @Maz, because they are counting the GPUs like you are. Two 3870x2s in CF would be four 3870 GPUs, hence "quadfire". As I said, it's all in how you count. PCBs, you have two. GPUs, you have four.

    @nuke, actually you might have just proved my point.

    Quote:
    The X2 and GX2 do the same thing: with one or two boards, they still have two full cards (minus the video outputs).


    The X2 only has the three standard outputs that all video cards have. The 9800GX2, however, has four, two for each card. Go read some reviews and figure out what's underneath those heatsinks. Obviously you guys aren't understanding/believing me.

    3870x2 http://www.newegg.com/Product/Product.aspx?Item=N82E16814121228
    9800GX2 http://www.newegg.com/Product/Product.aspx?Item=N82E16814500026
    Reply to 4745454b
  6. I think I got what you are saying:
    the 9800GX2 has two PCBs, so it's actually two cards;
    the HD 3870X2 has one PCB, so it's one card.
    Right?

    And the reason AnandTech says it's two HD 3870s is..........?
    Reply to Maziar
  7. So?
    Reply to Maziar
  8. 4745454b said:

    The X2 only has the three standard outputs that all video cards have. The 9800GX2, however, has four, two for each card. Go read some reviews and figure out what's underneath those heatsinks. Obviously you guys aren't understanding/believing me.

    3870x2 http://www.newegg.com/Product/Product.aspx?Item=N82E16814121228
    9800GX2 http://www.newegg.com/Product/Product.aspx?Item=N82E16814500026

    I only see three outputs, two DVI and one HDMI, but for all I know the DVI and HDMI are linked (like on many DVI/HDMI onboard cards)...

    Either way it's not four, so :P

    I know EXACTLY what's under the heatsinks... two cards on one PCB (ATI) and two cards on two PCBs (Nvidia). Either way, the lack of S-Video for SD TV users on the 9800GX2 SUCKS...

    Look, an X2 with four video outputs:
    http://www.techspot.com/review/86-ati-radeon-hd-3870-x2/
    Reply to nukemaster
  9. The reason Anand says it is two is because there are two GPUs, and you are already using one of the CF links. When you say card, I am old school and picture the PCB. It's a bit like saying a C2D is two CPUs: it's two cores on one PCB. I don't see how this is any different.

    @nuke, I was referring to the only pic of the 9800GX2 that I've seen. Looking back at it, there are the two DVI ports, one HDMI, and the fourth one I saw is probably an optical out for audio. Tell me what you think it is.

    http://www.tomshardware.com/2008/01/05/exclusive_nvidia_geforce_9800gx2/
    Reply to 4745454b
  10. ok thanks mate :)
    Reply to Maziar
  11. Well, it's still not on NVIDIA's site, but I will put it in the FAQ as soon as it comes out.
    thanks
    Reply to Maziar
  12. I would almost call a Pentium D and a Core 2 Quad two cores (well, two dies) on one PCB (substrate).

    Anyway, there is no point in arguing about this. It's a video card... good enough...

    Oddly, they appear to have dropped the optical port. Any clue whether Nvidia even has an onboard sound solution for HDMI, or could the optical have been an input for HDMI pass-through? (Warning: wild guess!!!)

    A quick google shows this may be true...

    http://ketzone.com/blog/?p=152

    Maybe it was dropped because people thought it was an output when in fact it's just an input for the HDMI port.
    Reply to nukemaster
  13. flip_x said:
    How did they get the BlackBird from HP to run xfire on a striker extreme board??


    HP worked with Asus and modified the chipset. It's not a stock Striker Extreme, and HP has the money, resources, and engineering muscle to do that.
    Reply to bliq
  14. Edited, with lots of thanks to emp and 4745454b, who helped a lot with the
    HD 3870X2 and 9800GX2 architecture.
    Reply to Maziar
  15. I cannot enable CrossFire with a PowerColor 1GB HD 3870 X2.
    My motherboard is a GIGABYTE P35-DQ6. I have already upgraded to the latest BIOS from Gigabyte's web site. I tried both Vista 32-bit and Vista 64-bit.

    What is wrong with it?



    Reply to TuVNeRa
  16. If I'm reading everything right, there is no problem. If you have one 3870x2, you can't (and don't need to) enable CF. The CF link is made internally on the card; it's invisible to the user. This also allows you to use the 3870x2 on non-CF boards, either single-slot or Nvidia-chipset motherboards. You would need two 3870x2s to enable CF, which I don't see in your Device Manager. (You have two 3870x2 entries listed, but you should get one listing for each DVI port. That's at least how it works with my older x1800xt.)

    Anyone feel free to correct me if I'm wrong.
    Reply to 4745454b
  17. I have only one 3870 X2.
    But a moderator from another forum told me a CF menu must be shown in Catalyst Control Center, and it should show up in Device Manager like this:



    Reply to TuVNeRa
  18. You may want to include something about the stability of one card vs. multiple cards. Moving from one 1600xt to two, and again from one 3870 to two, I started having some stability problems.
    Also, I love my CrossFire setups, but when I moved from two 1600xts to a 1900xt it was an insane improvement. I know there are lots of reasons for the difference, but even though adding more cards to a system will improve performance, I still like the idea of the single "monolithic" card, like mine was for me or how the 8800GTX was for a long time.
    I got excited when the 1GB-512bit 2900xts came out, until they sucked. I hope ATI tries that again with these newer multi-GPU parts to make CrossFire more effective. I think even 1GB-256bit may work with PCI-E 2.0, but I wouldn't know if a 256-bit bus is wide enough for a gig of GDDR3/4 (rough math below).
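
    For what it's worth, here's the back-of-the-envelope math on that bus question. This is only a sketch; the transfer rate is an example figure I picked for GDDR3-era memory, not any card's actual spec:

    ```python
    # Peak memory bandwidth: (bus width / 8) bytes per transfer * transfers/sec.
    def bandwidth_gb_s(bus_width_bits: int, effective_mts: int) -> float:
        """Peak memory bandwidth in GB/s."""
        return bus_width_bits / 8 * effective_mts / 1000

    print(bandwidth_gb_s(512, 1656))  # ~106 GB/s on a 512-bit bus
    print(bandwidth_gb_s(256, 1656))  # ~53 GB/s: same memory, half the bus

    # Bus width sets bandwidth, not capacity: a 256-bit bus can address a full
    # gig of GDDR3/4 just fine, it only moves the data half as fast.
    ```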
    Anywho nice post.
    Reply to blotch
  19. I think I've solved the problem.
    The Vista 64-bit drivers do not work properly.
    This screenshot is from Vista 32-bit:

    Reply to TuVNeRa
  20. What version of Windows are you using?
    Reply to 4745454b
  21. I think he said Vista 32-bit.
    But it's strange; it should work in both Vista 64-bit and 32-bit, not only 32-bit.
    Reply to Maziar
  22. I'm planning on buying a computer with 4 GB of RAM and a 2.4 GHz quad-core processor. I have the choice of getting a single 768 MB Nvidia 8800 GTX graphics card or dual Nvidia 8800 GT 512 MB. Which is better?
    Reply to vincegreg
  24. What resolution do you play at?
    Reply to Maziar
  25. Hi, I've got a quad core and an Asus P5K-E WiFi (CrossFire), and I've read in an earlier post that on some boards it's possible to make CrossFire boards work as SLI boards. Can that be done with this Asus P5K-E WiFi? I've got an XFX 8800 GTS 640 MB graphics card and want to know if I can get another GTS for SLI mode.

    THX
    Reply to fast_furious
  26. Hello and welcome to the forums mate :)

    Can you use SLI on a CrossFire board, or can you use CrossFire on an SLI board?
    Well, in general, the answer is NO. But it's said that if you hack the drivers, you can use SLI on a CrossFire board or CrossFire on an SLI board.
    Caution: there is no guarantee that hacking the drivers will let you use SLI on a CrossFire motherboard or CrossFire on an SLI board, so do it at your own risk!
    Reply to Maziar
  27. Maziar said:
    Hello and welcome to the forums mate :)

    Can you use SLI on a CrossFire board, or can you use CrossFire on an SLI board?
    Well, in general, the answer is NO. But it's said that if you hack the drivers, you can use SLI on a CrossFire board or CrossFire on an SLI board.
    Caution: there is no guarantee that hacking the drivers will let you use SLI on a CrossFire motherboard or CrossFire on an SLI board, so do it at your own risk!


    Thx mate, do you know anyone that has done it? Any sites, mates?

    thx :hello:
    Reply to fast_furious
  28. No problem mate :) Glad I could help you :)

    Well, actually I wrote this guide, and I have covered it in this thread :)
    Go to the first page and you will see it :)
    Reply to Maziar
  29. Edited
    Reply to Maziar
  30. Edited again with special thanks to Paul :)
    Reply to Maziar
  31. Good job mate, this is very good for people thinking about SLI/CrossFire. A++
    Reply to fatty35
  32. Thanks a lot mate :) Glad it helped you.
    Reply to Maziar
  33. Just wanted to add a couple of links supporting the usefulness of SLI at low res in rare cases like Crysis. You have to look at the 9800GX2, which is basically SLI G92 in a dual-PCB, single-PCI-e-slot package.

    Crysis very high:

    1280x1024 4xaa/16xaf
    http://www.digit-life.com/articles2/digest3d/0408/itogi-video-cr3-wxp-aaa-1280-pcie.html

    Notice the spanking the GX2 gives a single G92 like the 8800GTS 512MB: 30 fps vs 18 fps. No single GPU hits 19 fps in this one.

    1280x1024 very high no aa/af
    http://www.digit-life.com/articles2/digest3d/0408/itogi-video-cr3-wxp-1280-pcie.html

    Even without FSAA/AF, it's still 35 fps vs 22 fps, showing the playable difference SLI makes even at 12x10 with no FSAA.


    Now going down to 1024x768 with AA, there is still a big difference: 37 fps vs 25 fps:
    http://www.digit-life.com/articles2/digest3d/0408/itogi-video-cr3-wxp-aaa-1024-pcie.html

    And at a puny 1024x768 with no FSAA, it's 38 fps vs 27 fps average.
    http://www.digit-life.com/articles2/digest3d/0408/itogi-video-cr3-wxp-1024-pcie.html

    There's proof right there that SLI doesn't need high resolution to scale properly. It just needs GPU-demanding settings at whatever the resolution; quick scaling math below.
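
    To put numbers on that scaling, here's a quick sketch using the fps figures quoted above (9800GX2 vs. a single G92 8800GTS 512MB):

    ```python
    # SLI scaling from the Digit-Life results quoted above.
    def scaling_pct(sli_fps: float, single_fps: float) -> float:
        """How much faster the dual-GPU setup is, as a percentage."""
        return (sli_fps / single_fps - 1) * 100

    tests = {
        "1280x1024 4xAA/16xAF": (30, 18),
        "1280x1024 no AA/AF":   (35, 22),
        "1024x768 with AA":     (37, 25),
        "1024x768 no AA":       (38, 27),
    }
    for name, (sli, single) in tests.items():
        print(f"{name}: +{scaling_pct(sli, single):.0f}%")
    # Roughly +41% to +67%: healthy scaling even at these low resolutions.
    ```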

    Anyway, you will notice that the 9800GX2 leads a lot of their low-res gaming tests, but in most games the single GPUs still do well enough. If you don't hit a CPU limitation, SLI can scale well at low res; whether you need dual GPUs to be playable is a different story. But without question, Crysis very high warrants SLI at any resolution, as a single GPU struggles.

    This goes along with why I think 9600GT or 8800GT SLI is a good option to consider over a single GPU of equal pricing. More often than not, 9600GT SLI will beat a single 9800GTX, and sometimes it's a crushing win. And to be honest, SLI and driver support have matured a lot. It would be the rare game that loses because of SLI issues, and in most of those games a single midrange 9600GT or 8800GT will be plenty anyway. A new game may need a patch or driver support for SLI to work or scale properly, but TWIMTBP usually makes that a top priority for NV to save face. I'm not saying there aren't good reasons to consider a single GPU vs. SLI (there are), just that there are also good reasons to consider certain midrange SLI solutions over a top single GPU. The introduction of the 9600GT and low 8800GT prices have made them excellent SLI options.
    Reply to pauldh
  34. Thanks a lot Paul :) As usual, your information is great and useful.
    Yes, the drivers have changed a lot, and I have mentioned that too; I have also talked about the games that are GPU-limited and benefit from SLI even at low resolutions.
    Thanks again mate :) I appreciate it :)
    Reply to Maziar
  35. :) No problem.

    I saw you made the changes, then just saw those Digit-Life very high Crysis tests this morning, and I thought they were worth getting into this thread as hard data to back up the changes. These are the best low-res, very high Crysis benchies I have seen.

    Anyway, for anyone considering a high-end GPU like a 9800GTX vs. SLI 9600GT: for $250-300 right now (in the USA anyway), save money and buy an 8800GTS G92, or go SLI 9600GT or SLI 8800GT. Obviously the exact price varies for everyone around the globe, so recommendations would change.
    Check each game here: http://www.firingsquad.com/hardware/nvidia_geforce_9800_gtx_performance_preview/page4.asp
    Reply to pauldh
  36. Thanks Paul, edited again :)
    The reason I didn't post those links on the first page is that, although they are accurate, I want to write more rather than just post links, and as I said, I have covered SLI at low resolutions too.
    Reply to Maziar
  37. I know this might sound ridiculous after reading this very informative info about SLI/CrossFire, but I still have something to ask which I didn't quite get. I have an SLI mainboard with an Nvidia chipset, running an ATI X1600XT (CrossFire card). Now the question is: can I use another ATI X1600XT (CrossFire or non-CrossFire card) to make an SLI-style pair? And thanks to you, Maziar, for this post. I have indeed learned a lot that I didn't know before.
    Reply to Jeanmarie576
  38. Jeanmarie576 said:
    I know this might sound ridiculous after reading this very informative info about SLI/CrossFire, but I still have something to ask which I didn't quite get. I have an SLI mainboard with an Nvidia chipset, running an ATI X1600XT (CrossFire card). Now the question is: can I use another ATI X1600XT (CrossFire or non-CrossFire card) to make an SLI-style pair? And thanks to you, Maziar, for this post. I have indeed learned a lot that I didn't know before.

    The short answer is no. The long answer is that it is, or at least was, possible with hacked drivers on some hardware. I'd say go with the short answer, as the chances of the long one working for you are likely nil. :)
    Reply to pauldh
  39. Thanks a lot, Paul, for your help :)

    To Jeanmarie:
    I have talked about it on the first page:
    Can you use SLI on a CrossFire board, or can you use CrossFire on an SLI board?
    Well, in general, the answer is NO. But it's said that if you hack the drivers, you can use SLI on a CrossFire board or CrossFire on an SLI board.
    Caution: there is no guarantee that hacking the drivers will let you use SLI on a CrossFire motherboard or CrossFire on an SLI board, so do it at your own risk!

    And by the way, I am glad you liked it and found it useful.
    Reply to Maziar
  40. Thank you both, Paul and Maziar, for taking the time to answer my question and clear my doubts ;) Hacking drivers is not my cup of tea, since I have no idea how to do it; and even if I took the risk, I could still end up with cards I don't want anymore. Thanks again...
    Reply to Jeanmarie576
  41. BTW Maziar, thank you for your dedication to not only updating but also following up on people's questions in this FAQ. Like Cleeve's FAQ, it takes dedication to maintain. Much kudos to both of you.
    Reply to pauldh
  42. Thanks a lot mate :) This FAQ would never be this good if guys like you, emp, nuke, 4745454b, GreatApe... hadn't helped me with it.
    I want to keep updating this FAQ with all the new technology and other things.

    Thanks again for helping me in this thread (and in this forum too), I appreciate it :)
    Reply to Maziar
  43. I read the article, and several others, and have a couple of questions/comments. I'm looking at picking up a new high-end system soon, and the SLI/CrossFire issue came up for me, so I've been researching it while shopping for the system.

    First, the proprietary hardware architecture from both manufacturers raises a frown from me. I'm not much into the old "my way" system: MicroChannel flopped with that against the more open (if slower) EISA standard of that generation, and other attempts at proprietary hardware have flopped over time because folks get annoyed with it. But if a "standard" is to be born, it often starts in such a way, so it's something I can live with...

    One thing that struck me about all of this: how the two technologies seem to run with respect to "operating" and "deployment/maintenance". If you find errors in this, please let me know.

    SLI uses a recognition model similar to the old Voodoo 2 one: the drivers have to recognize the software "image" for multiple cards to kick in for a given package. If the drivers don't recognize it, the video config drops back to a 'default' behavior of one card and simply skips all the fancy multi-card stuff, so for most "non-supporting" software you gain nothing from having two cards (a toy sketch of this lookup model follows the next paragraph).

    That old "Voodoo" model was hel". Every time any of your games or apps were updated, the "image" changed so the card would stop working in dual mode until you updated the drivers. With modern day update services, this shouldn't be a big issue -- it can automate the checking 'in the background' -- but is that going to be from NVidia or from each app manufacturer? (as in any software supporting the system will have to submit app image information to NVidia and NVidia will release the driver updates "when they get to it" or what not... Then you have the "fragmentation" potential as you have 20 different games and apps that get patched differently so each goes out to update the drivers or a consolidated update done daily or what not (meaning you lose "multi-card support" until drivers are updated at times...) See where I'm going with "image" and "drivers"? A LOT of potential changes to your "drivers" if this is still the model being proposed -- just picture a "bad update" where NOTHING recognizes 2 cards or the like...

    CrossFire is more transparent to the user. It tries to work with whatever is being sent to the video card. It doesn't require custom drivers to be updated for the cards to recognize the "supporting software"; the cards just try to render whatever is to be displayed on the screen as best they are able. It also only uses the "single card" fallback as a last resort, after trying to render with multiple cards, based on "advanced" user-configurable settings in the drivers.

    Now, this I gleaned from a few different articles, and the issues don't seem to be covered very well, but as you can see, one model puts the load on companies to release driver updates for every patch from "hundreds" of application vendors, while the other is transparent. I think I know which one I'll be more comfortable choosing. That transparency is something I rather like, versus a potentially "busy" system that keeps updating drivers...

    Again, if you have any info on this maintenance side of the two technologies, I'd appreciate it. I really don't have much on it beyond references to how SLI decides to "fall back to default mode" and how CrossFire has configuration options for how multiple cards should handle video output "by default"...
    Reply to Eleazaros
  44. Regarding that issue you mentioned: if it's only with Voodoo, then the problem is with Voodoo, but if it happens with other manufacturers too, then it's Nvidia's fault.

    Also, I will edit this FAQ with new information about SLI/CrossFire.
    Reply to Maziar
  45. Just wondering: do PC games that support a multi-GPU configuration (SLI or CrossFire) really look and run better than games on a single graphics card? Looking for anyone who can give me a real-world-experience answer in this area.
    Reply to ErikS22
  46. Well, in some cases the difference is big. For example, in Crysis at 1920x1200, two 8800GTXs perform a lot better than one 8800GTX, and you will see/notice the difference.
    Reply to Maziar
  47. Edited.
    Reply to Maziar
  48. Edited with a new chart.
    Reply to Maziar
  49. I think it would be nice to have a bit about multiple monitors as well.

    1) You have two or three monitors and two GTX 260s or 3870 X2s. Let's say your middle monitor is 30" and the side monitors are 20-24". Or let's say you have three 20" monitors running full screen.

    What are the possibilities for gaming? Will the side monitors be disabled while both cards power the single monitor? Will all three monitors remain active, with the primary monitor getting a boost from the second card? Can all three monitors be used in any game without a solution such as TripleHead2Go?

    2) What about dual-monitor support? How do these multi-card solutions work with two monitors?

    I've been told that ATI supports triple monitors by allowing the side monitors to be disabled automatically, whereas Nvidia makes it a pain.

    I don't know any of these answers, but I sure would like to. I think it would be a great addition to your fantastic FAQ.
    Reply to jtabler
  50. Well, I mainly want to talk about multi-GPU itself, the advantages/disadvantages of the technology, and related things. I want to focus on that, not on monitors, sorry.
    Reply to Maziar