
SLI / CrossFire FAQs - page 2

  1. Like what?
    Reply to trek6500
  2. Early testing shows the GT in SLI consistently outperforming the GTX (which is double the price), but wait and see whether a price adjustment for the GT, GTS, and GTX comes in the next few weeks before jumping to any conclusions.
    Reply to emp
  3. A single 8800GTX costs a little more than two 8800GTs in SLI, and it consumes less power, produces less heat, and has more memory. However, in games that benefit from SLI, two 8800GTs beat a single 8800GTX.
    Reply to Maziar
  4. Single GTXs can be had in the US for $450-470, so it's not always more than twice the price (especially not with the current GT price gouging), but when compared to the Ultra, once again you get into a 2+:1 price ratio.
    Reply to TheGreatGrapeApe
  5. Hi, I have a small question.

    When you go SLI, is it better to have the same card twice, or can the cards come from different companies? I already have an EVGA 8800GTS 640MB Superclocked and want to add a second one, but I've found some at a lower price that aren't EVGA. Can you tell me if there are any problems you know of that could come from mixing two cards?

    Thanks.
    Sorry, English is not my first language.
    Reply to jive
  6. What about brands?
    Well, the brand doesn't matter. For example, you can use an XFX card with an EVGA card, or a SAPPHIRE card with a DIAMOND card; just make sure they have the same memory and same clocks.
    Reply to Maziar
  7. Can I mix and match graphics cards with different sizes of memory?
    While it is not recommended, NVIDIA does offer this flexibility using Coolbits. When purchasing a second graphics card, you should try to match the memory size so that you are ensured full value and performance from your purchase. For example, if your first card is a GeForce 6600 GT with 128MB of memory, you should purchase a second GeForce 6600 GT with 128MB of memory. However, using Coolbits (value set to 18), you can force both of the cards to use the lower of the two memory sizes and operate together in SLI mode. When dissimilar memory sizes are enabled to work together using Coolbits, the effective memory size for each card becomes the smaller of the two memory sizes. Instructions to enable this feature can be found here.

    Can I mix and match graphics cards if one of them is overclocked by the manufacturer?
    Yes. A GeForce 7800 GTX that is overclocked (for example, a BFG GeForce 7800 GTX OC) can be mixed with a standard-clocked GeForce 7800 GTX.

    http://www.slizone.com/page/slizone_faq.html
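
    For anyone who wants to try the mixed-memory trick, here is a minimal sketch of setting that Coolbits value from Python on Windows. The registry path and value name are assumptions based on where ForceWare-era drivers kept this setting, so verify them against your driver version:

        import winreg  # Windows-only standard library module

        # Assumed location of the Coolbits DWORD for ForceWare-era drivers.
        KEY_PATH = r"SOFTWARE\NVIDIA Corporation\Global\NVTweak"

        def set_coolbits(value=18):
            # 18 is the value the FAQ above gives for mixed-memory SLI.
            key = winreg.CreateKey(winreg.HKEY_LOCAL_MACHINE, KEY_PATH)
            try:
                winreg.SetValueEx(key, "CoolBits", 0, winreg.REG_DWORD, value)
            finally:
                winreg.CloseKey(key)

        set_coolbits()  # run as administrator, then reboot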
    Reply to kpo6969
  8. Thanks for the information
    Reply to Maziar
  9. Maziar said:

    Warning: AMD/ATI cards need a CrossFire Edition of their model to run in CrossFire mode (with the exception of the AMD/ATI HD series)


    I have two of the same model X1950 Pro 256MB cards working together. Neither is branded as a CrossFire Edition. The two that I have are made by Sapphire. This may not be the case for all X1950 varieties out there though.
    Reply to ira176
  10. That's strange, but thanks for telling me; I will edit my post.
    Reply to Maziar
  11. Thanks for all this info - really helpful but still a bit confused.
    I have just upgraded to an Asus mobo, as it has 2 PCIe slots which both run at x16 and I want to go CrossFire later. The manual says the board supports CrossFire. Does this mean I don't need a CrossFire version of the second graphics card and can just buy a second "standard" card which matches my current one?
    Reply to Blue frog
  12. What is the model of the card you are considering?
    Reply to Maziar
  13. I already have a Sapphire X1950XT and was planning on the same.
    Reply to Blue frog
  14. Thanks
    Reply to Blue frog
  15. No problem, glad I could help.
    Reply to Maziar
  16. Maziar said:
    That's strange, but thanks for telling me; I will edit my post.


    Maziar,
    I know it sounds strange, but maybe these cards are each natively "CrossFire Edition". Here's a pic of the two exact same card models out of the box and inside the box:
    http://img455.imageshack.us/img455/797/computerpixmay92007002yb0.th.jpg
    http://img455.imageshack.us/img455/1278/74039333nx6.th.jpg
    Reply to ira176
  17. I guess this information from Wikipedia.org sums it up.


    "X1900 and X1950 series
    The X1900 and X1950 series fixes several flaws in the X1800 design and adds a significant pixel shading performance boost. The R580 core is pin compatible with the R520 PCBs meaning that a redesign of the X1800 PCB was not needed. The boards carry either 256 MiB or 512 MiB of onboard GDDR3 memory depending on the variant. The primary change between R580 and R520 is that ATI changed the pixel shader processor to texture processor ratio. The X1900 cards have 3 pixel shaders on each pipeline instead of 1, giving a total of 48 pixel shader units. ATI has taken this step with the expectation that future 3D software will be more pixel shader intensive.[10]

    In the latter half of 2006, ATI introduced the Radeon X1950 XTX. This is a graphics board using a revised R580 GPU called R580+. R580+ is the same as R580 except for support of GDDR4 memory, a new graphics DRAM technology that offers lower power consumption per clock and offers a significantly higher clock rate ceiling. X1950 XTX clocks its RAM at 1 GHz (2 GHz DDR), providing 64.0 GB/s of memory bandwidth, a 29% advantage over the X1900 XTX. The card was launched on August 23, 2006. [11]

    The X1950 Pro was released on October 17 2006 and was intended to replace the X1900GT in the competitive sub-$200 market segment. The X1950 Pro GPU is built from the ground up on the 80 nm RV570 core with only 12 texture units and 36 pixel shaders. The X1950 Pro is the first ATI card that supports native Crossfire implementation by a pair of internal Crossfire connectors, which eliminates the need for the unwieldy external dongle found in older Crossfire systems."


    It seems that only the X1950 Pro, which is based on the RV570 core, supports CrossFire natively. The other X1900/X1950 cards may still need a CrossFire Edition card and the external CrossFire dongle cable.
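
    Incidentally, the bandwidth figures in that quote check out. Here's the arithmetic as a quick sketch (the 256-bit bus width and the X1900 XTX's 1.55 GHz effective memory rate are my own figures, not from the quote):

        # Effective data rate (GHz) x bus width (bits) / 8 bits per byte = GB/s.
        def bandwidth_gbs(effective_ghz, bus_bits=256):
            return effective_ghz * bus_bits / 8

        x1950xtx = bandwidth_gbs(2.00)  # 64.0 GB/s, as the article says
        x1900xtx = bandwidth_gbs(1.55)  # 49.6 GB/s for the older card
        print(round((x1950xtx / x1900xtx - 1) * 100))  # 29 -> the quoted 29%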
    Reply to ira176
  18. Thanks for the info :)
    Well, you may be right; maybe they are both CrossFire Edition.
    Reply to Maziar
  19. Hi everybody,

    All this info is really nice, but it doesn't solve my problem.

    I run Vista Ultimate x64 on an ASUS M2N32-SLI Deluxe motherboard with 2 Asus EN7950GX2.

    I've tried everything and searched all over the web, I just can't get any SLI options in the Nvidia Control Panel.

    I've re-seated both cards, installed the latest driver, updated my motherboard's BIOS, and run all of Vista's updates. It's no use.

    Anybody know the solution to this?
    Reply to Zeddicus
  20. I thought it was a bit odd that you'd take the time to write this, as I've already posted such a thing several times in the forums. This website has answers to nearly all the SLI questions you could come up with: http://www.slizone.com/page/slizone_faq.html Frankly, I would have just told people to go there instead of trying to write what you wrote. As GGA said, there are some problems, so let's get at them.

    Quote:
    BUT the 7800GTX 512 will lower its clock to operate with the 256 one , so it wont have its true power.


    Incorrect. The 512MB card won't lower any of its clocks at all. It will simply use only half of its RAM. The clocks will stay the same, so it will still use its "true power". I would also mention that to do this, you have to use Coolbits. If this were to be a sticky, I would post what you need to do in Coolbits to make this work. From what I've heard, GF8 cards don't work with Coolbits, so I'm not sure this can be done. You also failed to mention whether this is possible with CrossFire.

    Quote:
    just make sure they have the same memory and same clocks


    Uhmmmm, I thought we said memory and clock speeds don't matter? I know what you are trying to say: why spend the money on a card with more VRAM if you aren't going to use the extra RAM? If you already have a card with 256MB of RAM, buying another card with 256MB would be more cost-efficient, as one with 512MB would cost more.

    Quote:
    WARNING: Some RAMs have SLI or CrossFire logo on them.


    Seeing as you brought these up, you need to explain (briefly) what these are.

    Quote:
    WARNING: Therer are some Power Supplies which aren't in the List


    That should be "There are" (or "There're").

    Quote:
    (I dont mean SLI or CrossFire wont be good for resolutions like 1600x1200 or lower , i am just saying that SLI or CrossFire shines in higher resolutions .


    You have a starting bracket but no ending one (you are missing one of these: ")" ). And the next sentence starts with the word "Also", which is a grammatical no-no.

    Quote:
    Do SLI or CrossFire double the memory ?


    Does....

    Quote:
    This may not be the case for all X1950 varieties out there though.


    There are three X1xxx cards that support CrossFire without a CrossFire Edition card: the first is the X1950 Pro, next is the X1950 GT, and the last is the X1650 XT. These are the last three cards to come out from ATI, and the only three that have the internal CrossFire bridge. The only card you can pair with one of these for CrossFire is another card with the same chipset. Read this for more info: http://ati.amd.com/technology/crossfire/howitworks.html (look for the squares that are colored red in the middle with a dot in them).
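
    Put as data, the rule from that paragraph looks like this (a sketch only, covering just the three cards named above and treating the model name as a proxy for the chipset):

        # X1xxx cards with the internal CrossFire bridge (no CF Edition needed).
        NATIVE_CF_CARDS = {"X1950 Pro", "X1950 GT", "X1650 XT"}

        def can_crossfire(card_a, card_b):
            # Native-bridge cards pair only with a card using the same chipset.
            return card_a == card_b and card_a in NATIVE_CF_CARDS

        print(can_crossfire("X1950 Pro", "X1950 Pro"))  # True
        print(can_crossfire("X1950 Pro", "X1650 XT"))   # False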

    Seeing as at one point you used the wrong abbreviation for CrossFire, I thought I'd point out the right one: write out CrossFire the first few times, but after that you can switch to CF. I'd also try a bit harder with the "2 vs 1" section; getting two 8800GTs is cheaper than a single Ultra, and is probably a much better idea. Last, you seem to be writing this as an SLI page; if it's going to include CF information, include CF information. Reading this, it seems to be SLI first, with CF an afterthought.
    Reply to 4745454b
  21. I noticed Tom's latest review of Crysis in SLI states "two gts cards = 1500 meg - or whatever", but you say it doesn't work like that. I had also read that only one card's memory counts in dual-GPU configurations. Where's the truth, then?

    Ryan
    Reply to Ironnads
  22. here:


    • Two SLI-enabled Nividia GeForce 8800 GTX @ 769 MB (total available graphics memory 1535 MB)


    Is this shaboddle then?
    Reply to Ironnads
  23. There might be 1.5GB of memory total across both cards, but when looking at the amount a SINGLE card has access to, then no. It's important to think of the frame buffer. Each card only has 768MB of space for its frame buffer. This matters because the more things you want to do in the frame buffer, the more space you need: if you want to enable 8 levels of AA, you need more memory to handle that than if you only wanted 2 levels, and 1600x1200 needs more memory than 1280x1024.

    Is it "shaboddle"? You could say no, as each card has 768MB of memory, with the "total available graphics memory 1535 MB". You could also say it is, as each card is still limited to 768MB of memory.
    Reply to 4745454b
  24. Ironnads said:
    here:


    • Two SLI-enabled Nividia GeForce 8800 GTX @ 769 MB (total available graphics memory 1535 MB)


    Is this shaboddle then?

    That includes the card's memory and shared system memory.
    I have a 8800GT SC 512mb and 3GB system memory.
    total available graphics memory 1791mb
    dedicated graphics memory 512mb
    dedicated system memory 0mb
    shared available system memory 1279mb
    Reply to kpo6969
  25. I don't think so. Why would your 8800GT share system memory? That's for onboard cards or TurboCache-type cards, not cards with their own dedicated RAM. I also don't understand where the 1791MB comes from.

    Ironnads, can you link what you quoted, please? Perhaps it makes more sense with more context. I do know what I wrote, though: each individual card is limited to the RAM it has; it can't use the other card's RAM. (Although if they are TurboCache cards or use system RAM, then they have whatever RAM is onboard plus whatever system RAM they are allowed. Why you would want to SLI TC cards, I'll never know...)
    Reply to 4745454b
  26. 4745454b said:
    I don't think so. Why would your 8800GT share system memory? That's for onboard cards or TurboCache-type cards, not cards with their own dedicated RAM. I also don't understand where the 1791MB comes from.

    Ironnads, can you link what you quoted, please? Perhaps it makes more sense with more context. I do know what I wrote, though: each individual card is limited to the RAM it has; it can't use the other card's RAM. (Although if they are TurboCache cards or use system RAM, then they have whatever RAM is onboard plus whatever system RAM they are allowed. Why you would want to SLI TC cards, I'll never know...)

    VISTA
    Did you ever hear of it?
    Did you read what I posted?
    "dedicated graphics memory 512mb"
    http://www.microsoft.com/whdc/device/display/graphicsmemory.mspx
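
    In short, Vista's WDDM adds the card's dedicated memory to whatever system RAM it is willing to share with the GPU. A sketch of the sum (the function and parameter names are mine, for illustration only):

        # WDDM's "total available graphics memory" is just the sum of the
        # three figures in the report above (dedicated system memory is 0
        # here because the 8800GT has its own RAM).
        def total_graphics_memory_mb(dedicated_video, dedicated_system, shared_system):
            return dedicated_video + dedicated_system + shared_system

        print(total_graphics_memory_mb(512, 0, 1279))  # 1791, as reported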
    Reply to kpo6969
  27. Yes, I did read what you said; you never mentioned Vista. You also said it was the result of the local RAM along with the shared system memory. I do admit that Vista might have changed things. Rather than act like the @$$ you have proved yourself to be, why don't you KINDLY (ever hear of that?) explain to us how Vista works differently? Why would Vista use system RAM for the video card?
    Reply to 4745454b
  28. 1. The explanation is in the link I posted, dated Jan 2007 by Microsoft.
    2. Your insults mean nothing to me; I am not the one with a vendetta.
    3. If you want to discuss facts, that's fine. I do provide documentation for what I post.
    4. I only replied with facts about your mis-statement in the midst of your attack on the OP's sticky thread.
    Reply to kpo6969
  29. If you want to talk about the things I have written, then I am open to suggestions, but if it's something which is not related to my thread, please settle it somewhere else :)
    thanx
    Reply to Maziar
  30. Maziar, it has come to my attention that I was attacking your post. I thought I was pointing out areas that needed a bit more work before this was stickied. My apologies if your feelings were hurt, though I suspect I was misled.
    Reply to 4745454b
  31. No problem mate :) I'm always open to suggestions.
    Don't get me wrong; I was just saying that if there is something not related to this topic, then talk about it somewhere else, not in this thread.
    Thanks again, and sorry if I put it badly :)
    Reply to Maziar
  32. No, not you. Someone else in this thread a couple of posts above this thought I was attacking your sticky. I was simply critiquing what you wrote. Seeing as you didn't feel attacked, I wonder if that means I don't have a vendetta... (don't worry, last time I'm going off topic in this thread.)

    You have a good idea Maziar. I hope you can make the changes and get this going. Feel free to PM if you need any help.
    Reply to 4745454b
  33. thanx mate :)
    Reply to Maziar
  34. Before I begin, I want to let you know that I understand SLI really shows its power at resolutions above 1680x1050. My question deals with AA and the other goodies that can be enabled in video settings. Even though I have an SLI system with two 8800GT 512MB cards and run at 1680x1050, would the SLI setup help with AA and such, or would it make little or no difference? In other words, is SLI's benefit specifically limited to resolution, or does it help with eye-candy goodies too?
    Reply to tvh
  35. Well, yes, SLI will help you use some eye-candy (anti-aliasing and the like), but in some games, such as Company of Heroes or Crysis, you won't see a very noticeable difference between high AA and 0 AA. As I said, SLI performs well at low resolutions too, but it shines at high resolutions; that's why I recommend SLI for someone who plays at 1920x1200 or higher. For 1680x1050, a single card is enough.
    Reply to Maziar
  36. Hi, I am new to this, so correct me if I am wrong, but I do not completely agree with your statement that CrossFire is for AMD/ATI, or have I misunderstood? The Asus Crosshair motherboard is AMD, and that is SLI-ready. If I have misunderstood, please correct me, as I want to learn more about this SLI and CrossFire business; that would be my next upgrade once I understand it correctly.
    Reply to DanRohland
  37. The first thing you need to remember is that AMD bought ATI. To be correct, if you want to refer to ATI cards, you should call them AMD cards. This can lead to confusion however, as you have discovered.

    CF is how you use two AMD cards together; SLI is how you use two Nvidia cards together. It is possible to use an AMD CPU but have SLI enabled. You can tell which is possible by looking at the chipset: if the chipset is from AMD or Intel, it supports CF; if the chipset is from Nvidia, it supports SLI. (Not all chipsets, just those that support SLI/CF.) I'm not sure if VIA, SiS, etc. support either of these. Hope that helps.
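
    If it helps, that rule of thumb boils down to something like this (a sketch only; the chipset vendor is not a guarantee, since only some chipsets from each vendor support multi-GPU at all):

        # Era-typical mapping from chipset vendor to multi-GPU mode.
        MULTI_GPU_BY_CHIPSET = {
            "AMD": "CrossFire",
            "Intel": "CrossFire",
            "Nvidia": "SLI",
        }

        def multi_gpu_mode(chipset_vendor):
            return MULTI_GPU_BY_CHIPSET.get(chipset_vendor, "unknown (VIA, SiS, ...)")

        print(multi_gpu_mode("Nvidia"))  # SLI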
    Reply to 4745454b
  38. Thanx 4745454b for helping me in this thread :)
    Reply to Maziar
  39. Edited
    Reply to Maziar
  40. Did you edit that post, or the first one? The first one still has errors in it. You still have the
    Quote:
    What about brands?
    Well, the brand doesn't matter. For example, you can use an XFX card with an EVGA card, or a SAPPHIRE card with a DIAMOND card; just make sure they have the same memory and same clocks.

    problem, and you are still missing that ending ). You can't say to make sure you get the same memory and clocks while just above that saying memory can be halved and clock rates reduced.
    Reply to 4745454b
  41. Yes, they need to have the same clocks, because as I have said, if they don't have the same clocks or memory, the faster one will reduce its clocks to match the slower one.

    The best option is to use 2 cards with exactly the same clocks; if you don't, you won't get full performance, and the faster one will operate at the speeds of the slower one.
    Reply to Maziar
  42. Quote:
    Yes, they need to have the same clocks, because as I have said, if they don't have the same clocks or memory, the faster one will reduce its clocks to match the slower one.


    If they can reduce the clocks, then they don't NEED to have the same clocks. It is the BEST option to have exact clocks, but only because you are then getting what you paid for. They don't need to match, but if you want to keep your 7900GS XXX running at the speed you bought it at, you need another 7900GS XXX.

    The reason I bring this up is that anyone reading this might be in that situation. If you bought a 7900GS XXX and want to move to SLI, what if you can't find another 7900GS XXX? If I were reading this FAQ, I would be happy to hear I could buy any 7900GS and SLI would still work.
    Reply to 4745454b
  43. Maziar - good work.....
    Reply to nukemaster
  44. Thanx, I got what you meant, mate. Edited again.

    Thanx again for helping me so much in this thread, 4745454b; I appreciate it.
    Reply to Maziar
  45. nukemaster said:
    Maziar - good work.....



    Thank you very much, mate :) When a guru like you talks this way, I get lots of confidence and will try even harder than before :)
    Reply to Maziar
  46. Maziar said:
    Thank you very much, mate :) When a guru like you talks this way, I get lots of confidence and will try even harder than before :)

    lol no guru here.....
    Reply to nukemaster
  47. Indeed you are (no joke); I have learned lots of things from you and the other guys here.
    Remember the PSU and GPU questions I had? You helped me a lot there, and I never forget when someone helps me with my problems (even PC problems) :)
    Reply to Maziar
  48. Looked at the first post again, looks good. There are a few things I would change here and there, but no "deal breakers". Looks nice.
    Reply to 4745454b
  49. thanx again for your help mate :)
    Reply to Maziar