
Benchmark Results: Left 4 Dead

GeForce GTX 285 Gets 2 GB: Gigabyte's GV-N285OC-2GI

Left 4 Dead is a throwback to the grainy, gritty worlds of earlier games, but with somewhat smoother edges and slightly finer detail levels. Its popularity, rather than its system load, puts the game in today’s benchmark suite.

Even with a Core i7 at 3.83 GHz, Left 4 Dead appears CPU-limited at all resolutions for the GeForce GTX 295, while single-GPU cards start to run out of power when pushed beyond 1920x1200.
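
A quick way to see what "CPU-limited" means in the data: if the average frame rate barely falls as the pixel count rises, the processor, not the graphics card, is setting the pace. Below is a minimal Python sketch of that heuristic; the function name, the 5% threshold, and the sample numbers are our own illustration, not figures from the benchmark suite.

```python
# Hypothetical helper (not part of any benchmark suite): flags a resolution
# step as CPU-bound when raising the pixel count barely lowers the frame rate.
def classify_bottleneck(fps_by_resolution, tolerance=0.05):
    """fps_by_resolution: list of (pixel_count, avg_fps), sorted by pixel count."""
    results = []
    for (_, fps_a), (pixels_b, fps_b) in zip(fps_by_resolution, fps_by_resolution[1:]):
        drop = (fps_a - fps_b) / fps_a  # relative FPS loss at the higher resolution
        results.append((pixels_b, "CPU-bound" if drop < tolerance else "GPU-bound"))
    return results

# Illustrative (made-up) curves: a flat one, like the GTX 295 here,
# and one that collapses past 1920x1200, like a single-GPU card.
print(classify_bottleneck([(1280*1024, 120), (1680*1050, 118), (1920*1200, 117), (2560*1600, 114)]))
print(classify_bottleneck([(1280*1024, 110), (1680*1050, 95), (1920*1200, 78), (2560*1600, 49)]))
```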

Adding 4x anti-aliasing and 8x anisotropic filtering allows us to see the beginning of the curve, where the GeForce GTX 295 finally starts to reach its limit at 2560x1600. Neither of the single-GPU cards can play the game with complete smoothness at the highest resolution.

The 2 GB GeForce GTX 285 doesn't demonstrate any benefit from the added RAM in Left 4 Dead, not even at 2560x1600 with 8x AA plus 16x AF.
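
Some back-of-the-envelope arithmetic suggests why the extra gigabyte goes unused: render targets alone, even multisampled, don't come close to filling 1 GB. The sketch below assumes 4 bytes per color sample and 4 bytes per depth/stencil sample with a deliberately simplified buffer list; real usage is dominated by textures, which don't scale with AA.

```python
# Rough framebuffer footprint at a given resolution and MSAA level.
# Simplified model: multisampled color (double-buffered), multisampled
# depth/stencil, plus one resolved output buffer; textures are ignored.
def framebuffer_mb(width, height, msaa_samples, back_buffers=2):
    pixels = width * height
    color = pixels * 4 * msaa_samples * back_buffers   # multisampled color
    depth = pixels * 4 * msaa_samples                  # multisampled depth/stencil
    resolve = pixels * 4                               # resolved output buffer
    return (color + depth + resolve) / (1024 ** 2)

print(f"{framebuffer_mb(2560, 1600, 8):.0f} MB")  # ~391 MB of render targets
```

Even at 2560x1600 with 8x AA, the render targets in this model stay under 400 MB, leaving most of a 1 GB card free for textures, which is consistent with the 2 GB board showing no advantage here.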

Comments
  • piphil, 5 August 2009 15:36
    And here I am still running games at 1280 x 1024. I must be behind the curve...
  • chriscornell, 5 August 2009 16:00
    I run most games at 1920x1080, usually with 8x AA and 8x AF if available. The only game that eats up all of my GTX 260's 896 MB of memory is GTA 4 - because GTA 4 is the worst console-to-PC port ever.
  • Anonymous, 5 August 2009 18:29
    "You won’t find a higher-end graphics processor than Nvidia's GeForce GTX 285"

    Except for the Nvidia GeForce GTX 295, of course...
  • roobubba, 5 August 2009 18:39
    Quote (Anonymous):
    "You won’t find a higher-end graphics processor than Nvidia's GeForce GTX 285"

    Except for the Nvidia GeForce GTX 295, of course...


    Just to clarify: clearly the 295 isn't a 'single GPU', but especially now that the 295 has been modified so both GPUs sit on the same board, it's a rather misleading statement! Yes, I might be nit-picking, but small details like this detract from the article.

    Roo
  • darkproject, 5 August 2009 18:41
    I am running games at 1680x1050.

    If you want a good graphics card, get the GeForce GTX 260 - it's cheap and it's good. But at the high end you should get the GeForce GTX 285 or GTX 295.
  • roobubba, 5 August 2009 18:47
    Quote:
    Gigabyte produced a great product, but the overall value question appears to concern bragging rights. How much would you pay to be able to tell your friends you can play the latest titles with everything (including resolution) maxed? For those who'd answer "a lot," Gigabyte's GV-N285OC-2GI is a solid path to the best performance possible.


    Argh, no, I'm sorry, I can't let this slide. You've just put a whole load of effort into generating some very reproducible results, which clearly show that there is no benefit whatsoever, barring two marginal cases at very specific settings, to 2 GB on this board instead of 1 GB. The extra money would be far better spent on a GTX 295 or two 275s in SLI, for example. Or, dare I say it, two 4890s in CrossFire... Your own data show that this board is a waste of money, so please don't conclude that it's a 'solid path to the best performance possible'!! That's simply not true!

    Roo
  • madogre, 5 August 2009 19:40
    Can we get SLI 2 GB vs 1 GB benchmarks? No one seems to have any results for the 1.8 GB 260s and 2 GB 285s in SLI vs the normal cards.
    I think this is where we should see the extra frame buffer show its usefulness.
  • Reynod, 5 August 2009 21:39
    Yes ... perhaps that is an interesting idea well worth checking.

    2 in SLI vs a 295.
  • LePhuronn, 5 August 2009 21:42
    And why oh why have Gigabyte swapped a DVI port for VGA? If I'm shelling out this much for a high-end card then I'm likely to be running purely digital interfaces - surely it would have made more sense to stick with two DVI ports and ship an adaptor for the stupid fools who pay this much for a GPU and then use analogue displays with it.
  • LePhuronn, 5 August 2009 21:46
    Quote (Anonymous):
    "You won’t find a higher-end graphics processor than Nvidia's GeForce GTX 285"

    Except for the Nvidia GeForce GTX 295, of course...


    The GTX 295 is a pair of 275s on a single board, so the original statement is correct. The GTX 295 is the top-end graphics CARD, the GTX 285 is the top end graphics PROCESSOR.
  • wild9, 6 August 2009 01:20
    Wouldn't mind seeing this card against a fast Radeon card with GDDR5 memory.
  • Anonymous, 6 August 2009 17:05
    Why have they included VGA? Does anyone still sit with twin monitors, one DVI and one VGA? If you're spending this much on a graphics card you expect the best, i.e. DVI... they could have stuck to the adapter system for people who still wish to use blurry, poor-quality VGA screens. What's next, Gigabyte? DDR1 RAM back from the dead?

    And wild9 - is there such a thing as a fast Radeon? I know they come with the new memory... but if that was so special, why haven't the masters NVIDIA adopted it anywhere?
  • Anonymous, 6 August 2009 17:07
    anyway... DX11 at the end of the year... don't buy any cards yet :) 
  • wikkus, 6 August 2009 17:08
    Quote (roobubba):
    so please don't conclude that it's a 'solid path to the best performance possible'!! That's simply not true!


    I think you may have misinterpreted the summary; I read the concluding paragraph to mean "if you've more money than sense, you're going to love this".

    Re-read it and see what I mean -- the reviewer states: "but the overall value question appears to concern bragging rights"

    ... and then: "How much would you pay to be able to tell your friends you can play the latest titles with everything (including resolution) maxed?"

    ...followed by: "For those who'd answer "a lot," Gigabyte's GV-N285OC-2GI is a solid path to the best performance possible.".

    In other words, if having the latest doo-dad is more important to you than usable, tested performance, then knock yourself out. I meet these people a lot in this industry and usually, when asked the question "But what's the advantage?", the answer is generally, "Well, duh! It's more, isn't it?". At this juncture, I usually count silently to ten whilst envisioning them being flailed alive with sharpened DVI cables.

    If you're still unclear on what's being alluded to, look up "But this one goes to 11". If you fail to get it after that, you are one of the people referred to and can PM me for a flailing session.

    R.
  • unknownsock, 6 August 2009 17:19
    Quote (mattbdj):
    ...but if that was so special why haven't the masters NVIDIA adopted it anywhere?


    Nvidia Masters? yeah right, overpriced hardware yer.
  • Anonymous, 6 August 2009 17:20
    Quote (wikkus):
    In other words, if having the latest doo-dad is more important to you than usable, tested performance, then knock yourself out.

    Unfortunately mate, DX9 is the only real tested, proven performance. DX10 is a mess with Vista, and I'm expecting the same with 11.

    Tested or not though, 10 looks far superior to 9, so I'll live with a few minor bugs for the improvement in graphics :)
  • Anonymous, 6 August 2009 17:22
    Quote (unknownsock):
    Nvidia Masters? yeah right, overpriced hardware yer.

    They're not that overpriced - I'd start pointing to my business where I sell them, but I'd probably get banned from the forum.

    No more than the Radeons... and fewer heat problems. And the ATI Catalyst software is awful.
  • wikkus, 6 August 2009 18:21
    Quote (wikkus):
    In other words, if having the latest doo-dad is more important to you than usable, tested performance, then knock yourself out.

    Quote (mattbdj):
    unfortunately mate

    I am not your mate: I am not (thankfully) partnered with you, nor am I your buddy...

    Quote (mattbdj):
    DX9 is the only real tested, proven performance. DX10 is a mess with Vista, and I'm expecting the same with 11. Tested or not though, 10 looks far superior to 9, so I'll live with a few minor bugs for the improvement in graphics.
  • unknownsock, 6 August 2009 18:25
    Quote (mattbdj):
    They're not that overpriced - I'd start pointing to my business where I sell them, but I'd probably get banned from the forum. No more than the Radeons... and fewer heat problems. And the ATI Catalyst software is awful.

    Fact is, Nvidia is way more expensive. Look at the 4870 X2 and the GTX 295: yes, the GTX 295 is better, but for another £100+? No. Generally they're at least 20% more than the equivalent ATI cards.

    And who cares if the CCC is crap? Use third-party software.

    It's a load of fanboy bull.
  • Fox Montage, 6 August 2009 19:32
    Who uses VGA ports / CRTs anymore? Hardcore gamers who want refresh rates of 75Hz+ and zero ghosting / input lag, the very people who would probably spend money on a high-end card to guarantee a high minimum FPS.