
Benchmark Results: Crysis

Nvidia's GeForce GTX 285: A Worthy Successor?

Even after all this time, Crysis continues to punish graphics cards with its high detail levels and minimal optimization. While most of the world has moved on to the more efficient Crysis Warhead, this older title's inclusion in our GTX 295 Quad-SLI evaluation brought it back for today's review.

While the test appeared to run smoothly on the GTX 285 at our lowest-tested Crysis settings, occasional stutters would get the player fragged in a real game. The only good option for a single-GPU card would be to reduce detail levels, although the dual-GPU GeForce GTX 295 could make 1920x1200 pixel game play possible for buyers with more discretionary income.

Players can forget about using very high details with AA and AF enabled in Crysis, as even the dual-GPU cards suffered enough stutter to cause an occasional surprise ending. The GTX 285 edged out the GTX 280, but with all cards producing unplayable frame rates, this win is purely academic.


  • waxdart, 29 January 2009 19:03
    Let me know when we can run Crysis in HD full detail on a mid range card that doesn't need its own powerplant. Till then, I'm keeping my money.
  • Stupido, 29 January 2009 19:26
    Good point... ;) 
  • s3k3r, 29 January 2009 21:06
    Hm... they should try to OC it and see how much more you can get than the GTX 280. Otherwise you don't want to spend fifty dollars more for 2-8 FPS more.
  • Anonymous, 30 January 2009 00:30
    Why does this article seem to think that the 9 series is better than the GTX?
    http://hubpages.com/hub/Geforce-9-Series
  • wild9, 30 January 2009 00:30
    They should offer degree courses in computer graphics card revisions. Add another one to the list of cards that are trying to fit into a niche rather than invent one.

    I'd also expect much better power savings from the 285, given the smaller 55nm die (yep, even with overclocking).
  • Anonymous, 30 January 2009 01:47
    "Noticeable improvements make the GTX 285 a good solution for new systems, but its value as an upgrade part is purely dependent on the inadequacy of the part it will replace."

    What kind of convoluted way of speaking is this?
  • goozaymunanos, 30 January 2009 08:27
    Still got my eye on a 295... I think that's the way to go.

    check it out:
    http://www.tomshardware.co.uk/geforce-gtx-295,review-31470.html


    cheers,
    bill

    p.s. stuff and nonsense: http://www.eupeople.net/forum
  • Bitty, 30 January 2009 10:23
    Why not run them at the same speeds? Try the 280 at 648MHz and leave the 285 at its stock 648... Most 280s will easily do 648. I have just bought a 280 for £241 and it's happily doing 702. Why buy a 260 at that price for a 280? Why spend £300 on a 285 for a few fps that will not be noticed anyway? I hope, though, they keep the 280 around for a time so I can buy another for SLI at hopefully less. They won't, though, since it's too close to the 285. The 295, though, is a different beastie and the real upgrade for 280ers, but it's steep.
  • Anonymous, 2 February 2009 02:49
    I agree with TRibal GFX. Get to the point, man.
    What's the bottom line? I ain't interested in a 5% increase.
    I'm interested in new hardware giving me a 30% increase in performance.
  • daglesj, 2 February 2009 17:50
    Looks like a waste of manufacturing effort to be honest.

    Talk about milking it for all it's worth.
  • AGTDenton, 4 February 2009 17:35
    This seems to be a waste of time to the consumer. A power saving of 5 watts at idle is unlikely to sway any punters.
    They could have at least added GDDR5. And with ATi planning a 40nm die for this year, nVidia should have skipped 55nm.
  • hiphipphippo, 5 February 2009 20:47
    Yeah, I'm surprised they bothered with 55nm. 40nm/45nm is pretty well mainstream now. You can get nearly twice the number of transistors on a 40nm design compared to a 65nm design, and power consumption per transistor at the same clock speed is roughly halved as well.
    So at 40nm, a GPU design twice as complex as the 280 and clocked at the same speed should consume about the same power.
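The arithmetic behind this comment can be sketched out. A back-of-envelope Python sketch, assuming ideal quadratic area scaling between the 65nm and 40nm nodes (the round "twice"/"halved" figures are the comment's own approximations):

```python
# Ideal geometric scaling: transistor area shrinks with the square of
# the feature size, so density grows with the inverse square.
old_nm, new_nm = 65, 40

density_gain = (old_nm / new_nm) ** 2    # transistors per mm^2: ~2.64x
power_per_transistor = 1 / density_gain  # fixed power density, fixed clock:
                                         # each transistor draws ~0.38x

# The comment rounds these to "nearly twice" and "roughly halved":
# a design with 2x the transistors at the same clock then draws about
# 2 * 0.5 = 1.0x the power, i.e. "about the same".
relative_power = 2 * 0.5

print(f"ideal density gain 65nm -> 40nm: {density_gain:.2f}x")
print(f"ideal power per transistor: {power_per_transistor:.2f}x")
print(f"2x-transistor design (rounded figures): {relative_power:.1f}x power")
```

Real processes never hit the ideal numbers, so the comment's rounded-down figures are the more realistic estimate.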
  • king_scruff, 25 February 2009 23:39
    I have this set up:

    Intel Core i7 920 2.66Ghz (Nehalem)
    Akasa AK-967 Nero Direct Contact Heatpipe CPU Cooler
    Asus Rampage II Extreme Intel X58
    Corsair 6GB DDR3 1600MHz Triple Channel DDR3
    Asus GeForce GTX 285 1024MB GDDR3

    Why am I not getting similar results?
  • Mjaffk, 28 February 2009 20:43
    Maybe your i7 is at 2.66GHz? The i7 in the test is at 4.00GHz.
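That clock gap can be put in numbers. A quick sketch (the 2.66GHz and 4.00GHz figures are those quoted in the comments; the proportional-scaling assumption only holds for CPU-bound tests):

```python
# Rough clock-ratio arithmetic for the Core i7 920 comparison above.
stock_ghz = 2.66   # i7 920 stock clock
review_ghz = 4.00  # overclocked clock used in the review

ratio = review_ghz / stock_ghz
print(f"review CPU runs at {ratio:.2f}x the stock clock")

# In a CPU-bound benchmark, results could differ by up to that factor;
# GPU-bound results would move far less.
```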
  • mik52, 7 March 2009 19:37
    i7 2.66GHz with no overclock
    Asus P6T
    Corsair Dominator 6GB 1866MHz
    Asus GTX 285
    Velociraptor 300GB

    3d mark vantage score: P14580
    Cpu Score : 45130
    Gpu Score : 11896

    In this review the CPU is overclocked, but my results are almost double. How can this be possible?