
StarCraft II Revisited: How Much Gaming PC Do You Need?

When Gigabyte suggested that we review the performance of StarCraft II on an all-Gigabyte graphics card lineup, we were delighted. We wanted an excuse to revisit the game, even though we had performed a thorough performance analysis of the StarCraft II beta a few months ago.

While the game engine hasn't changed much between our beta review and the final release, we weren't especially satisfied with the benchmarking method we were forced to use at the time. The only consistent way to benchmark the beta was to play back a saved game, which amounts to watching a movie of gameplay that had already occurred. While this test did stress the graphics engine, it wasn't ideal for measuring real-world performance: in actual gameplay, the system has to calculate variables in real-time, and playing back a saved game with a predetermined outcome doesn't generate the same processing load.

The release of the full title allows us to create a more realistic simulation. The bundled StarCraft II Map Editor lets us build a map pre-populated with multiple simultaneous battles involving all three StarCraft races. Now that the computer has to perform all of the necessary AI calculations, instead of simply playing back a movie with a predetermined outcome, we can run a worst-case stress test of the game's ability to push PC hardware to its limit.
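
As a side note on how figures like these are derived: below is a minimal sketch of turning per-frame timing data into average and minimum FPS numbers. It assumes a FRAPS-style frametimes log (one cumulative millisecond timestamp per rendered frame); the file name and column layout are illustrative assumptions, not details from this article.

    # Minimal sketch: turn a per-frame timing log into FPS statistics.
    # Assumes a FRAPS-style frametimes CSV (one cumulative timestamp in
    # milliseconds per rendered frame); the file name and column layout
    # are illustrative assumptions.
    import csv

    def fps_stats(timestamps_ms):
        """Return (average FPS, worst-single-frame FPS) from cumulative timestamps."""
        # Per-frame durations: differences between consecutive timestamps.
        deltas = [b - a for a, b in zip(timestamps_ms, timestamps_ms[1:])]
        if not deltas:
            raise ValueError("need at least two frames")
        avg_fps = 1000.0 * len(deltas) / (timestamps_ms[-1] - timestamps_ms[0])
        min_fps = 1000.0 / max(deltas)  # the slowest single frame sets the minimum
        return avg_fps, min_fps

    with open("frametimes.csv", newline="") as f:
        rows = csv.reader(f)
        next(rows)  # skip the header row
        times = [float(row[1]) for row in rows]  # column 1: cumulative ms

    avg, worst = fps_stats(times)
    print("average: %.1f fps, worst single frame: %.1f fps" % (avg, worst))

The average tells you how a card fares over a whole battle, while the worst single frame is what players actually notice as a stutter, which is why both numbers matter in a stress test like this one.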

In addition, AMD released the Catalyst 10.7 beta driver that supports anti-aliasing in StarCraft II, so we can see how Radeon and GeForce cards compare with this graphical enhancement enabled. Of course, between then and now, AMD made its Catalyst 10.8 package available as well, wrapping in the improvements introduced in the hotfix driver.

With all of these considerations in mind, it's a good time to revisit StarCraft II, post-release. Let’s start by looking at the hardware we're using to benchmark this game.

Comments (8, thread closed)
  • darksai, 17 September 2010 15:57
    I think the requirements are a bit overstated. The test feels too artificial, as a worst case in an RTS is very different from a worst case in an FPS: in an FPS a smoke grenade could go off at almost any time, while super-large battles like this only happen a) very late in the game and b) with more than two players. Another factor, especially for CPU performance, is that the AI is more taxing than a real player, since the AI's APM is far higher than a human's, so those units have to be recalculated a lot more often. So if you only play multiplayer, particularly 1v1, the results would be very different. Also, even in larger games, big battles will typically have far fewer units because of the supply limit and the fact that higher-tech units cost more supply, resulting in far fewer calculations (compare how many Zerglings you could have for one Ultralisk, for example).
  • darksai, 17 September 2010 16:01
    This may be "worst case," but it's far from "real world," IMO.
  • Gonemad, 21 September 2010 19:53
    Spoiler

    The campaign offers a map with the perfect script for this sort of testing.

    It is the mission unlocked by the Zeratul crystal, where you control the Protoss race and must survive an onslaught of at least 1,500 Zerg/hybrid/Protoss units. Enable god mode on the player's side and let it loose.

    The game itself will eventually recommend lowering the graphical settings. The script seems to spawn an infinite number of enemies, since it is by definition a no-win scenario: you must eventually die. I don't know why you didn't use that; the script seems pretty predictable and repeatable to me. And since the player is in god mode, the onslaught will keep going until some stack overflow happens, or it will keep going forever. Otherwise, as units are destroyed, the AI speeds up and the frame rate should increase.
  • Anonymous, 25 September 2010 02:42
    I second Gonemad's idea. Also, my computer, a dual-processor Opteron 2376 at 2.3 GHz with 16 GB of 667 MHz DDR2 and a passively cooled 1 GB Nvidia 9600GT, runs the game without hiccups at 25 to 27 FPS (according to the game interface) on Ultra at 1920x1080 on all levels except two: the one mentioned above and All-In (the finale), where I get stuck at 19 to 21 FPS.

    So if you say the 240 and 260 give those bad figures, I wonder why people are going with those energy hogs while I'm doing good to great with my 75 W (peak) card.
  • sirkillalot, 27 September 2010 19:22
    I'm happy enough with my 5870 results.

    PS: what's StarCraft?
  • Phoenixlight, 9 October 2010 06:14
    It's a game.
  • sirkillalot, 9 October 2010 18:48
    I know, I was kidding.
  • Avro Arrow, 20 October 2010 20:50
    StarCraft II is a weird game, seeing as it seems to perform worse with multi-GPU setups. Thank god it hasn't affected me that way; gameplay has been smooth as silk. I'd actually like to see Rome: Total War used as a gaming benchmark, because with all the movement of the military units, a card will be hard-pressed not to lag. In addition, the sheer scope of those massive battles uses up a ton of video card RAM. I'd be interested to see benchmarks of that game, even if it is on the old side. :sol: