
Benchmark Results: SPEC

AMD FirePro V9800 4 GB: Eyefinity Meets Professional Graphics
SPECviewperf 11    AMD FirePro V9800    Nvidia Quadro 5000
catia-03           15.00                44.57
ensight-04         24.31                38.13
lightwave-01       49.86                65.53
maya-03            64.22                67.32
proe-05            4.85                 10.83
sw-02              38.88                61.27
tcvis-02           23.43                38.86
snx-01             26.49                39.31


A heads-up battle in SPECviewperf 11 looks pretty bad for the FirePro card, even with AMD’s latest drivers. Granted, things look a lot better for AMD than they did in our FirePro V8800 review. But Nvidia still walks away with a victory in every discipline tested.

SPECapc LightWave 9.6

The interactive and render tests run sequentially, each generating its own score. The MT (multi-tasking) test runs the interactive and rendering workloads concurrently.
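The MT scheme described above can be sketched as follows. This is a toy illustration of running two workloads concurrently and timing them together; the workload functions are placeholders, not SPEC's actual harness:

```python
import threading
import time


def interactive_workload():
    # Placeholder for the interactive (viewport manipulation) workload
    time.sleep(0.1)


def render_workload():
    # Placeholder for the CPU rendering workload
    time.sleep(0.1)


def run_mt_test():
    # Launch both workloads at the same time and measure total wall time
    start = time.perf_counter()
    threads = [
        threading.Thread(target=interactive_workload),
        threading.Thread(target=render_workload),
    ]
    for t in threads:
        t.start()
    for t in threads:
        t.join()
    return time.perf_counter() - start
```

Because both workloads contend for the machine at once, the MT result typically trails the sequential scores on the same hardware.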

Nvidia’s Quadro 5000 takes a small lead in the render and multi-tasking test, but the interactive benchmark favors AMD’s solution—something we saw in Uwe’s FirePro V8800 review as well.

SPECapc 3ds Max 9

SPECapc 3ds Max 9                              AMD FirePro V9800    Nvidia Quadro 5000
Wireframe Graphics                             2.09                 2.04
Mixed Wire/Shade GFX                           2.89                 2.98
Shaded Graphics                                4.33                 4.53
Hardware Shaders                               10.11                10.36
Graphics, Texturing, Lighting, and Blending    2.42                 3.74
Inverse Kinematics                             2.96                 3.12
Object Creation, Editing, and Manipulation     4.00                 4.03
Scene Creation Manipulation                    4.52                 3.75
Rendering                                      14.76                14.92


The CPU render and hardware shader tests are close, even if each favors Nvidia. The gap in the graphics test is much more pronounced, again going in favor of Nvidia’s Quadro 5000 card.

SPECapc Maya 2009

Using the latest drivers, AMD’s FirePro V9800 was unable to finish the entire SPECapc Maya 2009 suite, returning zeros for the latter part of the run; as a result, the CPU and I/O portions failed to produce scores. Thus, all we have here is a graphics score. The good news is that AMD’s score is much higher than Nvidia’s. The bad news is that there’s no overall composite score to compare, since the FirePro doesn’t complete the test.

Now, the challenge with any of the SPEC tests is that they’re based on old versions of applications that get updated every single year. None of the latest trends in software development get taken advantage of, potentially leaving performance on the table. Unfortunately, the member organizations that make up SPEC seem to move slowly (gasp—bureaucracies are horribly inefficient?), so the snapshots of workstation graphics come with a time delay.

We were marginally successful getting a handful of other workstation-class apps running, but were amazed to find how many of these tasks are CPU-limited…
