
BadaBOOM Media Converter, Folding@Home

Nvidia GeForce GTX 260/280 Review
By Florian Charpentier

BadaBOOM Media Converter is a video transcoder developed by Elemental Technologies that converts video DVDs (MPEG-2 only) to H.264 for the major PMPs, essentially the iPhone, iPod and PSP (via predefined profiles only). Optimized for CUDA (via Elemental's RapiHD video platform), it makes good use of the power of compatible GeForces (all GeForce 8 and 9 cards) to accelerate this very computation-intensive task, a capability AMD had incidentally introduced earlier with its Avivo converter. Elemental's transcoder, however, is less buggy and compresses faster.


With the preview version, which is unfortunately compatible only with the GT200, we were able to compress our 400 MB test video to iPhone format (640*365) at maximum quality in 56.5 seconds on the GTX 260 and 49 seconds on the GTX 280 (15% faster). For comparison, the iTunes H.264 encoder took eight minutes using the CPU (consuming more power overall, though with significantly lower peaks). Remember, though, that iTunes is far from the most optimized H.264 compressor, and that BadaBOOM's lack of flexibility is painfully obvious even if its results are good.
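
To put the quoted encode times in perspective, the relative speedups can be worked out directly. This is just illustrative arithmetic over the figures reported above, not benchmark code:

```python
# Encode times for the 400 MB test video, as reported in the article.
cpu_seconds = 8 * 60      # iTunes H.264 encode on the CPU: eight minutes
gtx260_seconds = 56.5     # BadaBOOM on the GTX 260
gtx280_seconds = 49.0     # BadaBOOM on the GTX 280

print(f"GTX 260 speedup vs CPU: {cpu_seconds / gtx260_seconds:.1f}x")
print(f"GTX 280 speedup vs CPU: {cpu_seconds / gtx280_seconds:.1f}x")
print(f"GTX 280 vs GTX 260: {gtx260_seconds / gtx280_seconds - 1:.0%} faster")
```

The last line reproduces the 15% gap between the two cards cited above; against the CPU encode, both GPUs land in roughly the 8-10x range.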

Folding@Home

We also had access to a pre-beta Folding@Home client that uses CUDA, whose final version should be available a few days from now; unfortunately, it too would run only on the GT200.


Here again, Nvidia is over a year behind ATI, whose Radeons have long been part of the project, but the GT200 (the only GPU we were able to test) still showed superior performance. We measured our test configuration at 560 ns/day with the GTX 280 and 480 ns/day with the GTX 260. For comparison, PS3s generally score in the neighborhood of 150-200 ns/day, compared with less than 10 for a CPU and 200 for a simple Radeon HD 3870.

What needs to be understood, though, is that performance can vary widely depending on how well the client is optimized for a given architecture (code optimization was far from complete for both the ATI and Nvidia clients). Mike H believes the same HD 3870 should ultimately reach 300 ns/day, or 250 at the least. Another issue is that performance also changes when the protein changes, which is necessary in the case of the GeForce client. In short, for the moment we have to stress the provisional and somewhat random nature of the results above. What is certain is that the arrival of a client supporting CUDA-compatible GeForces (everything from the GeForce 8 on, including entry-level cards) is an opportunity for the project, since that installed base represents approximately 7,000 TFlops.
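
The ns/day figures scattered through the last two paragraphs are easier to compare side by side. The values below are the ones quoted in the article (the PS3 entry uses the midpoint of the 150-200 range); the script simply tabulates them against the CPU baseline:

```python
# Folding@Home throughput figures quoted in the article (ns/day).
throughput = {
    "GTX 280": 560,
    "GTX 260": 480,
    "Radeon HD 3870": 200,
    "PS3 (typical)": 175,   # midpoint of the 150-200 ns/day range cited
    "CPU": 10,              # "less than 10" in the article
}

baseline = throughput["CPU"]
for device, ns_per_day in sorted(throughput.items(), key=lambda kv: -kv[1]):
    print(f"{device:16s} {ns_per_day:4d} ns/day  ({ns_per_day / baseline:.0f}x CPU)")
```

Even taking the caveats about client optimization into account, the GT200 cards sit well clear of every other platform in this snapshot.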


This thread is closed for comments
  • samuraiblade, 16 June 2008 21:40
    hmm not as big an improvement as i thought. will have to wait and see on the drivers improving the cards , but the 260 gtx seems to be the much better option given the price. still , will have to see what ati bring to the fray first. patience will be reflected in price i have no doubt.
  • spuddyt, 16 June 2008 22:45
    frankly depressing, Me WANTS MRAW POWER!!!!
  • JDocs, 17 June 2008 15:46
    I am so disappointed. Now if AMD delivers on the dual GPU single memory rumour (2 GPUs on a single card but without the Crossfire problems) NVidia could have a serious problem.
  • mi1ez, 17 June 2008 15:49
    Why have they tested this system with only 2Gb of RAM? If you're testing a GPU with 1Gb of VRAM, surely you'd have more installed?
  • mi1ez, 17 June 2008 16:27
    They also have 2 conflicting prices on page 28.
    For the 280GTX- $846 and $650;
    For the 260GTX- $450 and $400
  • darthpoik, 17 June 2008 20:06
    Wouldn't it have been more prudent to test against a 8800gtx ultra as this is still the single most powerfull card.
  • david__t, 17 June 2008 20:10
    It might just be me but 66.5dBa is unbearable unless you have your PC locked away in a cupboard somewhere. This business of supplying substandard fans on very expensive cards is intolerable. Why don't they strike a deal with Zalman / Thermalright for example, and ship cards that are quiet / silent? I'm sure that people who have the money to buy a £500 GPU could afford £10 more for a better cooling solution that's included.
  • Anonymous, 17 June 2008 22:26
    where is that 20W to 30W idle you are talking about? The least in the graph is 199W!
  • Solitaire, 18 June 2008 00:46
    mi1ez: Probably the reason for just 2GB RAM was that it allowed Tom's to stick with 32-bit OS architecture. If they tried using more RAM they'd be stuck with 64-bit Bindows which would not be pretty - aside from really needing 8GB to give a big difference over 2GB in 32bit Vista, there's the slight issue of stable signed drivers, which these cards probably won't have for a while. Good luck trying to get Vista 64 to even "see" the cards! XD

    jhoravi: that idle power would only come up on newer nVidia mobos as the card would be shut down entirely when idle and hand over to the integrated chip.

    And was it me or was the Noise text copypasted over the Temperature text on the next page? Oops.
  • bobwya, 19 June 2008 07:43
    Lets try again Mr THG (uhhhm try getting your fraking website working plz)...

    Now lets see this puppy in action:
    http://www.evga.com/products/pdf/01G-P3-1289-AR.pdf

    !!

    Bob
