Nvidia GeForce GTX 260/280 Review

Over a year and a half – that’s how long the GeForce 8800 GTX held the position of Nvidia’s high-end GPU. Of course, six months after its release – and, by pure coincidence, just before the arrival of the R600 – we did get an 8800 Ultra with slightly higher clock speeds, but it was nothing revolutionary. Then, two and a half months ago, the arrival of the 9800 GTX raised hopes of a substantial performance increase, but in the end the card offered only limited gains over the good old GTX and actually trailed the Ultra. For owners of these cards to be really happy with their investment, it was high time for Nvidia to offer more than a few extra megahertz, or yet another pairing of two GPUs on a single card.


Finally, Nvidia has heard our pleas: the GTX 280 is the first real reworking of the G8x architecture. By now we know the company’s modus operandi: introduce a new architecture on a proven process node. Due to the very high transistor count, the chip is expensive to produce and the cards that use it remain expensive, but it stakes out the territory. Then, over the ensuing years, Nvidia extends the architecture across every market segment using a finer process, one that is initially less well tuned for high clock speeds. Finally, when the new process is under control, Nvidia moves it into the high end, which then becomes much more affordable. We saw it with the G70/G71 and the G80/G92, and now history repeats itself with the GT200 – a true monster of 1.4 billion transistors fabricated at 65 nm.

Comments from the forums
  • samuraiblade
    hmm not as big an improvement as i thought. will have to wait and see on the drivers improving the cards , but the 260 gtx seems to be the much better option given the price. still , will have to see what ati bring to the fray first. patience will be reflected in price i have no doubt.
  • spuddyt
    frankly depressing, Me WANTS MRAW POWER!!!!
  • JDocs
    I am so disappointed. Now if AMD delivers on the dual GPU single memory rumour (2 GPUs on a single card but without the Crossfire problems) NVidia could have a serious problem.
  • mi1ez
    Why have they tested this system with only 2Gb of RAM? If you're testing a GPU with 1Gb of VRAM, surely you'd have more installed?
  • mi1ez
    They also have 2 conflicting prices on page 28.
    For the 280GTX- $846 and $650;
    For the 260GTX- $450 and $400
  • darthpoik
    Wouldn't it have been more prudent to test against a 8800gtx ultra as this is still the single most powerfull card.
  • david__t
    It might just be me but 66.5dBa is unbearable unless you have your PC locked away in a cupboard somewhere. This business of supplying substandard fans on very expensive cards is intolerable. Why don't they strike a deal with Zalman / Thermalright for example, and ship cards that are quiet / silent? I'm sure that people who have the money to buy a £500 GPU could afford £10 more for a better cooling solution that's included.
  • jhoravi
    where is that 20W to 30W idle you are talking about? The least in the graph is 199W!
  • Solitaire
    mi1ez: Probably the reason for just 2GB RAM was that it allowed Tom's to stick with 32-bit OS architecture. If they tried using more RAM they'd be stuck with 64-bit Bindows which would not be pretty - aside from really needing 8GB to give a big difference over 2GB in 32bit Vista, there's the slight issue of stable signed drivers, which these cards probably won't have for a while. Good luck trying to get Vista 64 to even "see" the cards! XD

    jhoravi: that idle power would only come up on newer nVidia mobos as the card would be shut down entirely when idle and hand over to the integrated chip.

    And was it me or was the Noise text copypasted over the Temperature text on the next page? Oops.
  • bobwya
    Lets try again Mr THG (uhhhm try getting your fraking website working plz)...

    Now lets see this puppy in action: