
Larrabee Versus Itself?

Larrabee: Intel's New GPU
By Fedy Abi-Chahla

At first glance, Larrabee seems to come out on the winning end of comparisons with its direct competitors. More parallel than the Cell and more flexible than a GPU, it seems to have everything going for it. But let's keep our excitement in check: a product that exists only on paper necessarily has all qualities and no defects. For a long time, the Itanium seemed to be the future of processors before one of its glaring flaws became painfully obvious: while it's not easy to reorder a program's instructions dynamically on the processor, it's just as hard to do it statically in the compiler.
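
To make the Itanium comparison concrete, here is a minimal C sketch (our illustration, not from the article) of one reason static scheduling is hard: without runtime information, a compiler often cannot prove that two pointers never alias, so it must schedule conservatively, while an out-of-order core disambiguates the same memory accesses dynamically.

```c
/* Minimal sketch: why static instruction scheduling is hard.
 * The function and scenario are hypothetical examples. */
void scale(float *dst, float *src, float k, int n)
{
    for (int i = 0; i < n; i++) {
        /* If dst and src might overlap, the store to dst[i] could
         * change a value that src[i+1] will load. Unable to prove
         * otherwise, a static scheduler (Itanium-style) cannot hoist
         * later loads above this store or software-pipeline the loop.
         * An out-of-order x86 core checks the actual addresses at
         * runtime and reorders whenever they don't collide. */
        dst[i] = src[i] * k;
    }
}
```

In C99 the programmer can hand the compiler the missing guarantee by declaring the pointers `restrict`; the point is that the burden of proof falls on software, whereas dynamic scheduling gets the answer for free at execution time.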

So it’s not a good idea to blindly accept each announcement of a new technology. In certain forums, for each question concerning rendering or a particular algorithm, somebody will eventually mention Larrabee as the ideal solution, which is ridiculous for the time being. Larrabee won’t suddenly solve all problems of real-time 3D, though it should make some good progress possible.

Larrabee undeniably benefits from positive hype, and Intel needs to take care that it doesn't turn into blowback if the first generation of products doesn't live up to expectations (the Merced syndrome).

In a few days, we’ll be giving you a look at Larrabee’s thorny software side, so stay tuned.

  • horendus, 23 March 2009 14:58
    When push comes to shove, who thinks it will outperform my 8800GT?

  • caskachan, 23 March 2009 17:49
    Hrm... first time I've ever done TL;DR on an article. Since it's fucking Intel I DON'T believe it will perform, but I hope it DOES and smacks me behind the head with super-fast performance.

    (BTW great article, I'm sure you explained why it might or might not kick ass.)

    Also, I own an Intel CPU, but I DO NOT believe Intel will get it right (at first).
  • Anonymous, 23 March 2009 19:30
    An earlier article mentioned this is initially aimed at replacing the onboard graphics parts Intel uses, i.e. the GMA950, etc. So don't expect it to come anywhere near an 8800GT, at least not initially, and perhaps not ever, as that may not be the point of it. Perhaps Intel are just thinking of a super-cheap onboard graphics part that can be updated to match any future DX/OGL spec or video codec with just a patch, as well as being available to accelerate specialist apps that need number crunching.
  • americanbrian, 23 March 2009 21:04
    fezztah,

    If you read the article, it is more expensive to produce than the Cell. So really, I don't think that hoping it outpaces the now-old 8800GT is a lot to expect.

    Also, it mentions that this is Intel's stab at DISCRETE graphics, meaning not on a motherboard.
  • wifiwolf, 23 March 2009 21:21
    "start-ups that offered first-rate 3D performance (3Dfx, Nvidia, and PowerVR) and heavy hitters who seemed to think that 3D acceleration was just a gadget (Matrox, S3, and ATI before AMD purchased it"
    I have to disagree on this as Matrox, before going business only, was high-end.
  • Belinda, 23 March 2009 23:09
    Can't say I have much faith in Intel producing a good graphics solution.

    Maybe this will just end up as an add-in card for applications that require parallel processing. Will they be able to interconnect like SLI?

    OK, the add-in card is just an idea, but surely from what I read it could be an ideal solution for some applications. Wouldn't it work well for decryption, like the Nvidia cards have been used for?

  • core i7 ownage, 23 March 2009 23:33
    If it's PCI-E then I might buy it. Or if my new mobo works with it.
  • Helloworld_98, 24 March 2009 01:55
    @Belinda, I doubt the dual-GPU idea will get to Intel.

    @core i7 ownage, it'll probably be PCI-E, but with an x86 base (quickly reminding everyone how close the 1 GHz Snapdragon comes to the Atom) it won't perform great, unless it's basically an i7 CPU on a card, because then you'd have Tesla-league power.
  • wild9, 24 March 2009 23:44
    Got mixed feelings about Intel's hybrid solution on account of its x86-based cores:

    I don't know if Intel is capable of pulling this off, since there's a lot more to it than simply having a better product on paper; Betamax was better than VHS, but look what happened in the end. It's one of several solutions, some of which are already in mass production.

    I see nothing stopping, say, AMD or Nvidia taking this design and adding their own tweaks, in the meantime using their discrete technology to accelerate applications that are in dire need of a speed hike. Ask anyone who uses video transcoding software: do you care whether it's 15x or 50x faster, whether it's AMD, Intel, or Nvidia, as long as it's available now and at the right price?

    Keeping all the options open was a good idea on the part of AMD, since competing directly with the fastest Intel Core i7 architecture isn't going to be the future of desktop computing.

    Thanks for this interesting article, Abi-Chahla. There is a lot of information to take in (some of which is beyond my understanding), but it helps put things in perspective.
  • plasmastorm, 25 March 2009 08:43
    Is it just me or is that an AGP card in the picture?
    I Lol'd.

    If Intel sticks with it and does produce a quality piece of kit down the line, maybe Nvidia and AMD will take note.
    Competition = lower prices; that's the most I can hope for from Intel with this card, to be honest.
    Like most people who play games, I doubt this is going to be worth looking at when it's launched, even compared to other cards in the same price range.
  • digriz69, 25 March 2009 22:52
    In my opinion it's not really the chipset, GPU, or CPU in this day and age; they're all capable of much, much more than current programming practice gives them credit for. Give any instruction set a first glance, and give the actual use of instruction sets a glance, and the lists parallel themselves.
    The fact is, they're not even using a grain of sand of the capabilities of the current platforms in any direction you go, but to think that an instruction set as used and abused as x86 can deliver anything new to the programming angle of the equation... I just don't see it.
    Flexible, yes; manageable, definitely; a solution, no. The GPU, in my opinion, is a solution within itself. It is capable of so much more than it is currently used for.
  • simonmw3, 27 March 2009 16:19
    Why no mention of AMD+ATI Fusion?

    AMD is leveraging its acquisition of ATI to combine CPU and GPU on one chip. This could give it advantages over an Intel CPU plus any add-on graphics card. Who knows how any of this will work out, but my guess is that Intel has realised it cannot afford to be left out in the cold if the AMD Fusion chips prove very successful. Therefore, Intel is getting into the graphics market with a very CPU-ish graphics chip to cover itself.

    In any case, competition can only be good for the consumer.

    P.S. I thought what happened a decade ago with Intel graphics chips was that Intel was hoping to re-use older fabrication equipment to make them, but GPUs used the latest, smallest dies, so the Intel strategy just did not produce anything competitive on the old fab gear.
  • gemmakaru, 30 March 2009 22:45
    This is Intel's stepping stone towards real-time ray tracing, which will require many x86 cores. This is Intel's way of providing them.