
AMD's Fusion Cloud Could Hurt GPU Biz

Source: Tom's Hardware US

How often do you upgrade your graphics card to play the latest first-person shooter games? Every six months? Every year?

Graphics cards are the focal point of a system for gamers: the GPU is the make-or-break point for whether a high-end game is playable. This, in essence, is what AMD wants to address with its Fusion supercomputer. AMD wants to be able to deliver a game to users on any computer. Because of the way Fusion delivers graphics (essentially pre-rendered and then streamed over the Internet), gamers will be able to enjoy all the latest games, no matter what GPU they have in their system.
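
To make the idea concrete, here is a minimal, hypothetical sketch (in Python) of how such a remote-rendering pipeline works in principle: the heavy rendering and encoding stay on the server, and the client only decodes and displays compressed frames. The function names, resolution, and the use of zlib as a stand-in encoder are illustrative assumptions, not details of AMD's actual Fusion service.

```python
# Hypothetical sketch of a remote ("cloud") rendering loop, assuming a simple
# frame-by-frame model; not AMD's actual Fusion protocol.
import time
import zlib

FRAME_W, FRAME_H = 640, 480  # assumed stream resolution

def render_frame_on_server(frame_number: int) -> bytes:
    """Stand-in for the server-side GPU render: produce raw RGB pixels."""
    shade = frame_number % 256
    return bytes([shade, shade // 2, 255 - shade]) * (FRAME_W * FRAME_H)

def encode(frame: bytes) -> bytes:
    """Stand-in for a video encoder: shrink the frame before it hits the wire."""
    return zlib.compress(frame, 1)

def decode(payload: bytes) -> bytes:
    """Client side: a thin client only has to decode and display the stream."""
    return zlib.decompress(payload)

def stream(num_frames: int = 5) -> None:
    for n in range(num_frames):
        start = time.perf_counter()
        raw = render_frame_on_server(n)   # heavy GPU work stays on the server
        wire = encode(raw)                # only compressed frames travel
        shown = decode(wire)              # client decodes, then displays
        ms = (time.perf_counter() - start) * 1000
        print(f"frame {n}: rendered {len(raw)} B, sent {len(wire)} B, "
              f"displayed {len(shown)} B in {ms:.1f} ms")

if __name__ == "__main__":
    stream()
```

The point of the split is that the client's GPU never runs the game itself; it only needs enough grunt to decode video, which is why even a very old card could, in theory, "play" a modern title.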

While this makes perfect sense for the consumer, where does AMD's graphics card business stand in this situation?

With a really good Internet connection (net connections in the U.S. are horribly behind), you can enjoy the latest games even on a GPU that can't handle 3D rendering at all. Time to whip out that old ATI Mach 64 card.

But Fusion is more than pre-rendered games on a supercomputer; it's also about integrating the GPU and the CPU into one chip. This too presents an issue for the gamer: why would a gamer want their GPU tied to a CPU, when a CPU can outlast a GPU by months or even years?

We spoke to an AMD board partner today, who wished to remain anonymous, and it indicated to us that the future of AMD GPUs may be very limited for board partners. Once Fusion has the GPU and CPU in one chip, what will the board partners do? Will there be Fusion add-in boards on motherboards instead of CPU sockets? That could very well be the outcome. Even more interesting, Zotac, an Nvidia partner, told us that Fusion is actually great for the Nvidia camp because "no one wants to have to upgrade both every time; as a gamer, I want to upgrade my graphics card separately, especially when I already have a great CPU."

Here's the real kicker, though: both Zotac and the AMD board partner said that AMD is really going to kill 50 percent of its business with Fusion. Obviously, we think AMD has thought about this already. When asked whether AMD plans to exit the discrete GPU business, the company declined to comment on future initiatives.

As Nvidia continues to push on the strengths of its discrete GPU solutions, AMD seems to be taking a totally different route. Only time will tell how these strategies will play out.

One more thing: if AMD does manage to make its Fusion supercomputer a success, and gamers can always play the latest games no matter what their platform is, who would buy a GPU at all, from AMD or from Nvidia?

Discuss
  • astrotrain1000, 10 January 2009 12:15
    I was under the impression that this would be used for cell phones and such, not desktop PCs. Maybe I just misread the article about Fusion.
  • Anonymous, 11 January 2009 04:18
    Simple, stupid. Fusion will increase the processing power of the CPU to do more things. You could, and should, still upgrade and have the latest GPU for gaming.

    I see this as just making the CPU more powerful to handle video editing and the like, while still needing one or two graphics cards for gaming. Current gaming systems have multiple GPUs anyway to handle physics and so on. This is no different: just one low-power GPU on board the CPU to take over some things, like the maths co-processor back in the day, remember?

    What's the problem? Arab Micro Devices will rule one day. You'll see the big, bad Intel follow on soon. Just you wait; they're already struggling to push the 4 GHz barrier when they promised 10 GHz back then. All these billions spent on R&D and nothing to show for it... what a waste.

    I just want AMD to give people something innovative and not just hot wind and DRM and stopping the evolution of the monkey race!

    Bye.
  • Milany, 11 January 2009 18:39
    AMD is getting too confident about itself; AMD/ATI was winning the GPU race against NVIDIA, but now they QUIT. That's the most stupid thing you can do.

    AMD/ATI thinks they have outsmarted NVIDIA, but they are killing themselves. Now AMD/ATI has lost the GPU race to NVIDIA.

    NVIDIA rules GPU land once again.

    ==================================
    Now you all ask: WHY DOES NVIDIA WIN?

    AMD is thinking out of the box, but it is the box that makes PC gamers, PC gamers.

    It is the PC hardware that makes PC gamers, PC gamers. They do not have game consoles because game consoles limit the gamer in their gaming freedom.

    Why do PC DVD writers copy illegal games? Because they sell!

    Running games externally from your own PC means you need to pay for it, and we all know gamers do not pay.
  • wild9, 17 January 2009 14:14
    I'd tread with caution as regards integrating the CPU and GPU on one chip, since this industry moves so fast and AMD already has access to some of the best discrete CPU/GPU hardware around. I'd be more interested in seeing a return to the 'co-processor' socket, or a facility that would allow me to lease some super-computer resources over the net, not so much for games but for general-purpose computing. AMD is well ahead in the super-computer stakes (70% use AMD), and for good reason. So why not just develop that corporate relationship with its partners instead of manufacturing CPU+GPU chips that would be outdated in no time at all? The likes of nVidia and Intel would see to that.

    AMD can produce some of the best chipsets and the best super-computers, not to mention some cracking GPU hardware. Producing CPUs and GPUs on one chip suggests to me that AMD is possibly worried about the potential to make all current hardware (including i7) redundant overnight. Sure, you can slow that process down by using hybrid processors that run hot and force people to only buy certain products at certain times, but that in my opinion would be a terrible mistake. The future is GPU processing because a) nothing can catch it, b) it's relatively easy to design and manufacture, and c) it's immensely scalable. I'd rather see dedicated GPU hardware as well as improved connectivity (whilst maintaining net neutrality).