
AMD Fusion: How It Started, Where It’s Going, And What It Means


You've already read about APUs, and maybe you're even using them now. But the road to creating APUs was paved with a number of struggles and unsung breakthroughs. This is the story of how hybrid chips came to be at AMD and where they’re going.

“Nothing is more difficult than the art of maneuver. What is difficult about maneuver is to make the devious route the most direct and to turn misfortune to advantage.”

    —Sun Tzu, The Art of War

When I interviewed Dave Orton, then the president of ATI Technologies, in 2002, one of the first things he told me was, "It’s always what’s possible in the business that keeps people going." No words could have been more prophetic about the coming merger between his company and CPU manufacturer AMD. The big question, of course, isn’t what is possible, or even whether the possible will become reality. The real question is whether the possible will become reality soon enough.

Orton spent most of the '90s with Silicon Graphics and, in 1999, when almost anything in technology seemed possible, he left SGI to join a little core logic startup called ArtX. The little company won the development contract for Nintendo’s GameCube, which went on to sell a few units (somewhere north of 20 million). That fall, ArtX showed its first integrated chipset at Comdex, and immediately the company flashed on the industry’s radar as a prime acquisition target.

Ultimately, ATI was able to put ArtX in its pocket and made Orton its president and COO. Then the tech bubble burst, driver problems abounded, schedules slipped, and, for a while, it seemed that ATI could do nothing right.

Part of the road back to glory hinged on Orton meshing the two development teams. It was he who put ATI on a 12-month cycle for new architectures and six- to nine-month cycles for iterative design revisions. Product teams were given more control and responsibility. And slowly, over perhaps 18 months, with Nvidia kicking it in the ribs at every turn, ATI managed to get back on its feet. The company rediscovered how to execute.

"Just step back and understand your roots," said Orton. "Constantly build. You can never be satisfied with where you are. You’ve got to be satisfied with where you can be and then drive to that."

With ATI back on top of its game, Orton knew it was time to keep driving. But to where? I detected no glimmer of the future in our 2002 discussion. ATI continued to excel at integrating graphics into northbridge chips, and Intel, which still viewed integrated graphics as needing only to be good enough for business apps, was still more of a partner than a competitor.

However, in a keenly prescient moment, Orton told me, "I guess if I could change one thing about computing, I’d like it to be more open to create a broader range of innovation. I recognize the advantages of standards. Standards provide opportunity."

At two different points in our conversation, Orton lamented his daily Silicon Valley commute, even saying that if he could invent anything, no matter how fantastic, it would be a Star Trek-esque transporter. So perhaps we can take him at his word when, in 2007, he left his post as executive vice president of AMD in order to spend more time with his family. But this is jumping ahead. First, Orton’s drive from Toronto was about to take a hard southern turn, straight down to Texas.

  • womble , 14 August 2012 18:04
    Nice, well-written article; one of the best I've come across for a while.
  • aje21 , 14 August 2012 20:58
    Came across as a bit of a sales brochure to me, I'm afraid. Also, it would be great to see AMD actually doing the efficiency side of things better than Intel. It's hard to make a case against a product that is faster and uses less power.
  • meichunr , 14 August 2012 21:25
    The idea is good and I very much like it. Thank you; you are a legend ...
  • kulwant , 18 August 2012 00:08
    Whilst the theory is fine, the problem with AMD's approach is that execution is not fast enough. Intel chips capable of speeding up parallel computing tasks (like video transcoding) have been sitting in people's machines for over a year and a half now.

    And those chips have been demonstrated to be capable of doing these tasks faster than AMD/Nvidia GPUs with their supposedly faster parallel compute performance.

    So while we wait for developers to adopt HSA and push products out to market before we can see whether the new AMD chips can do it any faster than Intel chips, the answer looks like a foregone conclusion already: Intel has been doing it faster for over a year and a half, and what's to say it won't have something even faster out there by the time AMD's time of glory supposedly arrives?

    As the old adage goes, you can have the fanciest hardware around, but it's no good if you haven't got any good software to run on it. And I've yet to see any killer apps that really exploit the teraflops of performance AMD GPUs allegedly deliver.

  • tijmen007 , 23 August 2012 17:52
    I believe the last paragraph in the "Up from the ashes" chapter is supposed to say "Rory Reed", instead of referring to him as "Read". The mistake is made a few times ;)