AMD FX-8150 Review: From Bulldozer To Zambezi To FX

Power Consumption

According to AMD, the decisions its architects made when they designed Bulldozer centred on maximizing efficiency. In a big multi-chip module like Interlagos, squeezing the most performance out of every core under a hard thermal limit just makes good sense. The question is whether the same scalable design is as efficient on the desktop as it promises to be in the server space.

Processor                                                  System Idle Power
AMD FX-8150 (Zambezi), 8C/8T, 3.6 GHz Base                 107 W
AMD Phenom II X6 1100T (Thuban), 6C/6T, 3.3 GHz Base       114 W
AMD Phenom II X4 980 BE (Deneb), 4C/4T, 3.7 GHz            100 W
Intel Core i7-2600K (Sandy Bridge), 4C/8T, 3.4 GHz Base     90 W
Intel Core i5-2500K (Sandy Bridge), 4C/4T, 3.3 GHz Base     90 W
Intel Core i7-920 (Bloomfield), 4C/8T, 2.66 GHz Base       130 W


At idle, the system armed with AMD’s 125 W FX-8150 sips 107 W—less than the Phenom II X6 1100T, but just slightly more than the Phenom II X4 980 (both of which are also 125 W parts). Only Intel’s Bloomfield-based Core i7-920 consumes more (130 W system power).

The two systems armed with Sandy Bridge-based parts, in contrast, drop to just 90 W (their 95 W TDPs are already 30 W under Zambezi’s thermal ceiling).
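
For anyone who wants to sanity-check the idle comparison, here is a minimal sketch in Python that simply mirrors the table above in a hypothetical idle_watts dictionary and prints each platform's delta against the FX-8150 build; remember these are whole-system readings, not CPU-only figures.

    # Idle system power from the table above (total platform draw, in watts).
    idle_watts = {
        "AMD FX-8150 (Zambezi)": 107,
        "AMD Phenom II X6 1100T (Thuban)": 114,
        "AMD Phenom II X4 980 BE (Deneb)": 100,
        "Intel Core i7-2600K (Sandy Bridge)": 90,
        "Intel Core i5-2500K (Sandy Bridge)": 90,
        "Intel Core i7-920 (Bloomfield)": 130,
    }

    # Rank the platforms by idle draw and report the gap to the FX-8150 system.
    baseline = idle_watts["AMD FX-8150 (Zambezi)"]
    for cpu, watts in sorted(idle_watts.items(), key=lambda kv: kv[1]):
        print(f"{cpu:<40} {watts:>3} W  ({watts - baseline:+d} W vs. FX-8150 system)")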

I pulled the Phenom II X4, Core i7-920, and Core i5-2500K runs off of this graph because they cluttered it up way too much. The three chips left are, in my opinion, the most relevant.

The black line corresponds to Intel’s 95 W Core i7-2600K, which averages 155 W of system power throughout a complete run of PCMark 7. Before you mention that the Core i5-2500K is closer, price-wise, to the FX-8150, know that it averages just two watts less than the -2600K, at 153 W across the entire run, so its plot would look almost identical.

The FX-8150, in comparison, averages 191 W. That 36 W delta roughly corresponds to the 30 W separating Intel’s 95 W rating from AMD’s 125 W TDP. Even more interestingly, the Phenom II X6 1100T hits the same 191 W system average across PCMark 7, while the Phenom II X4 980 averages 184 W.

Intel’s Core i7-920 stands out as the only model to use more power than AMD’s new flagship. Its 193 W average is 2 W higher, which we’d consider reasonable given a 5 W-higher TDP.
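
To make the load-power arithmetic explicit, the short Python sketch below (the chips dictionary and delta helper are just illustrative names) plugs in the averaged PCMark 7 figures and rated TDPs quoted above and reproduces the deltas discussed in the text: roughly 36 W of extra system draw against a 30 W TDP gap for the FX-8150 versus the Core i7-2600K, and 2 W against 5 W for the Core i7-920 versus the FX-8150.

    # Average system power across a full PCMark 7 run and rated CPU TDPs,
    # using the figures quoted in the text above (all values in watts).
    chips = {
        "Intel Core i7-2600K":     {"avg_system": 155, "tdp": 95},
        "Intel Core i5-2500K":     {"avg_system": 153, "tdp": 95},
        "AMD FX-8150":             {"avg_system": 191, "tdp": 125},
        "AMD Phenom II X6 1100T":  {"avg_system": 191, "tdp": 125},
        "AMD Phenom II X4 980 BE": {"avg_system": 184, "tdp": 125},
        "Intel Core i7-920":       {"avg_system": 193, "tdp": 130},
    }

    def delta(a, b, metric):
        """Difference between chip a and chip b for a given metric, in watts."""
        return chips[a][metric] - chips[b][metric]

    # FX-8150 versus the 95 W Core i7-2600K: system delta vs. TDP delta.
    print(delta("AMD FX-8150", "Intel Core i7-2600K", "avg_system"),  # 36 W
          delta("AMD FX-8150", "Intel Core i7-2600K", "tdp"))         # 30 W

    # Core i7-920 versus the FX-8150: the only chip that draws more.
    print(delta("Intel Core i7-920", "AMD FX-8150", "avg_system"),    # 2 W
          delta("Intel Core i7-920", "AMD FX-8150", "tdp"))           # 5 W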

16 comments from the forums
  • jaksun5
    Fuck this, I'm over your comments sections. They either don't work, are flooded by spam or I lose everything I've written and have to rewrite
    5
  • jaksun5
    OK, here it goes again... :-)

    Unfortunate that there wasn't a more competitive showing by AMD. Up until recently we could still say that performance per dollar was with them in a lot of cases. Now it seems even that point is going to Intel for some time to come.

    On the bright side, it appears that here in Oz a new segment in the full-size (14-15") notebook market has been created in the last few months by the release of notebooks powered by AMD's on-die Radeon processors, in the $330-$450 space, where previously new notebooks could barely be had under $500, and even then they were powered by awful Celeron processors with even worse graphics. If AMD can move enough of these low-end units then maybe they'll have a chance to improve their line-up, if the talk of scaling isn't just hot air.
    3
  • bobbyp86
    Looks like I've saved myself a load of money by not upgrading my X4 955 this year. Bring on the 7000 series GPUs :D
    4
  • technogiant
    AMD is becoming a "promising Pete": it's always jam tomorrow but NEVER performance delivered today.
    Be that with their roadmap of promised performance increases or the promise of increased performance on APUs via GPGPU applications.
    I will believe it when I see it, if ever.

    I don't think they even plan that effectively. I mean, their proposed use of core/module parking in Win8 is great for power efficiency, but what about performance? For that you would need to spread the threads evenly across one core of each module, so they don't share resources, and only start using the second core in each module when the first core approaches max load.

    The implementation in Win8 will only reduce performance and enhance power efficiency.
    3
  • doive1231
    I feel like the hyper-intelligent pan-dimensional beings who have just been given the answer of 42 to their question, i.e. disappointed and fed up that I have to wait for something better. Perhaps we should leave it to Intel to build a computer capable of finding the question.
    1
  • blubbey
    I would say I'm disappointed, but it's not like we didn't know this already - surely there'd have been some 'leaked' benchmarks on the internet to promote it more if it was as good as, if not better than, SB.
    2
  • codefuapprentice
    I'm actually disappointed in Bulldozer. I was hoping it would give Intel a massive shake-up like the Athlon series did for a few years; as it stands I'm not gonna be upgrading from my Phenom II 955 any time soon.
    2
  • das_stig
    Not the best review for AMD, but look on the bright side: the prices will drop like a stone, and as long as it can play all your games at the highest resolution with all the eye candy on, without needing its own power plant and a pipeline to the South Pole for cooling, then why worry.

    Can we all afford these supercomputers sucking 1000 watts from the socket? No, I would rather wait a fraction of a second and save a few quid each month.

    Future chips may just come with a few surprises, once AMD wake up and smell the coffee.
    2
  • Anonymous
    Well, come on boys, don't expect AMD to come out of the blue and own SB. AMD is in a very different situation: they went the GFX route a while back and hence much of the potential R&D money was taken away. Intel simply spends huge amounts of cash on its manufacturing process and micro-architecture development, which is why it's leading atm. IMSO (in my subjective opinion) Bulldozer was a strategic move intended to compete in the long run, so perhaps we will see what comes of it.
    1
  • wild9
    I really don't know what to say about Bulldozer, I've got very mixed feelings. In the meantime thank you Chris Angelini, for the in-depth analysis.
    1
  • jrtolson
    To all those expecting Bulldozer to be a "holographic chip from the future, running at 200 GHz" that would have turned itself on and browsed porn for you before you got home from work: you are fools lol (no offence).

    I'm pretty sure the current business models for both AMD and Intel are not "spend 200 squillion dollars" in R&D on making processor chips that can run the main computer of a Galaxy-class starship, using exotic materials (other than silicon) etc. etc.

    I'm running an AMD64 3200+ single core (Venice) in my rig, and it does everything and more than I want it to do. I can play all the latest games and run the most demanding software. My point is my PC is 7 years old, I'm running Windows 7 on it and it does me fine. The market does not need, nor are consumers ready for, a leap in processor tech, so as a business model why not release modestly improved chips and keep the dollars rolling in, rather than gamble everything on something that might break your company before it is even released?
    -1
  • theFatHobbit
    I was hoping this would make Intel nervous and lower their prices to compete with Bulldozer's price/performance, but no luck.
    1
  • miklatov
    These results are a real shame. I'm neither AMD nor Intel inclined, preferring to stay agnostic, but I do like healthy competition (it works well for us buyers, right? :D) and this offering just doesn't really cut it on performance or price.
    0
  • dillyflump
    Have to say I'm a little disappointed at the raw power per core of these FX chips in games, but I'm pretty sure the Intel Sandy Bridge and other Core i7s are out in front due to hyper-threading on each core. World of Warcraft is programmed to only use two physical cores, but the Intels get around it with hyper-threading's 2 extra logical cores to process on. If game engines were better programmed to actually work on a CPU's physical cores and not logical ones, I'm pretty sure the FX chips would beat the Sandy Bridge processors. Perhaps the tester could look it up, but last year I was reading an article on how to force the Warcraft engine to use multiple cores, not just the 2. It looked complex to do, but having ordered a Bulldozer FX 8-core and a new 990FX board I think I'll try to get this workaround to use all the chip's power and see what results I get teamed with CrossFire 6870s.
    0
  • HEXiT
    lolz... seriously m8, try to at least understand... the 2500K doesn't support hyper-threading, so how can it be out in front because of it? AMD promised the world a CPU that could compete with Intel's latest and they delivered one that can compete with their last gen only. As for you being pretty sure... well, I'm pretty sure you think you know a lot more than you actually do, and it's gonna cost you a fair bit of cash...

    Not only does the part not perform consistently (and never will in a gaming environment), it's power inefficient to the tune of 180+ watts. Seriously guy, rethink your choice... you would be no worse off performance-wise buying a Phenom II 970 and waiting for the next iteration, which will still underperform against Intel's Ivy Bridge...

    As for your theory on how WoW is processed, you're off the mark there too. Intel only uses hyper-threading when a game/application asks for it. On a single core WoW will use hyper-threading (if available) as it needs 2 cores to work best; on a dual core it will use 2 cores without hyper-threading, and on a quad it will use 2 cores without hyper-threading. Just because a core shows 75 percent usage doesn't mean it's using 25 percent hyper-threading.
    Case in point: WoW performs no better on an Intel 2500K than it does on the Intel 2600K, and one has hyper-threading while the other doesn't.

    Seriously m8, I ain't trying to be a jerk, but it definitely looks like you have a case of "thinks he knows"... you seem to be operating on assumptions about Intel rather than fact... use places like Wiki, Tom's, Hardware Secrets and other places to get the right info before you make a misinformed choice.
    0
  • Anonymous
    But between the i5 and i7, which is best on cost vs. performance? To be honest I have not looked at an AMD chip-based PC in years (why would you?), and based on the excellent review/benchmark I will not be changing my mind for some years to come.
    0