Single & Heavily Threaded Workloads Need Not Apply

Given what these two hotfixes actually do, the only hope for performance gains comes from workloads that are neither single threaded nor heavily threaded. To confirm that there are no gains at either end of the spectrum, we first turn to Cinebench, a 3D rendering test that lets us configure how many threads are in use (a minimal sketch of this kind of thread-count sweep follows the charts below):

Cinebench 11.5 - Single Threaded

Cinebench 11.5 - Multi-Threaded
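
To make concrete what "configuring how many threads are in use" means, here is a minimal Win32 timing harness that sweeps a CPU-bound busy loop across 1, 2, 4, and 8 threads. This is an illustrative sketch only, not Cinebench's or our benchmark methodology; the worker loop and the WORK_ITERS iteration count are arbitrary placeholders.

```c
/* Hypothetical thread-count sweep; the workload is a placeholder busy loop. */
#include <stdio.h>
#include <windows.h>

#define WORK_ITERS 200000000ULL  /* arbitrary; tune so one run takes a few seconds */

/* CPU-bound worker: spins on integer math so the scheduler has work to place. */
static DWORD WINAPI worker(LPVOID arg)
{
    volatile unsigned long long acc = 0;
    unsigned long long i;
    for (i = 0; i < WORK_ITERS; i++)
        acc += i * 2654435761ULL;
    (void)arg;
    return 0;
}

/* Runs n copies of the worker concurrently and returns wall-clock seconds. */
static double run_with_threads(int n)
{
    HANDLE threads[8];
    LARGE_INTEGER freq, t0, t1;
    int i;
    QueryPerformanceFrequency(&freq);
    QueryPerformanceCounter(&t0);
    for (i = 0; i < n; i++)
        threads[i] = CreateThread(NULL, 0, worker, NULL, 0, NULL);
    WaitForMultipleObjects((DWORD)n, threads, TRUE, INFINITE);
    QueryPerformanceCounter(&t1);
    for (i = 0; i < n; i++)
        CloseHandle(threads[i]);
    return (double)(t1.QuadPart - t0.QuadPart) / (double)freq.QuadPart;
}

int main(void)
{
    /* Sweep the spectrum the article describes: one thread, all eight, and the middle. */
    int counts[4] = { 1, 2, 4, 8 };
    int i;
    for (i = 0; i < 4; i++)
        printf("%d thread(s): %.2f s\n", counts[i], run_with_threads(counts[i]));
    return 0;
}
```

On an eight-thread part like the FX-8150, the interesting region for these hotfixes is the middle of that sweep, where the scheduler still has idle modules to choose from.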

With one thread or 8 threads active, the FX-8150's performance is unchanged by the new hotfixes. I also ran TrueCrypt's encryption/decryption benchmark, another heavily threaded test that runs on all cores/modules:

AES-128 Performance - TrueCrypt 7.1 Benchmark

Once again, there's no change in performance. Although you can argue that CPU performance matters most when utilization is at its highest, most desktop workloads fall somewhere between fully loading a single core and fully loading all of them. To test those cases, we need to look elsewhere.
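
The behavior the scheduler hotfix changes can also be probed by hand with thread affinity. The sketch below pins two busy threads either to one Bulldozer module or to two. It assumes, as is commonly reported for the FX-8150 under Windows, that logical processors (0,1), (2,3), and so on share a module; verify the topology with GetLogicalProcessorInformation on your own system before reading anything into the numbers. Note that a pure integer loop contends for relatively little of a module's shared hardware, so the contrast may be small; code that leans on the shared front end or FPU should separate the two cases more clearly.

```c
/* Hypothetical affinity experiment: two busy threads on one module vs. two.
 * The (0,1)/(2,3) module pairing is an assumption about the test system. */
#include <stdio.h>
#include <windows.h>

static DWORD WINAPI worker(LPVOID arg)
{
    volatile unsigned long long acc = 0;
    unsigned long long i;
    for (i = 0; i < 300000000ULL; i++)  /* arbitrary busy work */
        acc += i;
    (void)arg;
    return 0;
}

/* Creates two suspended workers, pins them to the given logical processors,
 * then resumes both and returns wall-clock seconds until they finish. */
static double timed_pair(DWORD_PTR mask_a, DWORD_PTR mask_b)
{
    LARGE_INTEGER freq, t0, t1;
    HANDLE t[2];
    QueryPerformanceFrequency(&freq);
    t[0] = CreateThread(NULL, 0, worker, NULL, CREATE_SUSPENDED, NULL);
    t[1] = CreateThread(NULL, 0, worker, NULL, CREATE_SUSPENDED, NULL);
    SetThreadAffinityMask(t[0], mask_a);
    SetThreadAffinityMask(t[1], mask_b);
    QueryPerformanceCounter(&t0);
    ResumeThread(t[0]);
    ResumeThread(t[1]);
    WaitForMultipleObjects(2, t, TRUE, INFINITE);
    QueryPerformanceCounter(&t1);
    CloseHandle(t[0]);
    CloseHandle(t[1]);
    return (double)(t1.QuadPart - t0.QuadPart) / (double)freq.QuadPart;
}

int main(void)
{
    /* Same module: logical processors 0 and 1 (assumed to share a front end). */
    printf("shared module:    %.2f s\n", timed_pair((DWORD_PTR)1 << 0, (DWORD_PTR)1 << 1));
    /* Separate modules: logical processors 0 and 2 (assumed). */
    printf("separate modules: %.2f s\n", timed_pair((DWORD_PTR)1 << 0, (DWORD_PTR)1 << 2));
    return 0;
}
```

The scheduler hotfix is generally described as teaching Windows 7 to prefer idle modules before doubling up threads within one; the two calls above simply force both placements so that the gap, if any, can be measured directly.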

Comments

  • b_wallach - Monday, January 30, 2012

    One problem this CPU has had from day one is that it has been overpriced by Newegg and others. AMD wanted it sold for around $240; because of high demand and low volume, retailers jacked the price up because they could get away with it.
    If you put its cost where AMD wanted it to be, the picture becomes a bit clearer.
    I keep getting the feeling that all anyone wants AMD to do is make an Intel clone that performs equal to Intel's CPUs. That is stupid, backward thinking. Innovation takes guts because there is more chaos attached to it, but that is how we step forward into new and better things. Without it we just stagnate.
    I am more inclined to go with AMD because they are willing to innovate, to take a chance and build a CPU from a different angle.
    And while this can be troublesome, as we now see, and will have growing pains, I'm going to keep up hope that once its software issues are worked out, and some of its latency is worked out in the next generation, at least they are trying to make a better product instead of just being an Intel clone company.
    And as some longer-range views have pointed out, while it has issues in some areas, it excels in others, like multitasking and running more than one program at the same time. No more programs going crazy when something like Norton decides to run a virus check in the background, etc.
    If all people want is clones, then buy only Intel if you want.
    But if you manage to talk everyone out there into doing just that and AMD goes away, don't come back here crying about how much your next computer costs, because it will be a LOT more expensive, and new CPUs from Intel will slow to a trickle because there will no longer be any need for them to hurry.
    That alone is a good reason to support AMD no matter what they make.
    The whole AMD-only or Intel-only argument is such a load of crap in the long run. Both work well with current software. The real issue is whether AMD will survive, and if not, whether they will take ATI down with them as well.
    If so, the whole computing landscape will change into a very dark and nasty place. Intel and Nvidia would love it; the rest of the computing world would suffer. It's easy to lose a business and next to impossible to start one back up again, so if AMD keeps getting trashed by the media and people fall in line with that, the outcome will not be a pretty one.
    It's almost too obvious that the bulk of the people here weren't around, or weren't old enough, to remember when it was just Intel in the 286 and 386 days. CPUs cost a ton of money back then; computers were not cheap. That gave Apple and even Atari some business, and as AMD came along, prices slowly came down because competition could survive in that kind of setting. But times have changed far too much for that to be repeated.
    It's too bad more people don't think along what-if lines, so they would know what to expect in a best- or worst-case scenario.
    It's one reason why I'll use an AMD CPU. I can run anything I want just fine, and do it at a reasonable price. And it's mixed with a little thank-you for getting prices down to affordable levels.
    That is worth supporting.
    I keep getting the feeling that people think of these as "I have the fastest computer on the block, look at me", the keeping-up-with-the-neighbors scenario that is funny in the best of times and horrible in the worst of times.
    And they don't care where that takes them in the long run, because they never want to think that far ahead.
    Heck, I seldom see any long-range thinking at all on the internet, and that is scary. I always thought that was just a politician's disease.
    If you really stopped and thought about it, you'd know just how much better off we are today because of AMD's willingness to take chances and try something new; without them we might still be running 32-bit single-core CPUs.
    I'm not being an AMD fanatic, just a realist.
  • saneblane - Monday, January 30, 2012

    AMD first introduced a lot of the features that Intel used in the Core architecture, and Intel didn't mind copying them. So if even a company as big as Intel can copy, why can't AMD? The cross-license agreement between them exists for this very reason. AMD should take the best of every feature set used on x86 processors and make a good CPU with them. It seems to me that AMD might have too much pride for that, but from a business standpoint it's the best thing they can do until they have more money.
  • Scali - Wednesday, February 1, 2012

    The cross-license agreement covers only the use of the x86 instruction set and its extensions.
    Specific implementation details such as Intel's caching, Hyper-Threading, manufacturing process and whatnot don't fall under that license.
  • Scali - Friday, February 3, 2012

    I was around in the 'Intel only' days as well...
    And I find it funny you say "It gave Apple and even Atari some business".
    Apple and Atari had been around in the home/personal computer market before IBM. It was the IBM PC that was trying to take business away from them, not the other way around.

    I'm not so sure how much AMD contributed to getting costs down, if costs went down at all...
    When AMD released the Am386 (its first x86 competitor), a high-end 486 CPU cost about the same as a high-end Core i7 does today.
    I think the main differences are that the rest of the computer became a lot cheaper (monitors, PSUs, HDDs, motherboards, memory, etc.), and that these days the mainstream and low-end ranges are often based on the same architecture. In the old days the 'mainstream' alternative to a 486 was a 386, and low-end was a 386SX or 286, so only the high end was state of the art; if you were on a budget, you bought older architectures at reduced prices.
    I don't think AMD had much of a share in either development.
  • Veroxious - Tuesday, January 31, 2012

    My real gripe with Bulldozer is the fact that it competes with the Core i5-2500 on price but doesn't even have on-die graphics. Less performance and less product for your money = FAIL!
  • matt0609 - Wednesday, February 1, 2012

    How come no one does reviews by manually setting the threads and memory per core? Adobe After Effects CS5.5 allows this, and my FX-8120 renders a heavily themed project file pretty quickly...
  • markmpx - Friday, March 16, 2012

    There are things that BD does well and things it does badly (like most games). It's not good for many Windows users, but actually not bad for many Linux users. It's a pity we didn't get a chip that performed better than Sandy Bridge at a lower price, but that was too much to hope for.

    Some real applications it does OK at:
    http://www.overclock.net/t/1141562/practical-bulld...

    It's sad that BD performs poorly on single threaded applications; AMD didn't quite get the mix right in this design and will hopefully improve it in subsequent versions of the chip. I like the fact that they don't keep changing the CPU socket, while Intel have recently released the 1155, 1156, 1366 and 2011 sockets!

    For my current applications, a cheap 990FX motherboard (they all seem to have working IOMMU support) and a cheap Bulldozer can do much the same job as an i7-3820. It's also a nightmare to find an Intel board for a 3820 chip that supports VT-d properly in both the motherboard and the BIOS.

    So for things like Xen, AMD isn't a bad choice.

    We are still lucky that AMD is competing with Intel. Competition and innovation benefit us all and help keep prices reasonable. (With WD and Seagate buying up Hitachi's and Samsung's hard drive businesses, competition in hard drives has all but disappeared. No wonder we pay so much for hard drives now!)
  • smith1968 - Friday, March 16, 2012

    I have been using AMD CPUs for years. I bought an FX-6100, stuck it on my 890FX motherboard, and noticed a great improvement in gaming. I don't know why people call it 'useless' or worse; I can play Battlefield 3 on Ultra at 1920x1080 and get between 43 and 48 fps in the high-action parts of the game, which is fine, and it sometimes jumps up to 60 fps at the absolute maximum. I only have an HD 6950 2GB, not unlocked or overclocked, and 8GB of memory on an MSI 890FX board. I paid £114 all in for the CPU, so I'm not complaining; if gaming had been worse than on my unlocked Phenom II 550, I would have sent it back.
    The hotfix did not make any difference that I can notice; my machine is for gaming. It might not be as great as the i7, but so what? As long as my machine does the job I need it to, that is all that matters. All the benchmarks that aren't doing the Bulldozer CPUs any favours haven't put people off; we can't all afford Intel.
  • garadante - Friday, April 6, 2012

    I know the architectures are very different, but the whole scene around AMD's Bulldozer architecture reminds me very much of the CELL processor when the PS3 launched, and the aftermath. The CELL processor had so much going for it at the beginning: it had extremely strong computational potential, and pre-launch, the computational-powerhouse aspect of the PS3 was its biggest selling point. In terms of raw power, it was the king of the consoles. But look at what happened. CELL is effectively dead in anything that isn't a niche server market, as far as I'm aware or concerned. It didn't matter that the architecture had incredible potential; it required a radical change in the way software was developed, and the PS3's game developers never really took the bit between their teeth. Perhaps they've gotten better at utilizing the CELL architecture now, but that doesn't matter, as the rumor mill is pointing towards x86 processors for -all- of the new consoles. CELL isn't even in contention, and I highly doubt that any console company is going to try to force a radical architecture change on the market anytime soon.

    My point is that we can't assume Microsoft will ever code properly for the Bulldozer architecture. Unless Bulldozer can sway developers into looking at it with future prospects in mind, it wouldn't surprise me to see the software industry modify their software only the bare minimum needed to work well enough with Bulldozer, then abandon it at the first chance. I know that comparing the processor market with the console market is like comparing apples to oranges, and even granting that the next generation of consoles will all have x86 processors, that market has changed a lot in six years as well. I just want to bring up this comparison, so that perhaps other people who remember the pre-launch PS3 hype, the seemingly overwhelming advantage the console had in potential, and how that never panned out as hoped, can bring some new points to the Bulldozer conversation.
