AMD Bulldozer / FX-8150 Gameplay Performance Review @ HardOCP

I'm still betting that Bulldozer's performance will trail off in gaming when multiple GTX 580s and ultra-high resolutions become factors.

Sure, I expect it to, even with Sandy Bridge. You're most likely going to want SB-E or Ivy Bridge for multiple GTX 580s.

I expect Bulldozer will be fine for me with a single 7x00-series GPU, mostly because I only game at Eyefinity resolutions and I will be GPU limited.
 
It's most likely Hyper-Threading slowing the 2600K down. Hyper-Threading isn't an instant performance increase across the board; sometimes it does nothing and sometimes it's slower.

It all depends on the workload.
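
A quick way to see that for yourself is a minimal sketch like the one below, assuming a Linux box where logical CPUs 0-3 are the physical cores and 4-7 are their HT siblings (check /proc/cpuinfo before trusting that numbering):

```python
# Minimal sketch (Linux only): time the same CPU-bound batch of jobs pinned to
# physical cores only, then again with the assumed HT siblings added.
import os
import time
from multiprocessing import Pool

def burn(n):
    # Toy CPU-bound work: sum of squares.
    return sum(i * i for i in range(n))

def timed_run(cpus, jobs=16, n=2_000_000):
    os.sched_setaffinity(0, cpus)  # restrict this process; Pool workers inherit it
    with Pool(len(cpus)) as pool:
        start = time.perf_counter()
        pool.map(burn, [n] * jobs)
        return time.perf_counter() - start

if __name__ == "__main__":
    # Assumption: 0-3 = physical cores, 4-7 = their Hyper-Threading siblings.
    physical_only = timed_run({0, 1, 2, 3})
    with_siblings = timed_run({0, 1, 2, 3, 4, 5, 6, 7})
    print(f"4 physical cores: {physical_only:.2f}s")
    print(f"4 cores + HT:     {with_siblings:.2f}s")
```

Depending on the workload, the second run comes out ahead, flat, or occasionally behind, which is the point.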
No, that's not the issue.

The i7 2600K and i5 2500K are slower in their benchmarks after being overclocked.
 
No, that's not the issue.

The i7 2600K and i5 2500K are slower in their benchmarks after being overclocked.

You do know that it's an online game and the runs are limited by what actually happens during the run. There could have been an explosion or something that happened in that run and not in the others.
 
Not sure how they got the number at the link you posted. [H]'s article shows BD using ~450W total power when overclocked; the i7 920 was pulling over 500W.

I don't follow bit-tech. I do trust [H]'s testing methodology though.

And, an AMD fan boy I am not. All my current systems are Intel. ;)

[H]'s review OCed to 4.6GHz, bit-tech OCed to 4.8GHz, and different CPU samples have different properties. Both reviews reach the same conclusion: BD's power consumption skyrockets when overclocked.
 
You do know that it's an online game and the runs are limited by what actually happens during the run. There could have been an explosion or something that happened in that run and not in the others.
Then it kind of throws all the numbers out the credibility window if simple gameplay can skew the results that much.
 
[H]'s review OCed to 4.6GHz, bit-tech OCed to 4.8GHz, and different CPU samples have different properties. Both reviews reach the same conclusion: BD's power consumption skyrockets when overclocked.


People should also remember that we are looking at the power usage of 4 cores vs. 8 cores.
Double the comparable watt usage and it starts to look perfectly fine to me.

Also, holy crap, I didn't realize my 920 was eating that much juice.
 
Yes. Are you using your brain?

Are you going to stop with the personal insults?

Please note that BF3 does not contain all the features the full version will include; this means the driver multithreading might not be enabled yet. To test BF3 we found it rather difficult to get consistent runs, but we did our best to test in the same conditions each time, and to perform the same actions along the same path, in the same areas, as best we could.

HardOCP already pointed out what you're complaining about. It's the same thing as when testing an MMORPG.
 
Are you going to stop with the personal insults?



HardOCP already pointed out what you're complaining about. It's the same thing as when testing an MMORPG.
I haven't insulted anybody. I am merely asking questions.

If the run-throughs can vary so widely that they can negate a 1GHz overclock and indeed make a CPU slower than before the overclock, wouldn't it make sense that the numbers are basically meaningless?
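
Here's a minimal sketch with made-up numbers showing what that kind of run-to-run variance does to a three-run average:

```python
# Minimal sketch, hypothetical numbers: if multiplayer noise per run is large
# compared to the gain from an overclock, averaging a few runs can easily
# rank the overclocked setup behind the stock one.
import random

def bench_average(true_fps, noise_fps, runs=3):
    # Each "run" is the true average FPS plus random multiplayer noise.
    return sum(random.gauss(true_fps, noise_fps) for _ in range(runs)) / runs

random.seed(0)
trials = 10_000
# Hypothetical: stock is really 60 FPS, the overclock is really worth +3 FPS,
# but any single run wanders by about +/- 6 FPS depending on what happens in it.
upsets = sum(bench_average(63, 6) < bench_average(60, 6) for _ in range(trials))
print(f"overclock measured slower in {upsets / trials:.0%} of 3-run comparisons")
```

With noise that size, the genuinely faster setup loses a three-run comparison a sizable fraction of the time, so a handful of run-throughs can't resolve a difference that small.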
 
I haven't insulted anybody. I am merely asking questions.

If the run-throughs can vary so widely that they can negate a 1GHz overclock and indeed make a CPU slower than before the overclock, wouldn't it make sense that the numbers are basically meaningless?

The numbers are more important for comparing the CPU against itself.

It may actually be that faster cores matter less than more cores in this game.

It will be interesting to see what goes on with the game.
 
The numbers are more important for comparing the CPU against itself.

It may actually be that faster cores matter less than more cores in this game.

It will be interesting to see what goes on with the game.
I agree, and considering some performance optimizations that are intended for the final game apparently aren't present in the beta, I think it's too early to consider BF3 benchmarks at all relevant at this point.
 
This BF3 stuff is BS. The FX-8150 is still bad for gaming, Surround or not, multiple graphics cards or not. Just check the other 8150 reviews on other sites; the i5 2500K and i7 2600K are still first in ALL the games.
 
I wish I could say that I am surprised by the crappy performance, but based on the delays and rumors I was a bit pessimistic. I was certainly hoping, though, for Bulldozer to at least offer some reason to do a drop-in upgrade from the 965 BE in my HTPC.
 
Was there any CPU-NB overclocking? On my X6, going from 2GHz to 2.8GHz helps by a few FPS in some games.
 
Something is very wrong with these results:

[attached benchmark graphs]


Is it possible you switched your overclocked/non-overclocked Intel numbers?

I just checked my data; I put the right numbers in the right places, and that is what was experienced. I'd like to point out the erratic nature of multiplayer gaming. Other BF3 players wanted to kill me despite the fact I was trying to do a run-through; they were very unforgiving :p

The result is that overclocking the CPU really didn't change performance in BF3 Beta multiplayer; that is what should be taken away from it.
 
I'm still betting that Bulldozer's performance will trail off in gaming when multiple GTX 580s and ultra-high resolutions become factors.

This is next on our agenda to test. I'm setting up the systems as we speak.
 
This is next on our agenda to test. I'm setting up the systems as we speak.

Makes me wish I had a third GTX 580 and a Bulldozer setup to test with. I've got the monitors and everything else. I'd like to see this one for myself.
 
Looking forward to the additional gaming results, hoping to see something redeeming :/

I'm even further disappointed by what's going on with the actual pricing... $220 seems to be the online price for the 8120 and $260 for the 8150. Boo.
 
I just checked my data; I put the right numbers in the right places, and that is what was experienced. I'd like to point out the erratic nature of multiplayer gaming. Other BF3 players wanted to kill me despite the fact I was trying to do a run-through; they were very unforgiving :p

I'd love to see a youtube video, at increased speed, of 64x [H]'ers following a choreographed script in a private server several times to give a consistent MP review. :D
 
What if this CPU plays incredibly well with 7000+ series GPUs?

For a 2B-transistor CPU to perform this poorly doesn't make sense. It reminds me of early PS3 games on that Cell processor, and look at it now with games like Uncharted, KZ, etc.

There has to be a catch somewhere.
 
For a 2B-transistor CPU to perform this poorly doesn't make sense. It reminds me of early PS3 games on that Cell processor, and look at it now with games like Uncharted, KZ, etc.

There is just one issue with that: Cell is not a moving target; it is fixed hardware, even for the next few years. Bulldozer, on the other hand, is just a minor, short-term player, so no one will bother optimizing for it. Even if they did, those games would be out in 2-3 years, when Bulldozer will be obsolete anyway.

Bulldozer is probably a good design for highly threaded server applications, but not for general desktop usage.
 
There is just one issue with that: Cell is not a moving target; it is fixed hardware, even for the next few years. Bulldozer, on the other hand, is just a minor, short-term player, so no one will bother optimizing for it. Even if they did, those games would be out in 2-3 years, when Bulldozer will be obsolete anyway.

Bulldozer is probably a good design for highly threaded server applications, but not for general desktop usage.

I guess it's a sad day for AMD fans, then.
 
I was sincerely hoping my next build wouldn't be Sandy Bridge-based, but it looks like it will be now. I am not a hardcore AMD fan (rocking a Q9600 as we speak), but I definitely like the future-proofing mindset that AMD seems to have with their processors. Unfortunately, future-proofing means nothing if what you have now is BETTER than what just came out.

Seriously. I am sincerely hoping there is something about this lackluster performance that isn't being shown or said. Even if I didn't go AMD on my next build, it would still be nice to see Intel actually feeling uncomfortable again.
 
I guess I will look at AMD CPUs again when Win 8 is released.

Right now I don't feel compelled to replace this 1055T @ 4.1GHz based on what I've seen. It's fine to support a single GPU.

Now I'm waiting on the Samsung SSD 830 and AMD's 7000 series GPU.
 
I still can't believe that people expected single-core performance to be equal to or better than Intel's processors, holy lol.

And Civ5 has a lot of Intel code in it; it's not a good benchmark.

If I can get this and a mobo for less than a 2500K with a mobo, then I'm all for it: a great deal for gamers, and they won't be supporting a criminal, anti-competition, pro-monopoly company either!
 
I had an AMD build very, very briefly at the start of 2010, a 1090T with an MSI board, and I was very unhappy. In fact, my FPS in World of Warcraft when Cata launched was so low (this was with a 5870) and the latency so bad that I had to sit out raiding BWD, because I made a few mistakes I wouldn't have made otherwise had I had at least 40-50 FPS.

I eventually went to the 2600K SB and my FPS doubled or tripled.

AMD is bad for gaming; I don't care what anyone says. I had the 1090T OC'd to 3.9GHz, and I am sure BD wouldn't be much different. With my 1090T at 3.9GHz and everything turned up, I was seeing 23-26 FPS in raids and in SW. With SB, I saw 90 to 150+ FPS. Triple the frame rate.

With SB at 4.9GHz, in BF2, Metro 2033, World of Warcraft, and many other games, I saw 2x, and a few times even 3x, the FPS.

Bad AMD ... Bad.
 
I had an AMD build very, very briefly at the start of 2010, a 1090T with an MSI board, and I was very unhappy. In fact, my FPS in World of Warcraft when Cata launched was so low (this was with a 5870) and the latency so bad that I had to sit out raiding BWD, because I made a few mistakes I wouldn't have made otherwise had I had at least 40-50 FPS.

I eventually went to the 2600K SB and my FPS doubled or tripled.

AMD is bad for gaming; I don't care what anyone says. I had the 1090T OC'd to 3.9GHz, and I am sure BD wouldn't be much different. With my 1090T at 3.9GHz and everything turned up, I was seeing 23-26 FPS in raids and in SW. With SB, I saw 90 to 150+ FPS. Triple the frame rate.

With SB at 4.9GHz, in BF2, Metro 2033, World of Warcraft, and many other games, I saw 2x, and a few times even 3x, the FPS.

Bad AMD ... Bad.

Sure you did; I bet you also have three 580s in your system.
 
People should also remember that we are looking at the power usage of 4 cores vs. 8 cores.
Double the comparable watt usage and it starts to look perfectly fine to me.

Also, holy crap, I didn't realize my 920 was eating that much juice.

I don't think it really matters how many cores a system has, as we are looking at power consumption/performance ratios: basically, how much juice does it take to do the same job, regardless of core count (see the quick numbers below). The fact is that BD needs twice the cores to get the same job done, and even then only if the application is decently multithreaded. When that condition isn't met, performance is just crappy no matter how you put it.

Having had a Bloomfield, yeah, those things were absolute fire-breathing monsters, but they got the job done.

I can't wait to see Brent's upcoming tri-CFX/SLI review with BD, as I'm curious to see how each will do. We have seen in the past that a tri-SLI setup can be bottlenecked by a Bloomfield at 3.6GHz, while the tri-CFX setup didn't appear to be CPU-bound at those resolutions. I wonder if we will see a repeat of that...
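
On the perf-per-watt point above, a rough worked example with made-up numbers (not taken from either review):

```python
# What matters is the energy used to finish the same job, not wall wattage alone.
def energy_wh(system_watts, seconds):
    return system_watts * seconds / 3600.0  # watt-seconds to watt-hours

# Hypothetical rig A: draws 450 W at the wall, finishes a render in 300 s.
# Hypothetical rig B: draws 350 W, but needs 420 s for the same render.
print(f"rig A: {energy_wh(450, 300):.1f} Wh")  # 37.5 Wh
print(f"rig B: {energy_wh(350, 420):.1f} Wh")  # 40.8 Wh
# The hungrier rig still uses less total energy because it finishes sooner.
```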
 
SixFootDuo is okay with a future where there's only Intel for x86 processors and they have less of a reason to lower prices. Ever.

Hopefully he's the 1%, and we are the arts majors camping out in a park.
 
I had an AMD build very, very briefly at the start of 2010, a 1090T with an MSI board, and I was very unhappy. In fact, my FPS in World of Warcraft when Cata launched was so low (this was with a 5870) and the latency so bad that I had to sit out raiding BWD, because I made a few mistakes I wouldn't have made otherwise had I had at least 40-50 FPS.

I eventually went to the 2600K SB and my FPS doubled or tripled.

AMD is bad for gaming; I don't care what anyone says. I had the 1090T OC'd to 3.9GHz, and I am sure BD wouldn't be much different. With my 1090T at 3.9GHz and everything turned up, I was seeing 23-26 FPS in raids and in SW. With SB, I saw 90 to 150+ FPS. Triple the frame rate.

With SB at 4.9GHz, in BF2, Metro 2033, World of Warcraft, and many other games, I saw 2x, and a few times even 3x, the FPS.

Bad AMD ... Bad.

You obviously have no idea what you are doing. A 1090T runs WoW just fine. Hell, a Pentium-D would run WoW just fine.
 
I didn't have time to read all the pages in this thread, but I would be curious to see how CrossFire and SLI affect things with Bulldozer. It seems to me that when more cards are installed, more CPU power is used to process data in games, even at 5760x1200 (or maybe that's because I am using a 3x CrossFire 5770 rig). It seems to use more memory too. Or maybe it could make Bulldozer suck more? Guess I will have to wait for Kyle to do more testing.
 
Coming from a 1055T, I can say my Sandy Bridge runs it, quite literally, multitudes better.

I know WoW is a popular game, but it's largely based on 2004 tech. Sandy Bridge has insanely good single-thread performance, and even though Blizzard has worked on the engine and made it into a DX10/11 engine, I bet it still favors a couple of fast threads over 4+ threads.
 