It's obviously broken on AMD if the Fury X is performing around a 480 and lower than a 1060. Epic is completely in bed with Nvidia and wouldn't give AMD the time of day.
Still, the misleading story was all over the internet for three days and there's no way people at AMD didn't know about that. And yet no one bothered to set the story straight until now. Even with a gutted PR department, that's pretty lame.
Wow! We were all told by how many tech sites that there was something coming from RR about Project Win in three days, and on the day it's supposed to happen, *that's* when they tell us it's internal only!? Those staff cuts to PR must really be deep if no one could be bothered to clear things up...
Why would Nvidia want to merge with AMD? Their CPUs suck; incredibly, BD is a step backwards compared to Phenom, and I highly doubt we'll see much improvement. The only good stuff from AMD these days are Radeons, which for Nvidia are redundant for graphics and behind in GPGPU.
I was thinking the same thing. Probably the person who thought of the idea and who was going to distribute the questions to the various departments and collect the answers is gone.
The module design aspect works pretty much as advertised. The problem is awful single thread performance (which indirectly slows multi-thread performance). If they took out the second integer unit from the module, it would still make for a pretty crappy 4-core CPU.
Higher clocks would've helped, but the design or its implementation was lacking. Even with higher clock speeds than its predecessor, it often fails to beat it. The only reason it beats the 1100T at anything is because they threw more cores at it. When the per-core performance goes down even further...
Is the floating point unit truly shared between cores for 128-bit code? In other words, does one core get full access to both FMACs for 2×128-bit operations if the other core is not using the FP unit?
They don't know how to make good CPUs any more, that much is clear. From now on, no one will be surprised by another AMD CPU unless it's a good one. When you consider that the first iteration of BD that was supposed to come out a couple years ago was so bad they had to pull the plug and scrap it...
Other than the constant delays, one of the first warning signs for me was, ironically, that overclock record. AMD made a big deal about it, and while it's nice to have, it's ultimately meaningless. Anytime a company puts a lot of focus on hyping fluff like that, it usually means they got nothing...
AMD has had a long time to fix problems in BD with Piledriver, but all they expect is a 3-5% increase in IPC from increasing table structure sizes and the like. They obviously have no intention of fixing the cache problems (or whatever it is hampering performance) any time soon, probably never...
JF blames the engineers, but here's what I think happened (pure speculation). Some AMD folk who didn't directly work on the creation of BD were shown that if the software took advantage of new instruction sets on BD, IPC would increase. That was interpreted by some to mean IPC in general increased.
You mean when Nvidia bullies developers to increase tessellation to levels that create massive amounts of artificial, useless work for the GPU? Is that what you're talking about? Bravo. :rolleyes:
If it's true, it's sad. Not only are they getting the disadvantage of automated tools (let's say up to 20% bigger and slower), but they aren't getting the benefits. The benefits should be faster implementation and faster changes.
But they didn't get that. First, it took forever to get out...
I see this comment all the time, but a monopoly isn't illegal, just abusing it is. If AMD went away, it's not automatic that the government would step in and do something.
I have no doubt whatsoever that MS is going with another IBM CPU, for backwards compatibility, plus IBM can design it to Microsoft's specification. There's no chance at all the next Xbox CPU is anything but IBM.
If this and other stuff we've been seeing is what we'll be getting on the 12th, then AMD blows beyond belief. They could've made an X8 on the 32nm process with Llano cores for better ST and MT performance, probably a smaller die and no doubt saved truckloads of cash on R&D. Five years in...
I'd say you're the one missing the mark. If that's what your wife knows about computers, no one would be asking her for advice about them and you wouldn't let her influence your decision about buying one together.
My wife hardly knows anything about computers. Two of her sisters were buying...
Got tired of delay after delay with no release date forthcoming, so I finally just got an AM3+ board and a cheap 965 to go with it. Since 4 cores is more than enough for gaming anyway, I won't be upgrading to a BD any time soon.
Still anxious to find out how it stacks up, but I don't think...
I'm guessing a few of you have dealt with XFX rebates before. I bought an XFX 6870, which entitled me to a mail-in rebate, which I sent. Yesterday I got an email that they had received it, but when I clicked the link to their rebate zone tracking website, my address was all messed up. If my...
If BD is a true competitor to Nehalem in lightly threaded games and other apps, I'll consider that pretty good. I think that's the best AMD will be able to muster.
I can't find it now, but I've read the average packing rate in games was 3.4 slots per VLIW instruction (not sure if that was all shaders or just pixels). That's pretty good. It's more efficient in terms of die space than Nvidia.
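To put that number in perspective, here's a back-of-the-envelope sketch (the 3.4 figure is the one quoted above; the width of 5 is the standard VLIW5 slot count) showing what that packing rate means for slot utilization:

```python
# VLIW5 issues up to 5 ops per instruction; an average packing rate
# of 3.4 means roughly two thirds of the slots do useful work.
VLIW_WIDTH = 5
avg_packed = 3.4

utilization = avg_packed / VLIW_WIDTH
print(f"slot utilization: {utilization:.0%}")  # → 68%
```

So even with a third of the slots idle on average, the smaller, simpler VLIW units can still come out ahead per square millimeter of die.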
So every test was done using an additional Radeon...(?) How about some gaming benchmarks that show just what the Llano chip itself can do? That's probably more relevant for more people. Yeah, you'd have to turn down some settings and maybe the resolution, but it should still be able to provide a...
I see ATI getting a lot of flak over the price, but what do you people expect? It's a second salvage part, it's doubtful there are a lot of them, especially going forward, and I'm sure they figure they can sell most of them for $230. Or they could sell about the same quantity for cheaper and...
The 4800 series is closer to the 5800s than the 3800s, but I consider everything from the 2900XT up to the 5870 to be in the same general architectural family.
When ATI introduced the 2900, they did call it "VLIW superscalar", but I'm pretty sure it was more about marketing than anything else. Nvidia had gone with a scalar architecture, and "superscalar" just sounds better.