Spy Pics: Intel Sandy Bridge Die Shot?

HardOCP News

Could this be a spy shot of Intel’s upcoming Sandy Bridge processor? Who knows? The website’s name is Canard (definition: a false or baseless report), so that’s not an encouraging sign, but from what I can get from the translation, this is supposed to be a quad-core processor with a built-in GPU and 8MB of shared cache. Real or not, it is always fun to speculate. Hit the link for more info; here is the die shot they have posted.
 
The way the die reflects light is not consistent, so I bet they photoshopped the pic.
 
I think Intel could save some money by not outlining all the parts on the die and writing brief descriptions of each of the parts. White ink at that size and with that precision has got to be expensive.
 
The part that stands out is that AMD is falling further behind, and that's disturbing to certain people. ;)
----

Not all die shots are specially prepared marketing micrographs, especially not a leak. The picture may have been purposely blurred, and some of the blocks look like they were color-highlighted, which I've seen in many other die shots. But fake? Nothing stands out as being out of place. The cores are different from Nehalem or prior cores, and the graphics section doesn't look like anything that's been released previously.
 
Bah Humbug Intel. Even if it is real, I'm still not comfortable with an on-die (CPU die that is) GPU.

-EGA
 
Bah Humbug Intel. Even if it is real, I'm still not comfortable with an on-die (CPU die that is) GPU.

-EGA

There will almost certainly be models without the integrated GPU, since it would not be useful to the high end, but the low end would benefit from it.
 
Laptops are certainly going to benefit from it, as will their batteries.
 
The way the die reflects light is not consistent, so I bet they photoshopped the pic.
No shit they photoshopped it. You didn't seriously think all that text and those yellow lines were really part of the CPU, did you? :p
 
The part that stands out is that AMD is falling further behind, and that's disturbing to certain people. ;)
----

Not all die shots are specially prepared marketing micrographs, especially not a leak. The picture may have been purposely blurred, and some of the blocks look like they were color-highlighted, which I've seen in many other die shots. But fake? Nothing stands out as being out of place. The cores are different from Nehalem or prior cores, and the graphics section doesn't look like anything that's been released previously.
Exactly. I'm really puzzled why anyone thinks it's fake. What benefit is there?
 
Bah Humbug Intel. Even if it is real, I'm still not comfortable with an on-die (CPU die that is) GPU.

-EGA

What is the problem with the GPU being on the CPU die? Even if you wouldn't use it for displaying graphics, it can still be used for running calculations it handles better than a CPU.
 
I feel like the integrated GPU takes away from the CPU's die area and therefore its potential to be better, e.g. it could have a bigger cache and so on. But I'm no expert at analyzing these things. Maybe in ultra-portable laptops something like that could be good, but not in desktops, except maybe the low/lower mainstream.
 
Intel has talked about going this route. I can see these being a big hit in notebook PCs and small... netbooks, I mean. Since you never change out the GPU on them anyway, why not?
 
I feel like the integrated GPU takes away from the CPU's die area and therefore its potential to be better, e.g. it could have a bigger cache and so on. But I'm no expert at analyzing these things. Maybe in ultra-portable laptops something like that could be good, but not in desktops, except maybe the low/lower mainstream.

You ever pop the lid on a CPU?
 
What parts of the shot stand out?

Just look at the colour tone; it is uneven. The lower side of the graphics core is bright, the upper side of core #1 is darker, and the right side of the northbridge is also bright. Why blur it? It's not like anyone can copy the design, since you can't see what's in the 32nm process from a small-resolution picture.
 
Just look at the colour tone; it is uneven. The lower side of the graphics core is bright, the upper side of core #1 is darker, and the right side of the northbridge is also bright. Why blur it? It's not like anyone can copy the design, since you can't see what's in the 32nm process from a small-resolution picture.

blur = protecting source of photo...
 
Could be PCI-E v3. Or it could be another aspect of being the mainstream chip. Dunno. I expect X78 (probably the name for the s1366 chipset) will be v3.
 
Bah Humbug Intel. Even if it is real, I'm still not comfortable with an on-die (CPU die that is) GPU.

-EGA


Ehh, what's wrong with that? It will be sweet for low-power setups! Also I think AMD is planning on doing something like this too...
 
Also I think AMD is planning on doing something like this too...

Yes, AMD was supposed to have that (Bulldozer) by early next year, but that was before they got delayed by all the Phenom 1 issues and other problems.

BTW, this was supposed to be a quad-core or better with integrated ATI GPU(s).
 
Bulldozer is the name for the micro-arch, like "Nehalem" is. Fusion is their name for the on-die GPU. It'll be a while before we see either of them. The ETA is around the same time as Sandy Bridge, assuming they don't have any more delays (laughable).
 
I heard 2011 for Bulldozer. Personally I am really excited for the 32nm dual-core laptop chips Intel will be coming out with later this year...
 
Intel GPU = horseshit. And not likely to change soon. Unless they have some radical redesign that they've been working on for years, and have neglected to integrate ANY of it into their current products (which I suppose is possible).

I would cream my pants for an on-die stream-processor GPU from Nvidia (never gonna happen) or AMD (obviously with an AMD CPU).

AMD needs to do this fast; it would be awesome, as their GPUs are actually worth something beyond gaming. Even for the high end, you could run physics on the on-die GPU and use your discrete GPU for graphics.
 
The integrated GPU is not meant for *you*, just the 80%+ of corporate and non-gaming users. Geez. :rolleyes: You're not the only user on the planet.

AMD's IGPs are still sub-low-end. I don't understand why people get so excited about it. The current fastest ATI IGP is about half the speed of a three-generation-old, mid-grade, and totally unexciting X1600.
 