Intel kills off consumer version of Larrabee

purefun65

http://news.cnet.com/8301-13924_3-10409715-64.html
Intel said Friday that its Larrabee graphics processor is delayed and that the chip will initially appear as a software development platform only.

This is a blow to the world's largest chipmaker, which was looking to launch its first discrete (standalone) graphics chip in more than a decade.

"Larrabee silicon and software development are behind where we hoped to be at this point in the project," Intel spokesperson Nick Knupffer said Friday. "As a result, our first Larrabee product will not be launched as a standalone discrete graphics product," he said.

"Rather, it will be used as a software development platform for internal and external use," he added. Intel is not discussing what other versions may appear after the initial software development platform product, or "kit," is launched next year.

Larrabee, a chronically delayed chip, was originally expected to appear in 2008. It was slated to compete with discrete graphics chips from Nvidia and Advanced Micro Devices' ATI graphics unit.

Intel would not give a projected date for the Larrabee software development platform and is only saying "next year."

Intel says its plans are unchanged to deliver this month the first chip with graphics integrated onto the CPU. This new Atom processor is referred to as "Pineview" (the platform is called "Pine Trail") and will be targeted at Netbooks.
 
Can't say I'm surprised. We've been hearing release date after release date for this sucker. Nvidia's CEO is finally right about something :p.
 
Well, it sounds like there probably won't be a Larrabee card in 2010, but it really doesn't sound like Larrabee is dead.
 
I think the point here is that the first generation silicon will not be used in a released graphics card (they probably realized that by the time it comes out, it will be obsolete.) The first generation silicon will be purely for development.

It heavily implies that the SECOND-generation silicon (die shrink, etc.) will be used in a graphics card for consumer use (i.e., when it will be competitive).
 
Told you so. Intel makes crappy integrated graphics, and that's all they will ever make till they change their idea of how GPUs should work.
 
I just wonder why Intel can't seem to develop a working, competitive chip with all the resources they have. They've had a graphics chip for years, they're not completely new in the business.
 
I said it before. Making a good GPU is a lot harder than making a CPU.

Now that Intel has dropped a generation, maybe it's time to give up?
 
I just wonder why Intel can't seem to develop a working, competitive chip with all the resources they have. They've had a graphics chip for years, they're not completely new in the business.

Because Intel does not want anybody using graphics chips. They want everybody to do everything on an Intel CPU. Every graphics "solution" Intel has ever sold has only been produced to drive Intel CPU sales.

Larrabee is Intel's attempt to get software developers to abandon current programming methods that run on discrete graphics chips and have them shift to making their software run on a multi-core CPU (Larrabee is really just a bunch of CPU cores linked together). If Intel were to succeed there, they could eventually discontinue their "graphics" chip and push future CPUs with tons of cores. [H] had a news post this week about an Intel CPU with something like 48 cores. THAT is what Intel wants you to buy instead of an ATI or nVidia graphics card.
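For anyone curious what that actually means in practice, here's a rough toy sketch in C++ (my own made-up example, nothing to do with Intel's real Larrabee stack, and the "shader" function is invented) of the "graphics on a pile of general-purpose cores" idea: split the framebuffer across however many hardware threads you have and run an ordinary per-pixel function on each core, instead of handing the work to a discrete GPU.

// Toy illustration only: per-pixel work spread over general-purpose CPU threads.
#include <cstdint>
#include <thread>
#include <vector>

constexpr int kWidth  = 640;
constexpr int kHeight = 480;

// Stand-in "pixel shader": any per-pixel computation a GPU would normally run.
static uint32_t shade(int x, int y) {
    return ((uint32_t(x) * 255u / kWidth) << 16) | ((uint32_t(y) * 255u / kHeight) << 8);
}

int main() {
    std::vector<uint32_t> framebuffer(kWidth * kHeight);
    unsigned cores = std::thread::hardware_concurrency();
    if (cores == 0) cores = 4;

    // Interleave rows across threads: the "many small cores chewing on pixels"
    // model, done entirely in software on the CPU.
    std::vector<std::thread> workers;
    for (unsigned t = 0; t < cores; ++t) {
        workers.emplace_back([&, t] {
            for (int y = int(t); y < kHeight; y += int(cores))
                for (int x = 0; x < kWidth; ++x)
                    framebuffer[y * kWidth + x] = shade(x, y);
        });
    }
    for (auto& w : workers) w.join();
    return 0;
}

The more cores you throw at it, the faster it goes, which is exactly the pitch: sell you more x86 cores instead of a graphics card.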
 
I think a lot of us saw this coming. The real question, in my opinion, is whether Intel scraps the project completely or doubles down. If they continue at the pace they've been going, it seems like they will always be a generation behind.
 
Hmmm, weird. They were so proud a few days ago about passing the 1 teraflop barrier. Doesn't make any sense.
 
I've often thought Larrabee was more "scareware" meant to intimidate Nvidia and AMD than a real product the consumer would ever see.
 
I just wonder why Intel can't seem to develop a working, competitive chip with all the resources they have. They've had a graphics chip for years, they're not completely new in the business.
So maybe AMD buying ATI wasn't such a bad idea after all. Intel going into the graphics market this late in the game might just be a big headache for them. I was wondering if there are only so many patents to go around, with AMD and Nvidia having been in the game for so long. In a way I would like Intel to come out with something revolutionary.
 
Yes, true, but the i740 came out when Voodoo2s were the hot thing, and it was a decent little card. Ran 2D decent, decent 3D, etc. Ran one with my Voodoo2s in SLI.
 
Can't say I'm surprised. We've been hearing release date after release date for this sucker. Nvidia's CEO is finally right about something :p.

Intel has never designed a gaming graphics chip, so I'm not surprised. Before someone says i740, they bought that lock, stock, and barrel and let it languish into Intel "Extreme" graphics.
 
Intel has never designed a gaming graphics chip, so I'm not surprised. Before someone says i740, they bought that lock, stock, and barrel and let it languish into Intel "Extreme" graphics.

Is anyone really surprised? I predicted its failure over 2 years ago.

CPU engineers cannot design a GPU
:rolleyes:
 
I find it funny how Intel can make top-class CPUs but they could never make even a half-decent GPU for the life of them. Go figure.
 
I think Fusion is going to be on the market before Larrabee. Intel could release a product with what they have, but they know it is going to be crap that will be laughed out of the market.
 
Yes, true, but the i740 came out when Voodoo2s were the hot thing, and it was a decent little card. Ran 2D decent, decent 3D, etc. Ran one with my Voodoo2s in SLI.

And wasn't designed by Intel. This baby was designed by Real3D.
 
I find it funny how Intel can make top-class CPUs but they could never make even a half-decent GPU for the life of them. Go figure.

Why? People here like to blur the lines between CPU and GPU, but that doesn't work in the real world. They are two different products that have very different development priorities. I never expected Intel to compete with ATI and Nvidia's higher end cards. Maybe Larrabee would have been competitive against the midrange cards (like 4670/9600 GT segments), but anyone who expected it to compete against the high end cards is a fool.
 
I was hoping that with their work on Larrabee, the Intel GMA rubbish would eventually improve, and it still might, someday... but it'll still be miles behind red/green's stuff. :(

I grow tired of family and friends coming to me with their GMA-powered craptops asking why their games are not playing well or at all, and if I can fix it for them. It's always "sorry, you have Intel craphics and it won't get any better, best stick with the Popcap/Reflexive type games..."
 
I grow tired of family and friends coming to me with their GMA-powered craptops asking why their games are not playing well or at all, and if I can fix it for them. It's always "sorry, you have Intel craphics and it won't get any better, best stick with the Popcap/Reflexive type games..."

haha, so true.
 
They should have done a mock-up card imo, just to stall the news, instead of coming straight out and being open about it.
 
And now for a huge:

"I told you so."

Intel wasn't even half prepared to properly enter the graphics market. I've got friends who work for Intel and Nvidia and they all universally agree on the old saying that "Intel fails whenever they try to go beyond their core product." They rock the house on CPUs but that's all they know how to do.
 
I find it funny how Intel can make top-class CPUs but they could never make even a half-decent GPU for the life of them. Go figure.

The reason is obvious: x86 is their own stuff, they are the rule maker, they can mess around however they like, and compilers are optimized for their CPUs. With the GPU, that is where the real tech counts, and that's where they suck. Intel fails in any open competition, including the 64-bit CPU instruction set. :D
 
including the 64-bit CPU instruction set. :D

Isn't the whole reason we have a 64-bit instruction set due to AMD and their desire to keep the x86 platform alive? Wasn't Intel eventually going to move everything off x86?
 
Jeez. There is so much Intel hatred here.

IMO, it is disappointing that Intel's graphics are so far behind schedule... but people who don't think Intel can pull off graphics are fooling themselves.

Why on earth would you think the company behind Pentium, Conroe, and i7 wouldn't be able to pull off massively parallel processing? Intel has massive potential.

My only hope is they don't lose focus like NVIDIA. It seems like Tegra, Ion, Tesla, desktop, and laptop graphics were too much for them to handle. But if Anand's not worried, neither am I. :)
 
IMO, it is disappointing that Intel's graphics are so far behind schedule... but people who don't think Intel can pull off graphics are fooling themselves.

Why? Intel hasn't made a decent performing, feature rich GPU driver ever. History isn't on their side.

Why on earth would you think the company behind Pentium, Conroe, and i7 wouldn't be able to pull off massively parallel processing? Intel has massive potential.

Because they are two different things? That's like saying "Lamborghini gave us awesome super cars, they must be experts at building pickup trucks!" Different tech, different concepts, different goals. Also, CPUs *aren't* massively parallel, so why would you think knowledge of CPUs would help with making a GPU? Because it really doesn't - they are extremely different in their architectures and strengths.

You are also talking about the company behind Prescott and claims of 10 GHz CPUs - see how well that one worked out?

Isn't the whole reason we have a 64-bit instruction set due to AMD and their desire to keep the x86 platform alive? Wasn't Intel eventually going to move everything off x86?

I think Intel bet the farm on Itanium, but nobody wanted a painful 32-bit -> 64-bit switch.

They rock the house on CPUs but that's all they know how to do.

They do pretty well on chipsets, too. X58, anyone? And of course there is the Centrino platform...
 
Isn't the whole reason we have a 64-bit instruction set due to AMD and their desire to keep the x86 platform alive? Wasn't Intel eventually going to move everything off x86?

Yes, actually, you can thank AMD and Microsoft. Intel was so excited about eventually moving everyone to Itanium, they never considered extending x86. AMD jumped on this while Intel dealt with their Itanium problem, and announced x86-64, an open standard they encouraged all x86 makers to adopt. Intel did nothing public in response, but privately they began developing their own implementation just in case.

Things got real when AMD convinced Microsoft to create a version of Windows for their new processor (XP 64). Intel then publicly announced their own (incompatible) version of 64-bit x86, and tried to get Microsoft to make yet another version of XP 64 for their new processor (just assuming they would outsell AMD and shut them out of the new market).

Microsoft proclaimed that they would not make another version of Windows, so if Intel wanted to be part of the party, Intel would have to create a fully-compatible implementation of x86-64.
 