Intel's Larrabee Chip Needs Reality Check?

HardOCP News

According to this report, some analysts believe that Intel needs a reality check when it comes to Larrabee’s ability to compete with NVIDIA and ATi.

"Intel claims Larrabee is a graphics engine intended to outperform Nvidia's GPU offerings. The audacity of this claim is startling," according to a report issued by Ashok Kumar, an analyst at investment bank Collins Stewart. "Nvidia has had over 10 years to optimize the 3D graphics pipeline, the necessary drivers, the platform connections needed to supply the memory bandwidth required, and to work with the software and apps developers," he writes.

Let’s play Devil’s advocate for a moment. Without comparing technologies or predicting “winners,” doesn’t it stand to reason that if you are ATi or NVIDIA and currently dominate the discrete graphics market, the introduction of another GPU (even one that is only marginally successful) will cut into your share of the pie? Surely I am no expert, but that would mean Intel has everything to gain while the current GPU giants have everything to lose? You can find more info on Intel’s Larrabee here.
 
Larrabee is something special. It's the sort of device that may evolve into what we start trying to replicate the human brain on.
 
Well, it's not like Intel doesn't have some experience in the graphics field, however successful it has been on the integrated front. But hey, this is the world of analysts who know everything and anything. Intel is bound to fail. :p
 
I'm on the wait-and-see boat. It's difficult to believe that Intel can compete with nVidia and ATi considering their current IGP.
 
Hmm - perhaps the analyst owns ATI or Nvidia stock. I myself think another graphics manufacturer in the marketplace is great - competition is good for the consumer.
 
I myself think another graphics manufacturer in the marketplace is great - competition is good for the consumer.

I agree! Competition is good for the consumer all the way around...it spurs innovation and keeps pricing competitive.
 
It may take a few years for the platform to fully blossom, but Intel will be setting the standard, if for nothing more than their market presence.
 
Let’s play Devil’s advocate for a moment. Without comparing technologies or predicting “winners,” doesn’t it stand to reason that if you are ATi or NVIDIA and currently dominate the discrete graphics market, the introduction of another GPU (even one that is only marginally successful) will cut into your share of the pie? Surely I am no expert, but that would mean Intel has everything to gain while the current GPU giants have everything to lose? You can find more info on Intel’s Larrabee here.


I think Intel has a lot to lose actually.

While even a marginally successful product may eat into the low-end offerings of Green and Red, what about the costs Intel has poured into its card? The billions in R&D will take years to recoup with a successful product, and even longer with one that is only slightly successful.

It seems to me that Intel needs to make a big splash with this card to get enthusiasts' attention, or else it will go the way of Matrox and S3.
If they get the attention of enthusiasts, the momentum will be great; otherwise it will quickly be pushed into exile.
 
If they get the attention of enthusiasts, the momentum will be great; otherwise it will quickly be pushed into exile.

The enthusiast market is relatively small. I don't see why Intel's success would hinge on that. Even Nvidia and Ati make their bread and butter on the mainstream market, not the bleeding edge enthusiast technology.

Yeah, Intel is pouring a lot of money into this project, but it seems to me their focus is more on creating a market shift in graphics processing. A change for the long term, not just trying to be king of the hill for a day like the battle between Ati and Nvidia.
 
Well, it's not like Intel doesn't have some experience in the graphics field, however successful it has been on the integrated front. But hey, this is the world of analysts who know everything and anything. Intel is bound to fail. :p

Yeah, it must boggle the minds of some analysts that Intel is the #1 graphics vendor in the world.
 
The enthusiast market is relatively small. I don't see why Intel's success would hinge on that. Even Nvidia and Ati make their bread and butter on the mainstream market, not the bleeding edge enthusiast technology.

Yeah, Intel is pouring a lot of money into this project, but it seems to me their focus is more on creating a market shift in graphics processing. A change for the long term, not just trying to be king of the hill for a day like the battle between Ati and Nvidia.

Enthusiasts drive the graphics market. If the best Intel can do is to produce a ~$100 card that competes at the lower midrange, devs will not worry much about supporting it, and people who AREN'T enthusiasts won't buy it because they will have been told by their enthusiast relatives or friends that nVidia or ATI is the way to go when they're looking for an upgrade to play WoW or StarCraft 2 on.

Enthusiasts may not account for much in terms of total sales volume, but I promise you they have a big impact as a group.
 
I wouldn't say "enthusiasts" drive the graphics market, but rather "gamers" or "games" drive the graphics market. There's a difference.

Grandma can be a gamer using HP computers she buys from Best Buy instead of building [H] rigs. I think that's what Intel is shooting for.
 
I agree! Competition is good for the consumer all the way around...it spurs innovation and keeps pricing competitive.

Yeah, but Intel? Is that good for long-term competition... Intel practically runs the show in the CPU/chipset market... do you really want... Intel CPU, Intel RAM, Intel board, Intel GPU, Intel case, Intel PSU, Intel mouse, Intel keyboard, etc., etc.??
 
Yeah, but Intel? Is that good for long-term competition... Intel practically runs the show in the CPU/chipset market... do you really want... Intel CPU, Intel RAM, Intel board, Intel GPU, Intel case, Intel PSU, Intel mouse, Intel keyboard, etc., etc.??

Yes then they can become the Apple of PCs.
 
Enthusiasts drive the graphics market. If the best Intel can do is to produce a ~$100 card that competes at the lower midrange, devs will not worry much about supporting it, and people who AREN'T enthusiasts won't buy it because they will have been told by their enthusiast relatives or friends that nVidia or ATI is the way to go when they're looking for an upgrade to play WoW or StarCraft 2 on.

Enthusiasts may not account for much in terms of total sales volume, but I promise you they have a big impact as a group.

I think you hit the nail on the head there.

Also to consider, though, is that selling a card that is only ~$100 does not allow them much of a markup. They are investing a hell of a lot of money just to sell low-end parts with low markup. Making only low-end parts is not a good investment for Intel; they have to make higher-end stuff with higher markups to recoup the R&D.
 
Personally I hope that Larrabee provides some more competition in the graphics market. When one side dominates, you see things like NVidia waiting a year and a half to release a card faster than the 8800 GTX.
Imagine a Larrabee on a 32nm process running at over 1GHz. Intel's lead in process technology will provide the same kind of advantage that Intel has with CPUs.
 


I suck at trying to do that; it never sounds proper to me and I end up deleting what I've written before posting.

I'll let someone else explain the synergy between learning software and a hardware/software package that can be reconfigured nearly any way.
 
Well, it's not like Intel doesn't have some experience in the graphics field, however successful it has been on the integrated front. But hey, this is the world of analysts who know everything and anything. Intel is bound to fail. :p

Intel hasn't really been successful on the integrated front... they have always lagged behind ATI in performance and features. The reason Intel has market share is the billion office stations they sell through the likes of Dell and such.
 
What terrible reasoning. It doesn't matter if Nvidia had a billion years to perfect their technology. Anyone could come along with something new and better.
 
I am reminded of the time when Intel first got into the graphics market and their first AGP cards were supposed to be the start of a new push for gaming graphics cards. While their first AGP cards were actually fairly decent for the time, they never followed through, and the result is what we have today: integrated graphics that are good for little more than low-end work. Given how much Intel tried to pump up their entry into the graphics market back then, I'm leery of what they're saying now.
 
All the greatest bleeding-edge technology in the world will be useless if Intel cannot get any developer support. Programming for Larrabee will be radically different from what devs are currently used to. This means a lot of time and money will need to be spent on training and development of new tools. With the market in its current downturn, I do not see devs and publishers willing to spend that kind of money. Look at the difficulties devs had with the Cell in the PS3, and how some even stayed away from the PS3 because of that.
 
I'm sorry, but isn't the whole reason for explaining something to someone that they don't understand it? :confused:
Unless that something is so easy and obvious the idiot shoulda known already. It all depends on whether or not he shoulda known.

What terrible reasoning. It doesn't matter if Nvidia had a billion years to perfect their technology. Anyone could come along with something new and better.

Yea, because of all your years of experience bringing graphics card technology to market, you know this? /sarcasm Impossible, no? Improbable, yes.

I am reminded of the time when Intel first got into the graphics market and their first AGP cards were supposed to be the start of a new push for gaming graphics cards. While their first AGP cards were actually fairly decent for the time, they never followed through, and the result is what we have today: integrated graphics that are good for little more than low-end work. Given how much Intel tried to pump up their entry into the graphics market back then, I'm leery of what they're saying now.
Haha, I remember that time too.
 
All the greatest bleeding-edge technology in the world will be useless if Intel cannot get any developer support. Programming for Larrabee will be radically different from what devs are currently used to. This means a lot of time and money will need to be spent on training and development of new tools. With the market in its current downturn, I do not see devs and publishers willing to spend that kind of money. Look at the difficulties devs had with the Cell in the PS3, and how some even stayed away from the PS3 because of that.

no, you need to do some reading.
 
Personally I hope that Larrabee provides some more competition in the graphics market. When one side dominates, you see things like NVidia waiting a year and a half to release a card faster than the 8800 GTX.
Imagine a Larrabee on a 32nm process running at over 1GHz. Intel's lead in process technology will provide the same kind of advantage that Intel has with CPUs.

+1
And, don't forget, Nvidia enjoyed high pricing all the while.

But I think the article is underestimating Intel. I've never seen Intel come out with such solid products, prices, and timing. They're simply kicking ass. I find it more plausible now that their attempt in the GPU space will be nothing short of successful.
 
I believe that as long as the guys at Intel stay true to the goal, it will work out.

They've been in the processor-building business for a long time, they had the chance to see what worked and what didn't in the past, and now they have seen a silent ATI come back with a big bang.

I can easily see Intel getting into it, much like how Microsoft wedged its butt into consoles.

And like everyone said, the important thing is the competition. Now we don't have to worry about Nvidia rehashing 8800s as 9800s, because there will be another competitor that hopefully will drive them to innovate.
 
+1
And, don't forget, Nvidia enjoyed high pricing all the while.

But I think the article is underestimating Intel. I've never seen Intel come out with such solid products, prices, and timing. They're simply kicking ass. I find it more plausible now that their attempt in the GPU space will be nothing short of successful.

Like their last time in the GPU space? ;)
 
I think most people are forgetting that Larrabee isn’t really a GPU.

It's full of x86 cores, and its DirectX and OpenGL support will be implemented in software on top of APIs and compilers, two things Intel is arguably the best in the world at.

I hope Intel has some success, as I like the direction they are taking the industry; the possible ability to upgrade your card's DirectX version without replacing any hardware is great.
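Just to make the "graphics in software" idea concrete, here is a minimal, purely illustrative sketch in C (this is not Intel's actual code or API, and the names are made up): on a chip built from general-purpose x86 cores, the pixel pipeline is just ordinary code, so something like a new DirectX feature level could, at least in principle, ship as a software update instead of new silicon.

Code:
/* Hypothetical sketch: a "pixel shader" and raster loop as plain C.
 * On a Larrabee-style design, each core would run code like this over
 * many tiles in parallel, using wide vector units rather than this
 * simple scalar loop. */
#include <stdint.h>
#include <stdio.h>

#define TILE 16  /* made-up tile size for illustration only */

/* The "shader": any C function mapping pixel coordinates to a color. */
static uint32_t shade(int x, int y) {
    uint8_t r = (uint8_t)(x * 255 / (TILE - 1));
    uint8_t g = (uint8_t)(y * 255 / (TILE - 1));
    return 0xFF000000u | ((uint32_t)r << 16) | ((uint32_t)g << 8); /* ARGB gradient */
}

/* The "raster back end": an ordinary loop over one framebuffer tile. */
static void rasterize_tile(uint32_t *fb, int pitch) {
    for (int y = 0; y < TILE; ++y)
        for (int x = 0; x < TILE; ++x)
            fb[y * pitch + x] = shade(x, y);
}

int main(void) {
    uint32_t fb[TILE * TILE];
    rasterize_tile(fb, TILE);
    printf("corner pixels: %08X %08X\n",
           (unsigned)fb[0], (unsigned)fb[TILE * TILE - 1]);
    return 0;
}

The point of the sketch is simply that nothing above is fixed-function hardware; swapping in different shading or rasterization behavior is a recompile, which is where Intel's compiler expertise would matter.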
 
Yeah, but Intel? Is that good for long-term competition... Intel practically runs the show in the CPU/chipset market... do you really want... Intel CPU, Intel RAM, Intel board, Intel GPU, Intel case, Intel PSU, Intel mouse, Intel keyboard, etc., etc.??

If it means console-like compatibility with the upgradeability of average PCs (we nearly have 100% AMD PCs as it is; aside from the PSU, memory, and case, the important parts are all there), minus the monopolistic values of Apple (and by that I mean the cutting out of their 3rd-party vendors in the late 90s for absolute control over hardware and software), I say bring it on. No reason not to buy a 100% Intel PC, as long as I can toss an ATI card in it if it happens to be faster.
 
All the greatest bleeding-edge technology in the world will be useless if Intel cannot get any developer support. Programming for Larrabee will be radically different from what devs are currently used to. This means a lot of time and money will need to be spent on training and development of new tools. With the market in its current downturn, I do not see devs and publishers willing to spend that kind of money. Look at the difficulties devs had with the Cell in the PS3, and how some even stayed away from the PS3 because of that.

no, you need to do some reading.

No, you do. Larrabee will suffer from the same "problem" Intel has been FUDding about Cell, CTM, Brook, and CUDA - Larrabee will require devs to go through Gelsinger's "little orifice called a new programming model" in order to program it.
 
You think Intel can't write a DirectX/OpenGL compiler? You do know Intel is very, very, very good at compilers.
 