Intel: CUDA Just A Footnote In Computing History

HardOCP News

[H] News
There has been a bit of back and forth banter between NVIDIA and Intel over the last few months that gets a little juicier each time one of the two camps gets the opportunity to slam the other. This week’s “I challenge you to a duel” slap with the white glove came in the form of these comments made by Intel.

In a Q&A session after announcing Intel’s 40th birthday, we asked Intel’s senior vice president and co-general manager of Intel Corporation's Digital Enterprise Group, Pat Gelsinger, where he saw GPGPU languages such as CUDA in the future. He said that they would be nothing more than ‘interesting footnotes in the history of computing annals.’

Is it just me, or does that quote above sound a lot like Jack Black in School of Rock when he gives the whole “..you'll be a funny little footnote on my epic ass” speech? Definitely quote of the day.
 
To an extent they're telling the truth. But at the same time, that's like saying DirectX and OpenGL are just "footnotes". CUDA is about the only selling point I see in NVIDIA at the moment.

To me CUDA is like DX10. If Vista didn't have DX10, 90% of the gaming population that bought it probably wouldn't have, because there really wasn't any other selling point to the new OS. CUDA, to me, is a huge selling point for any NVIDIA 8000 series card and up.

Also, with Intel and NVIDIA at war with each other, it's no shocker that they'll try their best to steal each other's thunder. Keep posting this stuff though; watching the two go at it is quite entertaining, to say the least.
 
In response to the above comments, Nvidia's CEO has hidden a dead fish somewhere in Pat's office, and stuck a potato in his car's tailpipe.
 
To me CUDA is like DX10. If Vista didn't have DX10, 90% of the gaming population that bought it probably wouldn't have, because there really wasn't any other selling point to the new OS. CUDA, to me, is a huge selling point for any NVIDIA 8000 series card and up.

True, DX10 was what ultimately swayed me into upgrading to Vista, but I'll fully admit that since then DX10 has been a footnote compared to the increased stability, convenience, and speed (if you have the hardware) Vista has offered me, not to mention it looks a bit better to boot. Oddly enough, though, I've noticed some very subtle but quite nice improvements in lighting and fluids/particles even in non-DX10 games. I noticed this with the same hardware: I had my 8800GTS before and after the move to Vista, and only recently (about six months into using Vista) upgraded the mobo/RAM/CPU/HDD/etc. to what they are now. Which is obviously odd.
 
To an extent they're telling the truth. But at the same time, that's like saying DirectX and OpenGL are just "footnotes". CUDA is about the only selling point I see in NVIDIA at the moment.

To me CUDA is like DX10. If Vista didn't have DX10, 90% of the gaming population that bought it probably wouldn't have, because there really wasn't any other selling point to the new OS. CUDA, to me, is a huge selling point for any NVIDIA 8000 series card and up.

Also, with Intel and NVIDIA at war with each other, it's no shocker that they'll try their best to steal each other's thunder. Keep posting this stuff though; watching the two go at it is quite entertaining, to say the least.

NVIDIA is at war with both AMD and Intel right now. As it stands, NVIDIA is screwed unless they get a workable product that can serve as the main processor in a PC. AMD has Fusion, and Intel has Larrabee and their own version of Fusion. NVIDIA needs to get more shit done faster or they are boned. Intel showed off Quake Wars ray-traced at a fairly reasonable framerate. Another few years and it could go more mainstream.

My honest opinion is that NVIDIA got so overconfident about their domination of the GPU market that they lost track of the bigger picture. Intel has quite a bit more resources than NVIDIA to produce real architecture changes that will lead the future and make the standalone GPU obsolete. In my book, NVIDIA and Intel should be working together instead of cock-blocking each other.

Just my opinion tho.
 
Microsoft will not let any company "own" that code, open or not. They will suck up the standard.
 
NVIDIA is at war with both AMD and Intel right now. As it stands, NVIDIA is screwed unless they get a workable product that can serve as the main processor in a PC. AMD has Fusion, and Intel has Larrabee and their own version of Fusion. NVIDIA needs to get more shit done faster or they are boned. Intel showed off Quake Wars ray-traced at a fairly reasonable framerate. Another few years and it could go more mainstream.

My honest opinion is that NVIDIA got so overconfident about their domination of the GPU market that they lost track of the bigger picture. Intel has quite a bit more resources than NVIDIA to produce real architecture changes that will lead the future and make the standalone GPU obsolete. In my book, NVIDIA and Intel should be working together instead of cock-blocking each other.

Just my opinion tho.

Yeah... it definitely is... considering that ray tracing is never going to be used independently of rasterization in video games outside of Intel's dream world. And I certainly don't see graphics cards going anywhere. We are very, very far from any kind of theoretical graphical peak. Even if we do arrive at such a peak for 1920x1200 LCDs, the expected resolution will increase (as it already has), or new, more performance-demanding displays that can output more realistic images (for example, modern stereoscopic 3D monitors) will arise. Hell, even LCD manufacturers right now are basically saying that 1080p is nothing and that they want to make the standard LCD resolution twice that as the technology matures. Whether LCDs will last that long without another technology overtaking them is another matter.
 
Intel talking smack about graphics is a laugh. AMD and NVIDIA have been slugging it out for ages and we benefit from that rivalry.

We see GPUs becoming capable, massively parallel, general-purpose computing platforms, and that's got Intel scared; if they say otherwise, they are lying.

What's Intel got? Crap. And vaporware. Their integrated graphics to date have been a joke, and that joke drags the entire industry down as developers sigh and say "damn, how do we get our app to run on all this craptastic Intel GMA hardware that cheap-ass OEMs love to ship?"

So the reason Intel talks so much smack is that it's all they've got to talk about. They need to put up or shut up.
 
CUDA Just A Footnote In Computing History
That is such a crap point.
It's just as relevant as saying "the Pentium is just a footnote in computing history" :rolleyes:
They are all bad losers :D
 
"CUDA Just a footnote in computing history"

Like Intel can talk? How about NetBurst? That would be a footnote, but it's more of an embarrassment. Sure, CUDA isn't the second coming, but soon we'll get a Folding@home client that uses CUDA.

The company that spent untold millions, perhaps billions, and all those years promoting the Pentium name had to throw it out after the Pentium 4 was such a disaster. I don't think they're in a position to criticize.... yet. CUDA hasn't really hit the mainstream. I think Intel is pissed because they don't have anything good to hit back with against CUDA or any discrete GPUs.
 
NVIDIA is at war with both AMD and Intel right now. As it stands, NVIDIA is screwed unless they get a workable product that can serve as the main processor in a PC. AMD has Fusion, and Intel has Larrabee and their own version of Fusion. NVIDIA needs to get more shit done faster or they are boned. Intel showed off Quake Wars ray-traced at a fairly reasonable framerate. Another few years and it could go more mainstream.

My honest opinion is that NVIDIA got so overconfident about their domination of the GPU market that they lost track of the bigger picture. Intel has quite a bit more resources than NVIDIA to produce real architecture changes that will lead the future and make the standalone GPU obsolete. In my book, NVIDIA and Intel should be working together instead of cock-blocking each other.

Just my opinion tho.

First, Fusion will *NOT* be in competition with discrete GPUs. Its primary competition will be integrated video. Second, Larrabee is going to be complete shit. It's going to have shit drivers. It's going to be slow. I say this because so far Intel has done nothing to suggest otherwise. Their current drivers suck, their current GPUs suck, and they have almost zero experience making discrete GPUs. Their first attempt is not going to rock anyone's world.

Likewise, NVIDIA isn't going ANYWHERE. They practically OWN the GPU market at the moment (although the 48xx series looks to win back some of that for ATI). Intel has nothing to compete with NVIDIA. The GPU most certainly will not become obsolete in the next decade or so. Ray tracing, while interesting, is not practical at all, is a waste of resources for minimal gain, and would still be faster on an NVIDIA GPU than on an Intel CPU, since it can easily be broken up into many independent threads.
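
To put that last point in concrete terms, here's a quick toy sketch of what I mean (my own throwaway code, not from NVIDIA or Intel; the resolution, camera, and sphere values are made-up placeholders): a brute-force CUDA kernel that shoots one primary ray per pixel at a single hard-coded sphere. Every pixel is its own thread, which is exactly the kind of embarrassingly parallel work a GPU is built for.

// Toy sketch: one CUDA thread per pixel, each tracing a single primary ray
// against one hard-coded sphere. All values (resolution, camera, sphere) are
// arbitrary placeholders; the point is the per-pixel independence.
#include <cstdio>
#include <cuda_runtime.h>

#define W 640
#define H 480

__global__ void trace(unsigned char* img)
{
    int x = blockIdx.x * blockDim.x + threadIdx.x;
    int y = blockIdx.y * blockDim.y + threadIdx.y;
    if (x >= W || y >= H) return;

    // Primary ray from the origin through this pixel (camera looks down -z).
    float dx = 2.0f * x / W - 1.0f;
    float dy = 2.0f * y / H - 1.0f;
    float dz = -1.0f;
    float inv = rsqrtf(dx * dx + dy * dy + dz * dz);
    dx *= inv; dy *= inv; dz *= inv;

    // Unit sphere centred at (0, 0, -3): solve |t*d - c|^2 = r^2 for t.
    const float cz = -3.0f, r = 1.0f;
    float dc   = dz * cz;                  // d . c (cx = cy = 0 here)
    float disc = dc * dc - (cz * cz - r * r);

    unsigned char shade = 0;               // background stays black
    if (disc >= 0.0f) {
        float t = dc - sqrtf(disc);        // nearest hit along the ray
        if (t > 0.0f)
            shade = (unsigned char)(255.0f * fminf(1.0f, 2.0f / t));
    }
    img[y * W + x] = shade;                // each thread writes its own pixel
}

int main()
{
    unsigned char* d_img;
    cudaMalloc((void**)&d_img, W * H);

    dim3 block(16, 16);
    dim3 grid((W + block.x - 1) / block.x, (H + block.y - 1) / block.y);
    trace<<<grid, block>>>(d_img);

    static unsigned char img[W * H];
    cudaMemcpy(img, d_img, W * H, cudaMemcpyDeviceToHost);
    cudaFree(d_img);

    // Dump a grayscale PGM so the result can be eyeballed.
    FILE* f = fopen("sphere.pgm", "wb");
    fprintf(f, "P5\n%d %d\n255\n", W, H);
    fwrite(img, 1, W * H, f);
    fclose(f);
    return 0;
}

Scale that up to real scenes and secondary rays and the work only gets more parallel, not less, which is why the "just do it on the CPU" pitch doesn't add up to me.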
 
And I hope INTEL is taking notes, because it's better than anything they ever put out in terms of graphics accelerators.
 
I really don't know how you could call the Pentium 4 a "disaster".

Well, looking at how the P4 turned out in relation to Intel's original claims that it could scale to 10+ GHz... yeah. The fact that a P4 running at 3.2GHz would get trashed by an Athlon 64 running at 2.0GHz was also sort of discouraging, but at least now they have to fight back against the megahertz myth that they themselves created around the P4. I guess "disaster" is a bit harsh, but really, it fell far short of the expectations they set for everyone.
 
The P4 is a POS processor!
Its pipeline is far too long for anything BUT multimedia operations (about the only thing the P4 was good at).
At pure number-crunching the P4 fails hard.

I had to do a load of benchmarks to convince my IT department that my department needed new PCs. When a 1.6GHz Pentium M runs circles around a 3GHz P4 (I'm talking 12 minutes to run a Simulink simulation versus 30 minutes, with the RAM, OS and everything else the same), you know the P4 is a POS.

I would take a P3 over a P4 any day, and definitely a Core 2. P4s run hot as well. The P4 was basically a marketing chip: Intel used clock speed to try to sell against AMD's actual performance.
 
First, Fusion will *NOT* be in competition with discrete GPUs. Its primary competition will be integrated video. Second, Larrabee is going to be complete shit. It's going to have shit drivers. It's going to be slow. I say this because so far Intel has done nothing to suggest otherwise. Their current drivers suck, their current GPUs suck, and they have almost zero experience making discrete GPUs. Their first attempt is not going to rock anyone's world.

Likewise, NVIDIA isn't going ANYWHERE. They practically OWN the GPU market at the moment (although the 48xx series looks to win back some of that for ATI). Intel has nothing to compete with NVIDIA. The GPU most certainly will not become obsolete in the next decade or so. Ray tracing, while interesting, is not practical at all, is a waste of resources for minimal gain, and would still be faster on an NVIDIA GPU than on an Intel CPU, since it can easily be broken up into many independent threads.

Intel is actually on top of the overall GPU market right now, once you count integrated graphics....
 
lol at those who say Intel drivers suck... their video solutions are pretty weak, but they definitely work as advertised for desktop work, and their chipsets are second to none when it comes to stability
 
NVIDIA is at war with both AMD and Intel right now. As it stands, NVIDIA is screwed unless they get a workable product that can serve as the main processor in a PC. AMD has Fusion, and Intel has Larrabee and their own version of Fusion. NVIDIA needs to get more shit done faster or they are boned. Intel showed off Quake Wars ray-traced at a fairly reasonable framerate. Another few years and it could go more mainstream.

My honest opinion is that NVIDIA got so overconfident about their domination of the GPU market that they lost track of the bigger picture. Intel has quite a bit more resources than NVIDIA to produce real architecture changes that will lead the future and make the standalone GPU obsolete. In my book, NVIDIA and Intel should be working together instead of cock-blocking each other.

Just my opinion tho.

Personally, I think Intel is talking big because of more than a little fear. Tesla/CUDA actually is quite impressive for both consumer and research applications. Intel has got to know that NVIDIA has them over a barrel a bit. There is NO way Intel can compete watt for watt, dollar for dollar, with Tesla/CUDA right now. NVIDIA took the idea of parallel processing to the extreme, and the technology has now hit the point where putting that many independently programmable processors on one chip is feasible. Heck man, I'd be worried if I were Intel too. Intel designs monolithic processors. They're great for many things, but as efficient as something like Tesla/CUDA? As powerful as something like Tesla/CUDA? Nope.

I'm sure AMD will be able to leverage their stream processors in a similar fashion, for consumer use at least. They'll need to get double-precision FP running to make it worthwhile for most research applications first, though.

Why is Intel worried? Ask yourself what's cheaper to build and run:

1) A 100-node Beowulf cluster of 2.5GHz Intel quad-cores?

2) A 50-node (or possibly smaller) Tesla/CUDA cluster that will still beat the Intel cluster?

If AMD/ATI goes whole hog (remember, they already run F@H on their cards) and does this as well, it's going to really shoot Intel in the a$$.

I've read/heard for years that ray tracing is coming, yet whenever the magic time arrives the tech demo turns out to be useless and everyone forgets about it. Heck, don't forget that Intel already makes 3D chipsets, mate. Yeppers, that 915 graphics chipset is extreme 3D. Never mind that it's more of a 3D decelerator. I don't count on Intel to do anything other than 2D graphics right. The i740 only worked because they bought a 3D graphics company, and the same goes for their 2D. Who are they going to buy now? Matrox? LOL.
 
I think it will be hard for CUDA to become a footnote in history when it has already shown itself to be incredibly powerful in REAL WORLD APPLICATIONS. This isn't tech demos; it's real people getting orders-of-magnitude differences in performance for orders of magnitude less in price.
 
Vista will be a footnote among OSes that failed to deliver on their promise, like DOS 7.0. Many businesses are keeping XP/2000 and waiting for Windows 7.
 
I like CUDA. I use CUDA. CUDA is part of the product I am building, so I have some bias. With that out of the way:

This is just more mud-slinging by a scared Intel. My opinion is if Intel wasn't worried about GPU computing and CUDA in particular, they wouldn't have anything to say about it. NVIDIA has done a great thing with CUDA, and I think they can keep their advantage if they continue to put resources into it.

Gelsinger is clueless, IMHO, and just spouting FUD. Here is the perfect example:
‘The problem that we’ve seen over and over and over again in the computing industry is that there’s a cool new idea, and it promises a 10x or 20x performance improvements, but you’ve just got to go through this little orifice called a new programming model,’ Gelsinger explained...

Meanwhile, Intel itself is saying the same thing about their "cool new idea":

http://arstechnica.com/news.ars/pos...xpensive-many-core-future-is-ahead-of-us.html

[The terascale] approach, Intel says, is the one to go for. Although it will probably cost more upfront—that kind of design needs to permeate the entire application and be built in from day one—it's going to yield dividends in the long run, because before too long, the processors we buy and use will have cores almost too many to count. Trying to tackle the parallel performance problem one core at a time might work at the moment, but this trend will not continue.

http://blogs.intel.com/research/2008/06/unwelcome_advice.php
Ultimately, the advice I’ll offer is that these developers should start thinking about tens, hundreds, and thousands of cores now in their algorithmic development and deployment pipeline.

As if redesigning algorithms to use thousands of cores isn't a "little orifice called a new programming model".

CUDA has some GPU- and NVIDIA-architecture-specific things in it, but I think it is more powerful than that. I believe CUDA as a language is a method to express fine-grained parallelism in a relatively easy way. There's no technical reason a CUDA program couldn't be compiled and optimized to run on multi-core or "terascale" CPUs.
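
As a concrete (if trivial) illustration of what I mean, here's a toy SAXPY kernel of my own (not from NVIDIA's docs), where the parallelism is expressed per element rather than tied to any particular chip:

// Toy example: SAXPY (y = a*x + y) written as a CUDA kernel. The parallelism
// is stated per element; nothing here is inherently tied to a GPU.
#include <cstdio>
#include <cuda_runtime.h>

__global__ void saxpy(int n, float a, const float* x, float* y)
{
    int i = blockIdx.x * blockDim.x + threadIdx.x;  // one logical thread per element
    if (i < n)
        y[i] = a * x[i] + y[i];
}

int main()
{
    const int n = 1 << 20;
    size_t bytes = n * sizeof(float);

    // Host data
    float* hx = (float*)malloc(bytes);
    float* hy = (float*)malloc(bytes);
    for (int i = 0; i < n; ++i) { hx[i] = 1.0f; hy[i] = 2.0f; }

    // Device copies
    float *dx, *dy;
    cudaMalloc((void**)&dx, bytes);
    cudaMalloc((void**)&dy, bytes);
    cudaMemcpy(dx, hx, bytes, cudaMemcpyHostToDevice);
    cudaMemcpy(dy, hy, bytes, cudaMemcpyHostToDevice);

    // Launch enough 256-thread blocks to cover all n elements
    saxpy<<<(n + 255) / 256, 256>>>(n, 3.0f, dx, dy);

    cudaMemcpy(hy, dy, bytes, cudaMemcpyDeviceToHost);
    printf("y[0] = %f (expect 5.0)\n", hy[0]);

    cudaFree(dx); cudaFree(dy);
    free(hx); free(hy);
    return 0;
}

Nothing in that kernel cares whether the "threads" end up as GPU warps or as loop iterations spread across CPU cores; a hypothetical CPU backend could map blockIdx/threadIdx onto an outer loop over cores and a vectorized inner loop. That's all I mean by saying there's no technical barrier.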

In the end, I think Intel realizes they are late to the massively multi-core game. Continuing to lose HPC market share and media attention to NVIDIA has them worried.
 
x86 IA cores in Larrabee for video acceleration. Hmmm. So we'll get faster and more detailed 3D Excel spreadsheets?

C'mon! Intel expects everyone to go back to the stone age of software-rendered graphics? Throwing 16-24 simplified x86 cores into a hybrid CPU/GPU chip, for which Intel is probably going to have to provide some kind of graphics wrapper to translate DX9/DX10 calls, is the best thing they can come up with?
 
lol at those who say Intel drivers suck... their video solutions are pretty weak, but they definitely work as advertised for desktop work, and their chipsets are second to none when it comes to stability

Intel's GPU drivers are absolutely disgusting. They make me cry.

C'mon! Intel expects everyone to go back to the stone age of software-rendered graphics? Throwing 16-24 simplified x86 cores into a hybrid CPU/GPU chip, for which Intel is probably going to have to provide some kind of graphics wrapper to translate DX9/DX10 calls, is the best thing they can come up with?

To Intel, apparently, everything should be solved using x86 :rolleyes:
 
For one, that has got to be the most asinine comment I have ever heard come from Intel.

Two, I have a feeling this is going to be another Itanium fiasco where it doesn't perform like it should.
 