Core 2 Extreme QX6700 vs PS3 Cell processor

You think? Can you explain how it would do that? Or at least link to an article that explains this?
In my experience it's generally incredibly inefficient to try to aid a GPU with its processing, because of locking issues on the framebuffer and such. Since the PS3 uses a standard nVidia GPU, I don't see how it would be different.

http://www.us.playstation.com/Media?id=15493
http://www.us.playstation.com/Media?id=15554
They may look like adverts to you, but this is what the game developer said.

Btw, if the RSX is a 7600GT, I would rather play a game on the PS3 than play the same title on a PC with a QX6700 and a 7600GT.
 
And I understand CPU power well enough not to throw away my money on a new CPU, mobo, and RAM that wouldn't help me much when playing games at a high resolution (note that my native resolution is 1920x1200). I'm pretty sure that even at 1600x1200 my crappy Opteron rig will be faster than your great C2D rig in these games:
Prey, Rise of Legends, Oblivion, Titan Quest, and Half-Life 2.
This is because my graphics-specific processor (7950GT) is faster than your graphics-specific processor (X1800XT). Even though your general-purpose processor (E6600) is faster than mine (Opteron), your performance at a high resolution is still limited by your graphics-specific processor. On the PS3, however, the Cell can be used to help the graphics-specific processor (RSX) with the graphics workload.
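The bottleneck argument above can be sketched as a toy model: CPU and GPU work largely in parallel, so each frame takes roughly as long as the slower of the two, and speeding up the CPU only pays off when the CPU is the limiting factor. All timings below are invented for illustration, not real benchmark numbers.

```python
# Toy model of the CPU/GPU bottleneck argument. Frame time is set by
# whichever processor is slower; all millisecond figures are made up.

def frame_time_ms(cpu_ms, gpu_ms):
    """CPU and GPU run concurrently, so the slower one sets the pace."""
    return max(cpu_ms, gpu_ms)

def fps(cpu_ms, gpu_ms):
    return 1000.0 / frame_time_ms(cpu_ms, gpu_ms)

# At a high resolution the GPU dominates, so a faster CPU changes nothing:
print(fps(cpu_ms=8.0, gpu_ms=16.0))  # slow CPU -> 62.5 fps
print(fps(cpu_ms=4.0, gpu_ms=16.0))  # fast CPU -> still 62.5 fps

# At a low resolution the GPU work is cheap and the CPU shows through:
print(fps(cpu_ms=8.0, gpu_ms=3.0))   # slow CPU -> 125.0 fps
print(fps(cpu_ms=4.0, gpu_ms=3.0))   # fast CPU -> 250.0 fps
```

This is also why the benchmark excerpt later in the thread sees all CPUs converge at 1600x1200: once the GPU term dominates the max(), the CPU term stops mattering.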

No it's not, if I decrease my resolution on my CRT, LOL! You do the same and post the results, please. Also note that if I wanted a video card upgrade to a 7950, it would easily blow yours out of the water at stock speed. Throw money? You know damned well money has almost nothing to do with it. Come on, be honest, it pains you to watch Intel kick AMD's ass this way, doesn't it? Here's a secret: if K8L comes out and kicks Intel's booty, I'll get one of them as well.

That said, that's still beside the point of me asking you to look at something and you then going off on me over something you assumed. That's crazy! I just linked you to Ars for their view on parallel processing, nothing more, nothing less.

Now regardless of how the QX6700 does vs. the 8800GTX, we ALL know it kicks the crap out of what the Green Arrow Team has.
 
It looks to me like you are the f@nb0y; it looks like you've waited too long for Intel to come back at AMD. I would buy a C2D if I were building a new PC now, but there is no point for me to upgrade to a C2D from my rig. It's not a big deal for me if you lower the resolution to get 150fps and I get 100fps. I'm playing games at at least 1600x1200, and if I wanted more performance in games, I wouldn't upgrade my CPU, mobo, and RAM to a C2D; I'd spend the same money on an 8800GTX, because I know a new GPU will help me more than a new CPU at this resolution. Btw, my friend has the exact same GPU as mine, but he has an E6600 OCed to 2.9GHz. He also has an LCD monitor with a high resolution, and at that resolution he says there is no difference between his rig and mine. Can't you understand that when the GPU is the bottleneck, a faster CPU won't make any difference at all? Look at the link I gave before: he is using 2x X1950XTX in Crossfire, but at 1600x1200 the bottleneck is still the graphics card.
For our gaming tests, we’re running a mixture of high and low-res testing. We realize that most of you don’t game at 800x600, so we’re including results at 1280x1024 0xAA/0xAF as well as 1600x1200 with 4xAA/8xAF, which are settings more typical of someone with a Radeon X1950 XTX card. In the low resolution tests that stress CPU performance, the Core 2 Quad and Dell XPS system perform largely identically to the Core 2 Duo E6700, which is to be expected since F.E.A.R. doesn’t take advantage of multi-threading. As you increase screen resolution, the bottleneck shifts to the graphics card, and by 1600x1200 we’re GPU-bound and all the systems deliver similar performance.
 
IEEE Spectrum (IEEE member magazine) had an article about the making of Resistance: Fall of Man not too long ago, I found an online version:

http://www.spectrum.ieee.org/print/4745

It's a fascinating article on how the Cell actually works in a game.
 

All this just because I posted a simple link to an Intel Tera-scale *TEST processor and ARS's view on Parallel Processing, WOW!

Actually that's pretty dumb, since instead of waiting for Intel, I bought an Athlon 3500+ and Socket 939 :) Again, try that on someone else? I vote with my wallet; talk costs nothing. My so-called lower resolution is 1280x1024@32, hardly what many would call low-res, LOL!

Again, fly off the handle as much as you like; the QX6700 kicks the shit out of anything AMD has =P Just more of the usual AMD BS from a green-glasses-wearing AMD f&nb0y! Sad, truly sad. Is it that hard to admit it burns you up that the Green Team is getting spanked? Yup, Intel was whipped and got reamed by AMD. As each went back and forth, I went right along with them, and I will continue to do so. There are 5 PCs in my house, 2 AMD and 3 Intel, BTW. Being loyal to them when they are not loyal to you is lame ;)
 

Yeah, but as I expected, the GPU and CPU stay well out of each other's way, it seems:
In a PS3 game, the graphics are rendered mainly by a separate graphics processor, while the Cell orchestrates the rest of the in-game action: keeping track of players and monsters, computing the physics of how bodies rag-doll off buildings, and coordinating the collisions of bullets with alien flesh. The Cell conjures its magic by dividing the work among the nine processors.
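The division of labor described in that excerpt can be sketched in miniature: one coordinating processor (playing the role of the Cell's PPE) farms out independent per-frame jobs like physics and AI to a pool of workers (loosely analogous to the SPEs), then gathers the results. The job names and workloads below are invented placeholders, not anything from the actual game engine.

```python
# Rough sketch of the work division the article describes: a coordinator
# submits independent jobs to parallel workers and collects the results.
from concurrent.futures import ThreadPoolExecutor

def physics_step(bodies):
    # Stand-in for rag-doll and collision math (placeholder workload).
    return [b + 1 for b in bodies]

def ai_step(monsters):
    # Stand-in for enemy decision making (placeholder workload).
    return [m * 2 for m in monsters]

def simulate_frame(bodies, monsters, workers=2):
    # The coordinating thread plays the role of the Cell's PPE: it hands
    # out jobs, then waits for each worker (the "SPEs") to finish.
    with ThreadPoolExecutor(max_workers=workers) as pool:
        physics = pool.submit(physics_step, bodies)
        ai = pool.submit(ai_step, monsters)
        return physics.result(), ai.result()

bodies, monsters = simulate_frame([0, 1, 2], [1, 2, 3])
print(bodies, monsters)  # [1, 2, 3] [2, 4, 6]
```

Note the key point from the quote: none of these jobs touch the framebuffer, which stays the GPU's business, so the CPU-side parallelism never contends with rendering.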
 
I never said that AMD is better. I just said that I don't need to upgrade to a C2D yet from my current rig, because I'm really sure my rig can blow away your great C2D rig since I have a better GPU. I'm sure you understand which CPU (or sound card) has more processing power, but based on your posts, it seems to me that you don't understand how that processing power is used. I would rather take a small, light Lotus Elise than a big, powerful Rolls-Royce around a circuit, if you could understand what I'm saying, but I highly doubt it. I'm pretty sure you are one of those f@nb0ys who think their rig is much better than any AMD rig just because they have a C2D.
 
8800GTX vs. Core 2 Quad? Are u freakin kidding me?

I hope to god you're being sarcastic.
 

Note, in this thread I didn't say rig, I said processor. I don't think a lowly X1800XT is better than a 7900, and if I had your rig and were looking to game more, I'd get an 8800GTX instead of upgrading to a C2D. Whether or not to upgrade doesn't have a generic or default answer in every case. By the same token, if I could sell that rig and get a nice return, I'd be on a C2D like "a chicken on a June bug" :) Geeks aren't always fanb0ys. They just hate hearing their beloved put down.

In the old BBS days, I remember Green and Blue guys never crossing the line. Fanb0ys proudly wore their tags. Over the years, the meaning has definitely changed.
 
Who would win in an eating contest a Mitsubishi Eclipse, or a bag of wet towels?
 
the OP should have asked:

"motorbike vs semi truck"

They're different solutions for very different applications... they do essentially very similar things, but their audience and design are very different... just like a motorbike and a semi truck.
 

It's funny, I used to read similar arguments about the P4 vs. the A64. The P4 was the motorbike: it can haul a little bit of data at a very fast rate. The A64 was the semi truck: it can haul a whole bunch of data, but at a much slower rate, therefore making it more efficient and better.

Now with the C2D/C2Q being the newcomer, it's like comparing two semi trucks, one with a 43-ft trailer (AMD) and one with a 50-ft trailer (C2D). They both do a fabulous job no matter what the task. I think it's sad that people would bash one over the other right now just to make themselves feel better about a recent expensive purchase, or to try and enlarge the e-weenie.

As far as the Cell vs the QX6700.... it will all just depend on what's being processed and how it's been written/optimized. Plain and simple.
 
Sony's the one inviting the comparison. They are calling the PS3 a computer.
 

It is a computer in the technical sense. Hell, the Atari 2600 is a computer. It's just that it's primarily used, and its programs are written, for home gaming entertainment. ;)
 
Since this topic has come back to life, I think it is better for me to resurrect this thread, so that pro-Intel people can post in one thread and pro-Sony people can post in another to avoid a flame war :p
 