Is Sandy Bridge a video card supplement or replacement?

cyberslag5k

I'm a little confused about what exactly Sandy Bridge does in terms of graphics. I've searched around a lot, but I haven't really found a concrete answer to this. If you built a system using a Sandy Bridge processor (ignoring the issues it's having right now), would you not need a graphics card? Or would it just help increase performance alongside one?
 
Yes, you still need a dedicated video card. The Sandy Bridge IGP is roughly equal to a Radeon HD 5450, so it doesn't mean much in the gaming world these days, unless you're playing Flash games or other really low-end graphical titles.
 
Think of it as the graphics card being integrated with the motherboard. The only real difference is that the GPU is on the chip instead of being mounted to the board. The current chips aren't really "gaming" chips, though they will meet the minimum spec for a lot of titles. In the future, though, they may start rivaling discrete-card performance, which could be pretty cool. :)
 
I see. So they don't actually improve graphical performance any more than a regular CPU would, if used with a graphics card, right?

If Lucid Hydra actually comes to fruition, do you think they might start contributing in tandem with discrete cards?
 
I see. So they don't actually improve graphical performance any more than a regular CPU would, if used with a graphics card, right?
Depends on the graphics card, but in most cases, no. A discrete graphics card would be better.

If Lucid Hydra actually comes to fruition, do you think they might start contributing in tandem with discrete cards?
I wouldn't hold my breath on Lucid Hydra ever being anything more than a pipe dream.
 
I've got a dumb question, so go easy on me. :) Why did they start integrating graphics into chips? Is it to gain a broader customer base? Also, wouldn't it be better if the chip were only for CPU stuff, since now it looks like half of it is for graphics?
 
I've got a dumb question, so go easy on me. :) Why did they start integrating graphics into chips? Is it to gain a broader customer base? Also, wouldn't it be better if the chip were only for CPU stuff, since now it looks like half of it is for graphics?

Because the enthusiast market is actually really, really small. Most of these chips are sold to OEMs for business-type systems that will never have a discrete card, and the overall cost is lower to make one CPU rather than a CPU + GPU.
 
Because the enthusiast market is actually really, really small. Most of these chips are sold to OEMs for business-type systems that will never have a discrete card, and the overall cost is lower to make one CPU rather than a CPU + GPU.

Yep, they already moved the memory controller off the northbridge. Now the GPU is gone as well. The result is a very cheap northbridge that doesn't need to be high performance and eventually won't need cooling.

I can see the CPU eventually being the only chip, doing all the functions of the CPU, GPU, NB, and SB.
 
I've got a dumb question, so go easy on me. :) Why did they start integrating graphics into chips? Is it to gain a broader customer base? Also, wouldn't it be better if the chip were only for CPU stuff, since now it looks like half of it is for graphics?

In addition to bastage's response, it also cuts down on power and heat by combining the two.
 
I see. So they don't actually improve graphical performance any more than a regular CPU would, if used with a graphics card, right?

If Lucid Hydra actually comes to fruition, do you think they might start contributing in tandem with discrete cards?

The on-die GPU does have one feature it does better than discrete GPUs: video transcoding using "Quick Sync." However, at the moment there is no way to leverage this feature while a discrete GPU is in use, although Lucid demoed a software solution at CES for H67 chipsets.
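For anyone curious what driving Quick Sync looks like in practice, here's a minimal sketch that scripts an ffmpeg transcode through the h264_qsv encoder. This assumes an ffmpeg build compiled with Intel's Quick Sync (libmfx) support, which not every build has; the filenames and bitrate are just placeholders.

Code:
# Minimal sketch: offload an H.264 transcode to the Sandy Bridge IGP
# via ffmpeg's Quick Sync (h264_qsv) encoder. Assumes ffmpeg was built
# with Intel Media SDK (libmfx) support; filenames/bitrate are placeholders.
import subprocess

def transcode_qsv(src: str, dst: str) -> None:
    subprocess.run(
        [
            "ffmpeg",
            "-hwaccel", "qsv",    # decode on the IGP where possible
            "-i", src,
            "-c:v", "h264_qsv",   # encode with Quick Sync
            "-b:v", "4M",         # placeholder target bitrate
            "-c:a", "copy",       # pass audio through untouched
            dst,
        ],
        check=True,               # raise if ffmpeg exits non-zero
    )

transcode_qsv("input.mkv", "output.mp4")

The point is just that the transcode runs on the IGP's fixed-function hardware instead of the CPU cores or a discrete card, which is why it's so fast for this one job.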
 