AMD's ATI Hybrid CrossFire Sneak Peek Exclusive @ [H]

I agree with you 100%. This is a GREAT idea - or ideas, really. The two key points from my perspective:

1) Integrated graphics that are actually decent (this one isn't QUITE there, but almost)
-and-
2) Having 1 or 2 powerful GPU video cards sitting in your case, completely asleep, until it's time to unleash them like a couple of wild beasts to rip through your 3D games - WITHOUT having to reboot, etc.

I've expressed before that I think it's RIDICULOUS that today's mid- to high-end 3D GPU cards can't switch to a low-power (less than 50 watt) mode for regular desktop use.

When I put an 8800GTS 320 in my PC, my ROOM GOT HOTTER, really. It's almost like having a hair dryer just sit there and run continuously. I like to leave my PC on 24/7, but not anymore; since I gave up my 7900GT and put in an 8800GT, it just burns through too many watts.

Come on, it's well past time that modern video cards had a wide range of power-saving options, wtf?!
I thought they did, actually.

I'm sure my 8600GT has different power (voltage) settings depending on what mode it's in. It even runs at different speeds depending on what it's doing. Mind you, I do have to manually over/underclock both the 2D and 3D settings with RivaTuner, which does lead me to wonder why the 2D and 3D clocks are the same out of the box.

It does run cooler, and the fan is quieter, in 2D mode, but it really doesn't need to run at 540 MHz there, so I lowered it to the lowest the scale would allow (235 MHz or so), and I see no difference in general 2D operations. I even used to run my old 6600GT at 130 MHz in 2D mode, and it was just as fast as this 8600GT in 2D.

So yeah: being able to run a low-power IGP just for 2D, then have a more powerful add-in card fully power up and take over for games, would be great - and when the game is finished, the gaming card could power off completely and the IGP take over again. That would be so good.
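
Just to make that switching idea concrete, here's a rough sketch of the logic I'm picturing. Every name in it (Gpu, detect_3d_app, power_on, and so on) is made up purely for illustration - no current driver actually exposes any of this:

[code]
import time

class Gpu:
    # Hypothetical stand-in for a GPU the driver can power-gate.
    def __init__(self, name):
        self.name = name
        self.powered = False

    def power_on(self):
        self.powered = True
        print(self.name + ": powered on")

    def power_off(self):
        self.powered = False
        print(self.name + ": powered off")

def detect_3d_app():
    # Hypothetical hook: a real driver would watch for a Direct3D/OpenGL
    # context being created. Always False here, to keep the sketch runnable.
    return False

igp = Gpu("IGP")           # drives the desktop, always on
discrete = Gpu("HD 3870")  # asleep until a game launches
igp.power_on()

for _ in range(5):  # stand-in for the driver's polling loop
    if detect_3d_app() and not discrete.powered:
        discrete.power_on()    # wake the add-in card for the game
    elif not detect_3d_app() and discrete.powered:
        discrete.power_off()   # game's over, back to the IGP alone
    time.sleep(1)
[/code]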
 
I think you give sites like this far too much credit if you think they can fundamentally change the way such a big company does business... :eek: But that's what the site and enthusiasts are for: to pick up the slack and hopefully educate some minds. What's really sad is... look at the number of replies to this thread, then look at the number of replies to the Tri-SLI thread. You can't make the horse drink the water!
 
What about a motherboard with this IGP but with a full CrossFire setup? Even if it was 4x/4x, it probably wouldn't hinder these $50 cards very much. I could see 3 of these cards playing a game like CoD4 at something like 1024x768, or maybe even widescreen 1280x800, with high settings. Think that would be possible? 3-way CrossFire for ~$100 seems like a pretty cool idea to me. You know places like Dell could sell these, because they're cheap and people think "3 video cards? Hot damn, I gotta get me one of those."
 
The power-savings feature would be an extremely strong incentive for me to buy their CPU/chipset/GPU. If AMD had the power-savings feature on their current-gen mobos, I would actually give up my 8800GTX and Intel setup for their 2x 3870 (I doubt I'd notice the performance difference).
I leave my computer on probably 100 hours per week, and I probably only use 3D mode for 20 of them. If it saves 100 W the rest of the time, that's 8 kWh per week I can save, or about 32 kWh per month. I'm pretty sure the GTX uses more than 100 W at idle, and some people leave their computers on 24/7.
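
Working that out explicitly (assuming a flat 100 W saved across all 80 non-gaming hours):

100 W x 80 h/week = 8,000 Wh/week = 8 kWh/week
8 kWh/week x 4 weeks = ~32 kWh/month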

That is definitely a single feature that can sell a whole platform.
 
It appears that AMD will bring Hybrid CrossFire to laptops. That's awesome news, not only because of the performance increase but also because of the energy savings.

Laptops would have an RS780 chipset with integrated DX10.1 graphics and a mobile M8x GPU, also DX10.1.

 
Well, it sounds good to me. The more powerful the weakest links are (shitty video cards), the more people will be able to play decent games and not resort to dumbed-down consoles.
 
The power-savings feature would be an extremely strong incentive for me to buy their CPU/chipset/GPU. If AMD had the power-savings feature on their current-gen mobos, I would actually give up my 8800GTX and Intel setup for their 2x 3870 (I doubt I'd notice the performance difference).
I leave my computer on probably 100 hours per week, and I probably only use 3D mode for 20 of them. If it saves 100 W the rest of the time, that's 8 kWh per week I can save, or about 32 kWh per month. I'm pretty sure the GTX uses more than 100 W at idle, and some people leave their computers on 24/7.

That is definitely a single feature that can sell a whole platform.
I second that; my thoughts exactly.
 
Then the point of this entire website is what again?

I think the point of this website is to give us a place to look at and talk about new technology. The fact is, the average consumer doesn't care and, worse, doesn't know enough to even begin to ask the right questions. Honestly, even if they did know enough to ask the right questions, we all know the answers will not come from big-box stores.

Back on topic, and looking way outside the box as presented by AMD and Kyle: think beyond GPUs for the moment and apply this concept to the computer as a whole.

What we have here is pretty much a power-on-demand situation, which makes perfect sense. Take it further - to the processors, the subsystems, and possibly right into the power supply.

At the moment it's all pretty abstract, and I'm the last guy to think green, but a computer that completely scales on demand with no user intervention would be a pretty nifty item, with very broad appeal across all market sectors.
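
Since it's all abstract anyway, here's a toy sketch of what whole-box power-on-demand might look like. Nothing here maps to a real API - read_load() and set_state() are pure stand-ins for what firmware and drivers would have to provide:

[code]
# Toy whole-system power-on-demand governor. Everything here is hypothetical.

COMPONENTS = ["cpu_cores", "gpu", "disk_subsystem", "psu_phases"]

def read_load(component):
    # Stand-in: real hardware would report per-component utilization (0.0-1.0).
    return 0.0

def set_state(component, level):
    # level 0.0 = fully parked, 1.0 = full power
    print("%s -> %.0f%% power" % (component, level * 100))

def governor_tick():
    for c in COMPONENTS:
        load = read_load(c)
        if load < 0.05:
            set_state(c, 0.0)                   # park it entirely, don't just downclock
        else:
            set_state(c, min(1.0, load + 0.2))  # keep a little headroom above demand

governor_tick()
[/code]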

This of course takes us off topic and gives me a good stopping place, but ponder the overall concept. It just might make the hardware business interesting again. ;)
 
So how exactly do we get this working?
What parts?
And what drivers?
And when?

Gonna build my mom one.
 
That is one incredible piece of tech!

These are the kinds of solutions we need to save PC gaming, I think. Most of the time, system builders lure people in with nice Intel/AMD processors, 2GB of RAM, and a big display, all the while charging anywhere from 600 to 1500 dollars.

Then little Timmy finds out he can't play any games on it because it came with integrated graphics or a GeForce 6200, and suddenly the PS3 or Xbox looks like a much better deal.

This could really take off if AMD configures and markets this properly, and I hope they do.

Or worse, little Timmy never finds out what a graphics card is, and dismisses the idea of playing games on a PC entirely.

Sorry to quote something from page two, but that's a huge issue for me:

I get told so often that the newest games just obliterate the latest and greatest graphics hardware. It's simply untrue, even in the case of Crysis. Consumers have very limited knowledge, and most are far too ignorant about it. RAM size, HDD size, and processor speed are basically the only things that make up a computer to them. I can't tell you how many times I've asked someone "what's your graphics card?" only to get the response: "a 256MB one". As I'm sure 90% of us know, an HD3850 w/ 256MB will simply destroy a 1GB 8500GT (both of which are real cards, btw). Even if you don't know why, you know that's the case. Alas, the vast majority of people don't understand the concept of RAM.

OEMs are businesses, and while some businesses contain righteous and responsible people, the main function of a business is to make money. That is paramount. So when they're faced with two options - put massive PR stats into a PC marketed as a gaming rig and kill the industry a little, or put low PR stats into a PC marketed as a gaming rig, include hardware that's relevant to the performance of a game, and sell massively fewer PCs - guess which choice the business is going to make.

One thing that does get me, though: why are L2 cache sizes not boasted about in mainstream OEM PCs?
 
They've still got some fire in them yet! Perhaps that's why they skipped out on doing high-end cards - they're looking at multi-GPU solutions for the masses that work and are worth it.

You need a single poster boy and a very strong stance in the $150-$250 market. ATI has the HD3850 and HD3870, but no poster boy (an HD3890 that could outperform the 8800 Ultra, or at least be on par with it). The net result is mediocre sales, and continued sales of the 8600GTS even though the HD3850 is a far superior part (as long as they maintain their current price relationship).

Mind you, Nvidia has completely screwed up this segment of the market. Seems to me the G92 was meant to be a 65nm part to fill the huge hole between the 8600GTS and 8800GTS. For whatever reason, they decided to develop a second enthusiast chip instead, with nothing but confusing figures to show for it.

No, AMD's not done yet. The acquisition of ATI was definitely a dumb move, though. If it weren't for Core 2, Athlon 64/X2/FX sales alone probably could've completely covered the debt by now.

Currently, it would seem no one from AMD's R&D side in the graphics, chipset, or processor divisions has been fired.

It's a foggy future, but I'm betting they'll come out OK. When you wake up and realize you've got to go to work and get all sorts of things done, it can be overwhelming, but the first thing you've got to do is get up. Let's just start with that. Launch R680 as best you can. Tweak Phenom as best you can as well; if the L3 problems can't be overcome, they can't be overcome. Don't divert attention away from Shanghai and Bulldozer to fix Phenom. I would much prefer another year of terrible AMD products to four more years of mediocre ones.

That said, I strongly feel that if AMD can push out a 3.0GHz Phenom SKU, Core 2 will have some trouble on its hands.

And of course, I want the ability to do this with Hybrid CrossFire:
buy an HD3870 now
buy an HD4800 in 2 years
drop them in the same machine, and see performance scale accordingly. This IGP stuff doesn't really concern me, though it does concern the industry.
 
I can see this being really useful. I do see a lot of budget-minded customers wanting the ability to upgrade to better graphics after they buy a new system. The one thing to keep in mind with any high-end graphics card is that a sub-$500 PC usually only comes with a 300W power supply. In reality you would be looking at buying both a video card and a power supply, and even then you're constrained by proprietary designs. Depending on performance, this has the potential to offer an upgrade path to better graphics without higher cost. But there's always a point of diminishing returns.
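
A rough ballpark (these wattages are assumptions for illustration, not measurements): if the rest of a budget box pulls around 200 W under load, a 300 W supply leaves only about 100 W of headroom - not enough for most high-end cards of this generation, which is exactly why the power supply ends up being part of the upgrade.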

I'd really like to see real-world benchmarks on a tier 1 PC manufacturer's system. The results could be interesting.
 
That said, I strongly feel that if AMD can push out a 3.0GHz Phenom SKU, Core 2 will have some trouble on its hands.

A 3.0 GHz part won't make a difference, as Phenom will still be slower on both IPC and clock speed, and I think AMD knows that. They are better off pushing Fusion; the upcoming 3 CPU cores + 1 GPU core looks promising.
 
Again, AMD completely misses the mark...


Selling CF to Uncle Bob? Yeah, maybe...

But the true benefit of this would have been if Mr. Enthusiast could have his 4x 3870 cards completely off while the onboard graphics does Windows... and then the onboard graphics could either go off, or join in CrossFire when a game is run (not that it's really NEEDED at that level of performance).
THAT is what this technology truly promises... and fails to deliver.
 
A little early to say it failed to deliver when they haven't even implemented it yet, don't you think? Maybe they're experimenting with HCF on the low end as a way to recoup some of the R&D cost, or because they think it has potential in the long run. Maybe they figured they'd get the basics of the actual HCF process down before introducing it to enthusiasts as a form of power optimization. Whatever the reason, if they haven't even dabbled in power optimizations yet, it seems silly to claim they failed to deliver.
 
I think most enthusiasts forget that most people run at resolutions enthusiasts haven't touched in half a decade. If you look at Steam's survey:
http://www.steampowered.com/status/survey.html
80% of Steam gamers run at 1280 wide or lower. Think they have SLI/CrossFire? I'm sure the whole computer gaming population is even more skewed towards the lower end. $$$
 
I think most enthusiasts forget that most people run at resolutions enthusiasts haven't touched in half a decade. If you look at Steam's survey:
http://www.steampowered.com/status/survey.html
80% of Steam gamers run at 1280 wide or lower. Think they have SLI/CrossFire? I'm sure the whole computer gaming population is even more skewed towards the lower end. $$$

'Cause most of them don't know about HardForum/OCP yet! And a majority of them still buy video cards from Best Buy/Circuit City, which are way overpriced.
 
'Cause most of them don't know about HardForum/OCP yet! And a majority of them still buy video cards from Best Buy/Circuit City, which are way overpriced.

Which is what I did, oh joy. At least it's not my cash, it's my uncle's. ;)
 
I don't like the idea of locking this technology down and not being able to use it on mid- and high-end cards. What's the point? It reminds me of the old 512MB RAM maximum days: totally pointless, when the technology and know-how to allow a 1GB maximum had been available for years before it was finally implemented in motherboard architectures.

This technology is already heading down that same path.

I agree with the guy who wants 4 HD cards installed and able to kick on or off when a game is fired up, using the IGP for Windows... now that would be cool. Kinda like a Z1 Corvette: the HP is there, but it's throttled back until needed.

If a car manufacturer can get a circuit board to do that - throttle back horsepower - surely somebody at AMD can get the same effect out of a combination GPU system. Maybe they should visit the engineers over at Chevrolet.
 
I think this is an awesome chance for AMD. They've always appealed to the masses. It's true that the consumer segment outweighs the enthusiast segment in terms of revenue, so their intentions are well placed. I plan on watching this one.
 
Right now, I have an HD 3850 IceQ 3 Turbo X 512MB version installed in my other rig, with my X1900XTX IceQ3 edition sitting on a shelf.

*Yelling at the top of my lungs* WHY, OH WHY CAN'T ATI LET ME CROSSFIRE THOSE TWO TOGETHER?!?

With the X1900XTX being roughly equivalent to it in games overall with FSAA enabled, it would be a nice boost even if it theoretically only worked in SFR mode. I'm going to make that wish to Santa this Xmas and see if it comes true (crossing my toes and fingers real hard... and holding my breath until I turn purple)... hey, I was only kidding!
 
*Yelling at the top of my lungs* WHY, OH WHY CAN'T ATI LET ME CROSSFIRE THOSE TWO TOGETHER?!?

Too bad the X1900 XTX doesn't have the proper CrossFire fingers - how could you even CrossFire them otherwise? (Maybe through the PCI-E bus, maybe...)
And I thought I read that Hybrid CrossFire only works with same-generation chips?

However, I feel for ya, bro. I wish I could do that too... imagine the capabilities.
 
It would have been nice if you could use the IGP as a GPU physics processor, but GPU physics seems to be pretty dead since Intel took over Havok.
 
A little early to say it failed to deliver when they haven't even implemented it yet, don't you think? Maybe they're experimenting with HCF on the low end as a way to recoup some of the R&D cost, or because they think it has potential in the long run. Maybe they figured they'd get the basics of the actual HCF process down before introducing it to enthusiasts as a form of power optimization. Whatever the reason, if they haven't even dabbled in power optimizations yet, it seems silly to claim they failed to deliver.

I agree - why shoot down something that is unreleased?
The idea of experimenting with HCF, then implementing the technology on midrange/high-tier cards, is a big wet dream (well, for me, that is :rolleyes:).
 
There is a set of Nvidia beta drivers that claims to get SLI working in Crysis, though I read it doesn't help much.

They're not out yet, and they're working with what they've got. Right now, multi-GPU scaling is heavily dependent on an application patch coming out for each game.
 
Fixed - Kyle... Hasn't Intel been itching to get back into the discrete GPU market for a while now (or to give it a second try, as it were)? HCF will push AMD/ATI tech into more budget systems if it pans out; I would think Intel would be as concerned by that as Nvidia or anyone else... After all, integrated graphics is their market right now.

What do you mean "back into" discrete GPUs? They've never been in them before...
 
Well, I for one would like to pay only $200 for two graphics cards and get more than double the performance.
 