Lucid Hydra 200 - It's almost here!!!

"...you buy two 5870's, you get (very near) double the performance" - my quote from below

"Lucid Hydra 200: Vendor Agnostic Multi-GPU, Available in 30 Days"

"... Even more impressive is Lucid's claim that you can mix and match GPUs of different performance levels. For example you could put a GeForce GTX 285 and a GeForce 9800 GTX in parallel and the two would be perfectly load balanced by Lucid's hardware; you'd get a real speedup. Eventually, Lucid will also enable multi-GPU configurations from different vendors (e.g. one NVIDIA GPU + one AMD GPU)."

http://www.anandtech.com/video/showdoc.aspx?i=3646


Video - PC Perspective talks with Lucid about HYDRA. - Shows Devil May Cry 4 and Bioshock running on an Nvidia and ATI hardware combo.
http://vimeo.com/6700209


"Why is Lucid Hydra 200 better than CrossFire, SLI?"
http://www.brightsideofnews.com/news/2009/9/25/interview-why-is-lucid-hydra-200-better-than-crossfire2c-sli.aspx


9/30/09 - A bit more in-depth "Lucid HYDRA 200 Details With AMD, Lucid & NVIDIA" - Some questions answered, some more created...
http://www.legitreviews.com/article/1093/1/


10/08/09 - http://www.extremetech.com/article2/0,2845,2353999,00.asp Not much new info here, good filler nonetheless.


10/14/09 - "AMD Responds to Lucid Hydra Claims" per above ExtremeTech article.
http://www.extremetech.com/article2/0,2845,2354256,00.asp


OLD Lucid Hydra 100 info for those interested, some good general Hydra info though:
Aug 19, 2008 - http://www.pcper.com/article.php?aid=607&type=expert&pid=1
Aug 22, 2008 - http://www.techreport.com/articles.x/15367


Looks like we're going to have to wait a little longer; seems like they need more time to get their shit together...

10/29/09 - "MSI Big Bang Fuzion (Hydra) motherboard Scheduled for Q1 2010"
http://www.guru3d.com/news/msi-big-bang-fuzion-hydra-motherboard-q1-2010/

11/04/09 - More developments: Charlie at semiaccurate.com has an article up titled "Nvidia crushes MSI's Lucid based board" - http://www.semiaccurate.com/2009/11/04/nvidia-crushes-msis-lucid-based-board/ which has caused quite a stir.

11/04/09 - Update to above: MSI Responds to Charlie's "Nvidia crushes MSI's Lucid based board" Article (same link above)

From MSI's response - "The MSI Big Bang Fuzion (Hydra 200) hardware is ready. Currently Lucid is optimizing the driver for Windows 7 so that it works stable and in all configurations (Including Mix & Match mode). Because MSI is dedicated to bring high quality and stable product on the market we decided to postpone the Big Bang Fuzion (Hydra 200) pending the MSI internal qualification assurance test. The Big Bang Fuzion (Hydra 200) will be released when it’s driver is finished which is most likely Q1 2010."

11/11/09 - Some performance previews using the Lucidlogix development platform contraption (Not the MSI Big Bang Fuzion board)

http://www.pcper.com/article.php?aid=815&type=expert&pid=1

http://hothardware.com/Articles/Lucid-Hydra-200-MultiGPU-Performance-Revealed/?page=1

http://www.techreport.com/articles.x/17934

11/13/09 - "Nvidia says it didn't, won't block Lucid's Hydra" - http://techreport.com/discussions.x/17962

*** ATTENTION *** 11/19/09 - "MSI Big Bang Fuzion Hydra 200: specs and games compatibility list" revealed according to VR-Zone - http://vr-zone.com/forums/510362/msi-big-bang-fuzion-hydra-200-specs-and-games-compatibility-list.html

12/09/09 - MSI Big Bang'in with Lucid Hydra @ [H]ard|OCP (Just pics and scans for now) - http://hardocp.com/news/2009/12/09/msi_big_bangin_lucid_hydra/
 
Fuck the red+green shit. I'm just hoping the Dual/Triple GPU scaling is going to be 90%+
 
I can't believe this is actually coming to market. We all thought it was vaporware. Wow.
 
This is the first I've heard of this. Very interesting. I was actually thinking about something like this the other day, wondering if it would be possible. Different video cards and all.
 
Until Kyle and Anand get their hands on it I'll remain skeptical on its effectiveness. But hopefully we won't have to wait much longer.
 
MSI has some BIG political hurdles on this one. NVIDIA has told me that it may very well code its drivers so that this Lucid silicon is worthless with its products, and that pretty much nullifies the thing. After you think through the novelty for a moment, who the shit cares about mixing and matching video cards?
 
Once it's out and works better than nVidia's and AMD's game profiles, then I'll get excited.
Won't hold my breath waiting; since they know AMD's and nVidia's hardware better than the makers do, it shouldn't take long... :rolleyes:
 
It's crap, it always has been and always will be. Both Nvidia and ATI will code it into oblivion.

edit: looks like Kyle beat me to the punch and made my speculation real!
 
It's crap, it always has been and always will be. Both Nvidia and ATI will code it into oblivion.

edit: looks like Kyle beat me to the punch and made my speculation real!

Ok, so let me in on your secret. How are you able to see into the future?

Also, Kyle didn't say shit. He said that nvidia "MAY very well" code it into oblivion. Also, I can see NV being pricks, but not AMD. If Lucid is Intel property, then it's going to work with Intel GPUs. AMD would be crazy to not allow their cards to work. If Intel isn't going to give them many chips/any chips, then they're going to lose sales in chipsets and processors. If that happens, all they have is their graphics cards. Nobody is going to buy their cards if they can get more bang/buck from Intel because of the near-linear scaling that the Lucid chip (apparently) allows.

but until we see it in the flesh, and see what it actually does, there's nothing more to say.
 
I'm very skeptical. If Lucid's method of dividing up the rendering between multiple GPUs were superior to the AFR/scissor methods used by Nvidia or ATI, something tells me that ATI and Nvidia would already be doing it this way. Or at the very least, one of them would've bought out Lucid if they thought the technology were that valuable/dangerous.

Wait and see. Maybe we will be pleasantly surprised, but I'm not holding my breath.
 
I'm very skeptical. If Lucid's method of dividing up the rendering between multiple GPUs were superior to the AFR/scissor methods used by Nvidia or ATI, something tells me that ATI and Nvidia would already be doing it this way. Or at the very least, one of them would've bought out Lucid if they thought the technology were that valuable/dangerous.

Wait and see. Maybe we will be pleasantly surprised, but I'm not holding my breath.

Funny thing is, the why-didn't-I-think-of-that kind of reason can sometimes be exactly that. Perhaps they did throw the idea around but found it unfeasible. Remember, for this to work you need a fast ASIC on the motherboard, custom driver support, at least the blessing of MS (you are messing with the DX pipeline), and a good DX task-level load-sharing algorithm. Overall I'm skeptical but hoping to be amazed. One concern, even without vendor lock-out, is that it's DX only.
 
Ok, so let me in on your secret. How are you able to see into the future?

Also, Kyle didn't say shit. He said that nvidia "MAY very well" code it into oblivion. Also, I can see NV being pricks, but not AMD. If Lucid is Intel property, then it's going to work with Intel GPUs. AMD would be crazy to not allow their cards to work. If Intel isn't going to give them many chips/any chips, then they're going to lose sales in chipsets and processors. If that happens, all they have is their graphics cards. Nobody is going to buy their cards if they can get more bang/buck from Intel because of the near-linear scaling that the Lucid chip (apparently) allows.

but until we see it in the flesh, and see what it actually does, there's nothing more to say.

Because I'm not an idiot?

1) Theoretically, if I take a GTX 280 and SLI it with an 8800 GS, I gain what? 10% THEORETICALLY? Put any kind of real-world scaling into that equation and you get less performance than you started with.
2) Oh well, what if I take a 5870 and put it with a 4850? Well, assuming a performance ratio of 3 to 1 and each card operating at 80% efficiency, you'll net only a 6.7% increase in fps.
3) Any time you add a circuit to a system you have increased latency. Look at the NF200.
4) Why the hell would you ever want to put a 280 and a 4870 in the same system? Either the 4870 is a better deal or the 280 is a better deal.
5) Increased failure rate due to extra components.
6) Overclocking bottleneck.
7) Added cost.
8) If it actually threatens Nvidia's OR AMD's profit margin, they WILL code it into oblivion.
9) Intel's GPUs are shit compared to a mid-range AIB. You'll never see any benefit.
10) Theoretically you CANNOT have 100% scaling. Some work must be done by the GPU to pass around the information.
11) As far as Intel holding out on AMD, it's called MAD cross-licensing; it means Intel doesn't have a choice when it comes to x86.
12) If you're talking about Intel trying not to sell chips to AMD, it's called anti-trust.

There is no free lunch. Free-lunch products are vaporware or don't work as advertised.
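The back-of-the-envelope math in points 1, 2, and 10 can be sketched in a few lines (a rough model, not Lucid's actual algorithm; the 3:1 ratio and 80% efficiency figure are the assumptions from point 2):

```python
def mixed_gpu_speedup(fast, slow, efficiency=1.0):
    """Estimated % fps gain from pairing two GPUs, relative to the fast card alone.

    fast, slow  -- relative performance of each card (any consistent units)
    efficiency  -- fraction of each card's throughput that survives load balancing
                   (point 10: always < 1.0 in practice)
    """
    combined = efficiency * (fast + slow)
    return (combined / fast - 1.0) * 100.0

# Point 2: a 5870 paired with a 4850 at roughly 3:1, 80% efficiency per card
print(f"{mixed_gpu_speedup(fast=3.0, slow=1.0, efficiency=0.8):.1f}%")  # 6.7%
```

Drop the efficiency below 75% in that 3:1 case and the pairing actually loses fps versus the 5870 alone, which is exactly the point being made.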
 
I'm very skeptical. If Lucid's method of dividing up the rendering between multiple GPUs were superior to the AFR/scissor methods used by Nvidia or ATI, something tells me that ATI and Nvidia would already be doing it this way. Or at the very least, one of them would've bought out Lucid if they thought the technology were that valuable/dangerous.

Wait and see. Maybe we will be pleasantly surprised, but I'm not holding my breath.

ATI & Nvidia couldn't buy Lucid because Intel did
 
The funniest thing would be if these guys did find a better solution than either proprietary ones, regardless of what happens to it in the end...simply because Nvidia and ATI deliberately chose less efficient scaling techniques just so that they couldn't be replicated or mixed and matched.
 
Because I'm not an idiot?

1) Theoretically, if I take a GTX 280 and SLI it with an 8800 GS, I gain what? 10% THEORETICALLY? Put any kind of real-world scaling into that equation and you get less performance than you started with.
2) Oh well, what if I take a 5870 and put it with a 4850? Well, assuming a performance ratio of 3 to 1 and each card operating at 80% efficiency, you'll net only a 6.7% increase in fps.
3) Any time you add a circuit to a system you have increased latency. Look at the NF200.
4) Why the hell would you ever want to put a 280 and a 4870 in the same system? Either the 4870 is a better deal or the 280 is a better deal.
5) Increased failure rate due to extra components.
6) Overclocking bottleneck.
7) Added cost.
8) If it actually threatens Nvidia's OR AMD's profit margin, they WILL code it into oblivion.
9) Intel's GPUs are shit compared to a mid-range AIB. You'll never see any benefit.
10) Theoretically you CANNOT have 100% scaling. Some work must be done by the GPU to pass around the information.
11) As far as Intel holding out on AMD, it's called MAD cross-licensing; it means Intel doesn't have a choice when it comes to x86.
12) If you're talking about Intel trying not to sell chips to AMD, it's called anti-trust.

There is no free lunch. Free-lunch products are vaporware or don't work as advertised.

1) I have no interest in using old and busted with new hotness.
2) See above.
3) Maybe they can put out a fucking "Xtreme edition" so you don't have to wait the extra fractions of a fraction of a fraction of a second.
4) No interest in team green GPUs.
5) This is fucking 2009/2010; buy from a reputable company, or buy when 2nd-generation Lucid boards come out.
6) For some, yes.
7) Adding $25 to an already $200+ board isn't really a big deal to me if it allows me to use all my cards at near-full potential.
8) Probably, but we'll see how good the scaling is.
9) Larrabee was rumored to have GTX 285 performance (and this was a while ago). It wouldn't shock me if they're at GTX 295 numbers now. I'd hardly call that shitty. Maybe their first gen isn't there, but who knows with their second gen.
10) They never promised 100% scaling and I'm never going to expect it.
11) What the fuck does a chip that isn't a processor have to do with x86 (excuse my ignorance)? Is x86 public domain anyway? (I don't think so?)
12) I pine for the days when we could sell whatever we wanted to whoever we wanted for whatever reason we wanted. (On a side note, that seems illegal. "Making" Intel sell its product to help better their competition. Shady shit.)
 
ATI & Nvidia couldn't buy Lucid because Intel did

Ah I forgot about that.

Anyway I am still very skeptical. They're a new startup claiming a new technology that can do something better than the big, established players could do, in a HIGHLY-competitive market that has seen dozens of product generations already.

That is practically the recipe for disappointment (See also: Bitboys and Transmeta, off the top of my head). Of course Hydra is different in that it's not just vaporware, but I'm waiting to see how it actually works before I'll allow myself to get excited.
 
1) I have no interest in using old and busted with new hotness
Then why not use SLI or Crossfire, both of which are showing damn near perfect scaling results in many games these days?

3) Maybe they can put out a fucking "Xtreme edition" so you don't have to wait the extra fractions of a fraction of a fraction of a second.
If you end up with so much as a 1-frame delay, at 30fps that's 33ms of lag introduced into the system. Your 67ms ping in an online game will now feel like 100ms; to me that is a problem.
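The frame-delay arithmetic there is easy to verify (a quick sketch; the 30fps and 67ms figures are the ones from this post):

```python
def frame_delay_ms(frames, fps):
    """Extra input-to-display latency, in ms, from buffering `frames` at a given fps."""
    return frames * 1000.0 / fps

delay = frame_delay_ms(frames=1, fps=30)
print(f"one buffered frame at 30 fps = {delay:.0f} ms")     # 33 ms
print(f"a 67 ms ping would feel like {67 + delay:.0f} ms")  # 100 ms
```

Note the delay scales with frame time, not frame count alone: two buffered frames at 60fps cost the same 33ms as one at 30fps.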

4)No interest in team green GPUs
Then why not use SLI or Crossfire, both of which are showing damn near perfect scaling in some games.


5) This is fucking 2009/2010; buy from a reputable company, or buy when 2nd-generation Lucid boards come out.
For the same reason I don't want NF200 on my motherboard: it doesn't add anything but latency, and by your own points above there is no reason to use it over SLI.

7)Adding $25 to an already $200+ board isn't really a big deal to me if it allows me to use all my cards at near-full potential
How much better than SLI scaling do you expect?

8)Probably, but we'll see how good the scaling is.
A major factor for sure. Of course also the factor they have been so tight lipped about. Makes me wonder why they would be so tight lipped about it if it was so great.

9) Larrabee was rumored to have GTX 285 performance (and this was a while ago). It wouldn't shock me if they're at GTX 295 numbers now. I'd hardly call that shitty. Maybe their first gen isn't there, but who knows with their second gen.
The latest rumors were in June (July?) and said it was near 280 numbers and would be launched mid next year. It was also on a die size of over 900mm². That's 3 times the die size (and 9 times the failure rate, iirc) of a 5870, yet it will only be half as powerful. How is that going to be economical or practical? The rumors have always put Larrabee a generation or two behind the current tech. And based on their die size, they have some real problems with cost.

10)They never promised 100% scaling and I'm never going to expect it.
So considering this is going to be released on a P55 which already supports SLI and crossfire, why would you want this? The days of 25% and 50% scaling are long gone.

12) I pine for the days when we could sell whatever we wanted to whoever we wanted for whatever reason we wanted. (On a side note, that seems illegal. "Making" Intel sell its product to help better their competition. Shady shit.)
You can, if you are a small company. Big companies are subject to anti-trust legislation. Do you really want the power company, since it is the only one that sells power, to decide to start charging you $2,000 a month? Do you want Intel to simply buy AMD and Nvidia and start charging $5,000 for a CPU (because trust me, Intel could buy both)? Anti-trust legislation is a good thing.
 
This doesn't seem to make too much sense for premium boards. If you're buying a premium-priced motherboard, you'll probably be buying premium-priced graphics cards. Not sure why you'd want to run a 5870 with a 260, for instance. I mean yes, if you already have a 260 and are upgrading to a 5870, then why not use the extra performance? But at the same time, sell off your 260 and get another 5870 since you're looking for performance. No need to have a low-end card sucking up power for a small(?) increase in performance when you can use a matching high-end card.
 
This doesn't seem to make too much sense for premium boards. If you're buying a premium-priced motherboard, you'll probably be buying premium-priced graphics cards. Not sure why you'd want to run a 5870 with a 260, for instance. I mean yes, if you already have a 260 and are upgrading to a 5870, then why not use the extra performance? But at the same time, sell off your 260 and get another 5870 since you're looking for performance. No need to have a low-end card sucking up power for a small(?) increase in performance when you can use a matching high-end card.

That is not the main point of the tech; it is an added bonus though. The big deal with the Hydra chip is its very-near-linear scaling: you have two 5870's, you get double the performance, and there's no need to wait for game-specific driver optimizations to fix your new game that SLI/CF is not working well with.

And I don't think adding that 260 to a 5870 would be a small increase; (if this all works as stated) you know what a 260 can do today, imagine nearly all that power added to the 5870.
 
That is not the main point of the tech; it is an added bonus though. The big deal with the Hydra chip is its very-near-linear scaling: you have two 5870's, you get double the performance, and there's no need to wait for game-specific driver optimizations to fix your new game that SLI/CF is not working well with.

And I don't think adding that 260 to a 5870 would be a small increase; (if this all works as stated) you know what a 260 can do today, imagine nearly all that power added to the 5870.

Considering the 5870 to be twice the performance of a GTX 280 and the 260 to be 75% the performance of a 280, adding a GTX 260 with PERFECT scaling gets you a 37.5% increase. If you lose 10% of each card's performance, you now have about a 24% increase. Hardly groundbreaking.
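As a sanity check on those numbers (using this post's assumptions: GTX 280 = 1.0, 5870 = 2.0, GTX 260 = 0.75):

```python
# Relative performance, GTX 280 = 1.0 (assumptions from the post above)
hd5870, gtx260 = 2.0, 0.75

perfect = (hd5870 + gtx260) / hd5870 - 1.0      # ideal, lossless scaling
lossy = 0.9 * (hd5870 + gtx260) / hd5870 - 1.0  # 10% of each card's output lost

print(f"perfect scaling: +{perfect:.2%}")  # +37.50%
print(f"with 10% loss:   +{lossy:.2%}")    # +23.75%
```

Either way, the ceiling is set by the ratio of the slow card to the fast one: the slower the second card relative to the first, the smaller the best-case gain.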
 
Glad all the haters can see into the future. Jesus, wait till the thing actually comes out. If ATI and Intel both support it with their video cards, that leaves Nvidia in a position to stay competitive and allow theirs to use it, or lose serious market share.

Not sure if some of you have a crystal ball or crystal meth.
 
Glad all the haters can see into the future. Jesus, wait till the thing actually comes out. If ATI and Intel both support it with their video cards, that leaves Nvidia in a position to stay competitive and allow theirs to use it, or lose serious market share.

Not sure if some of you have a crystal ball or crystal meth.

I predict the stock market will not crash tomorrow.

You act as if it's a huge leap. This isn't a huge leap of faith to see this thing crashing and burning.
 
My biggest concern is latency and how they leverage the technologies from previous-generation cards. For example, let's say you're playing a DX11 game and have a 5870 and a 2900 XT (I know, pretty extreme). How does the game utilize the 2900 XT, since it doesn't support the DX11 feature set? I'm guessing this technology has limitations on what you can mix and match depending on the application, but I could be wrong. Just my 2 cents :confused:
 
Considering the 5870 to be twice the performance of a GTX 280 and the 260 to be 75% the performance of a 280, adding a GTX 260 with PERFECT scaling gets you a 37.5% increase. If you lose 10% of each card's performance, you now have about a 24% increase. Hardly groundbreaking.

Um, what? That is awesome. If you have a GTX 260 sitting around, it would be okay to assume that you upgraded from the GTX 260 to the 5870; now rather than having to set the 260 aside and sell it or something, you can keep it in your system for, as you say, a roughly 23-24% increase. That is pretty damn groundbreaking if you ask me.

Seems like you are being counterproductive to your own argument there.

Edit:

I predict the stock market will not crash tomorrow.

You act as if it's a huge leap. This isn't a huge leap of faith to see this thing crashing and burning.

You should tell us what extra knowledge you have over the rest of us that lets you say this. At any rate, faith is not for the real world.
 
I speculated many months ago, when Lucid was first brought up, that it might coincide with Larrabee's launch... we shall see...
 
I predict the stock market will not crash tomorrow.

You act as if it's a huge leap. This isn't a huge leap of faith to see this thing crashing and burning.

I act like I'm waiting to see if it's worth a damn when I have some real numbers in front of me; this isn't coming from some random 3rd party we've never heard of.

Man, you sure are touchy about something that isn't even out yet.
 
My call is this will never take off, or will have such issues it will drive customers away. I think of the PhysX physical PCIe card when I think of this...
 
My call is this will never take off, or will have such issues it will drive customers away. I think of the PhysX physical PCIe card when I think of this...

Except that PhysX was a standalone card; this is built into mobos. If it launches, it would be dumb for companies not to put this into some of their boards in order to keep them competitive, especially if it takes off, unlike PhysX did.
 
I would love to see a hardware solution to multi-GPU. Software can be great, but you don't always know what to expect.

It would be great to know exactly what to expect from a multi-GPU config for any given game/application, without the ifs and buts.

Anyhow, can't wait to see results/proof.
 
I am looking forward to this; it seems very creative, and it may or may not result in some groundbreaking combinations. Can't hurt to try, right?
 
Could potentially be very useful; however, the graphics card vendors may instead just make it so their drivers simply do not work with the Hydra, for whatever reason.
 
This thing will be settled in a few weeks, but it does sound a bit too good to be true
 
Talk about a dick move.

Could you imagine the backlash if it comes out and is awesome, then Nvidia releases a new driver shortly after specifically to break it? I know I would think twice about buying another product from them. This is coming from someone who has bought nothing but Nvidia for the last 4 years.
 
Could you imagine the backlash if it comes out and is awesome, then Nvidia releases a new driver shortly after specifically to break it? I know I would think twice about buying another product from them. This is coming from someone who has bought nothing but Nvidia for the last 4 years.

Can you imagine the anti-trust lawsuits?
 