nVidia to drop SLI?

If this is true, hopefully it means that Nvidia opens SLI to Intel's chipsets. I would love to have one board that is capable of SLI or Crossfire if I so choose, doesn't require hacking drivers or spending $4k on a dual-socket system (I'm talking to you, Skulltrail!!!), and gives me the reliability of an Intel chipset.
 
I had nothing but problems with my 7900GT SLI setup, so much so that when I upgraded my system I went with a single-card mobo.
 
Yeah, Skulltrail shows it's possible, but at a price.

Even if it's single-socket multi-GPU, I would prefer it over two cards, and obviously it can be done without too much trouble.
 
Great news! Nvidia's last good chipset was the nForce4. The only reason they restrict SLI to Nvidia chipsets is to milk money out of people who wouldn't normally buy their crap chipsets.
 
According to xbitlabs.com, nVidia has failed to secure a license for Intel's next-gen CSI bus. Read the article here:

http://www.xbitlabs.com/news/video/display/20080222132030_Nvidia_s_SLI_May_Disappear_or_SLI_Policy_May_Be_Changed_If_Nvidia_Fails_to_License_Intel_s_Next_Gen_Processor_Bus.html

Does this mean the end of SLI?

It means Nvidia will not be able to design a chipset supporting Intel's CSI bus. SLI is irrelevant in this case, since it's independent of the north/southbridge (see Skulltrail).
 
It means Nvidia will not be able to design a chipset supporting Intel's CSI bus. SLI is irrelevant in this case, since it's independent of the north/southbridge (see Skulltrail).

It's not irrelevant, because nVidia only allows SLI to run when using its chipsets. Skulltrail gets around this by using nVidia chips on the board. To continue to support SLI with Intel chips, they will either need to get a license from Intel or open up SLI to non-nVidia chipsets.

This will also hurt their IGP business, especially since they have started talking about SLI with IGP to boost gaming performance on low-end machines and video cards.
 
If this is true, hopefully it means that Nvidia opens SLI to Intel's chipsets. I would love to have one board that is capable of SLI or Crossfire if I so choose, doesn't require hacking drivers or spending $4k on a dual-socket system (I'm talking to you, Skulltrail!!!), and gives me the reliability of an Intel chipset.
Technicality: Asus P5N32-E SLI, Striker, Striker Extreme are capable of SLI and Crossfire when flashed with the HP Blackbird BIOS.
 
There's no technical or electronic reason why pretty much any board that has two PCI-E slots couldn't use both SLI and Crossfire; it's just an artificial lock-out. If you want to use SLI, Nvidia would like you to also buy one of their chipsets, thank you very much. That Intel 'had' to put Nvidia chips on Skulltrail to enable SLI is smoke and mirrors.
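To make the "artificial lock-out" point concrete, here is a purely illustrative Python sketch; this is NOT Nvidia's actual driver code, and the function names are made up. It shows how a software-only gate could work: the driver reads the host chipset's PCI vendor ID and refuses to enable SLI on non-approved parts, even when the hardware (two PCI-E slots) is perfectly capable.

```python
# Illustrative sketch only -- a hypothetical driver-side policy check,
# not real Nvidia code. Vendor IDs below are the real PCI vendor IDs.
NVIDIA_VENDOR_ID = 0x10DE
INTEL_VENDOR_ID = 0x8086

def sli_enabled(chipset_vendor_id: int, pcie_slots: int) -> bool:
    """Hypothetical gate: SLI needs two slots AND an 'approved' chipset."""
    if pcie_slots < 2:
        return False  # this part is a genuine hardware requirement
    # This part is pure policy: the slots work either way.
    return chipset_vendor_id == NVIDIA_VENDOR_ID

# An Intel chipset with two x16 slots is hardware-capable,
# but the policy check still says no:
print(sli_enabled(INTEL_VENDOR_ID, 2))   # False: blocked by policy, not hardware
print(sli_enabled(NVIDIA_VENDOR_ID, 2))  # True
```

This is also why hacked drivers (mentioned elsewhere in the thread) can enable SLI on non-Nvidia boards: the check lives in software, so patching it out is enough.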
 
There's no technical or electronic reason why pretty much any board that has two PCI-E slots couldn't use both SLI and Crossfire; it's just an artificial lock-out. If you want to use SLI, Nvidia would like you to also buy one of their chipsets, thank you very much. That Intel 'had' to put Nvidia chips on Skulltrail to enable SLI is smoke and mirrors.

True, but you have to remember SLI is Nvidia's technology and they can charge for it. I don't know what the premium is on those NF100 chips, but Intel is paying a hefty price.
 
True, but you have to remember SLI is Nvidia's technology and they can charge for it. I don't know what the premium is on those NF100 chips, but Intel is paying a hefty price.

It is, but the technology needed isn't on the motherboard, it's on the cards themselves; the motherboard only needs to provide PCI-E peer-to-peer transfers. But as you say, Nvidia have the right to license it, and the cost of the NF100 chips is a licensing fee, I guess.
 
Bye-bye, Nvidia chipset business. Nice play, Intel!

Plus, with Nvidia not playing nice about opening up SLI, and Intel planning to get serious about GPU development, why should they give Nvidia anything? Only 2% of gamers use dual-GPU setups, and 90% probably use Intel CPUs.
 
There's no technical or electronic reason why pretty much any board that has two PCI-E slots couldn't use both SLI and Crossfire, it's just an artificial lock-out.
Sometimes and sometimes not. I have a couple of dual-x16 graphics-slot boards with very limited Crossfire support (X800 and lower) that don't officially support SLI, but they work with the hacked Nvidia drivers.

ATI's CF requirements are that the chipset support peer writes, which not all chipsets do. Plus the BIOS needs to support certain tweaks expected by the drivers to enable CF. There are three parts that need to work together: chipset, drivers, and BIOS. And it's not as simple as making a small tweak for the motherboard manufacturer; they also have to support it, including fixing glitches.

------
If the story is true, good on Intel for holding out for an SLI cross-license. But since it was sourced by theinq (which hates Nvidia) with no other confirmation, it's about as reliable as a 10% chance of rain. :p The premise that Nvidia is dropping SLI is pretty stupid.

edit: here is theinq claiming that Nvidia already got a CSI license from Intel: http://www.theinquirer.net/en/inquirer/news/2007/05/09/what-nvidia-got-from-intel How can the Inq ever be wrong when they claim both A and not-A? :rolleyes:
 
I went SLI because I thought it would be neat; now I pay the price. A single video card is by far a more elegant solution for gaming, or computing in general. I have had nothing but trouble since my SLI install. To top it off, the 680i chipset doesn't play nice with Intel quad cores when trying to OC. That's the whole reason I got the Q6600, what a pain in the butt. Now I have two midrange gfx cards instead of one top-of-the-line card, oi. I say go P35 or X38, and let the Nvidia chipsets die, or at least help to kill them. Just say no to SLI.
 
Well, I remember people using Nvidia SLI on the ATI RD600 Intel mobo with hacked drivers, so the lock is only artificial.
 
While I prefer Intel's chipsets over NVIDIA's, I think that competition is a good thing.
I hope that NVIDIA will remain a player in the chipset market.
 
I as well am still gaming on AMD. I'm waiting until I have more money to see who wins the performance crown.
 
Stopped reading when they sourced The Inq.

You know, I get SO sick of hearing this, considering JUST how many times the Inq is right with their info, but all you hear about is when they get something wrong or make something up....

It would interest me how many of their articles turned out to be true in the end compared to how many were off. Almost any site that publishes info early, before the maker confirms it, gets things wrong... but how wrong is what matters to me.
 
Well, this certainly explains why Nvidia is releasing reports saying dual-GPU sucks right before releasing the GX2 :D
 
While I prefer Intel's chipsets over NVIDIA's, I think that competition is a good thing.
I hope that NVIDIA will remain a player in the chipset market.

Agreed, but I would REALLY like to see Intel chipsets with SLI capability without having to jump through hoops to get it to work. Either that, or have AMD come out with some kick-ass GPUs and use CF instead.
 
You would be surprised at how many people still game on AMD machines.

People like me...

Heheh, yeah. I'm still using AMD. Haven't had a chance to upgrade since Intel took back the performance crown.

I as well am still gaming on AMD. I'm waiting until I have more money to see who wins the performance crown.

Well, according to "reliable" sources, Fudo reports Shanghai will have a 10-20% clock-for-clock performance increase compared to Barcelona and Agena. So I'm going to wait it out and see if those numbers are true, and if so, I'm on Shanghai and an AM2+ motherboard like white on rice. If it's a bust, I'll buy Intel. The burden of proof lies with AMD now...
 
You know, I get SO sick of hearing this, considering JUST how many times the Inq is right with their info, but all you hear about is when they get something wrong or make something up....

It would interest me how many of their articles turned out to be true in the end compared to how many were off. Almost any site that publishes info early, before the maker confirms it, gets things wrong... but how wrong is what matters to me.

QFT. They are right more often than not. Like about Intel taking top-end Xeon chips with extra cache and labeling them P4 Extremely Expensive Editions...
 