Single PCB GTX 295 in the works

bigdogchris

Fully [H]
Joined: Feb 19, 2008
Messages: 18,734
http://www.techpowerup.com/89158/Single-PCB_GeForce_GTX_295_in_the_Works.html

Single-PCB GeForce GTX 295 in the Works
Traditionally, NVIDIA designs dual-GPU accelerators with two PCBs, each holding one GPU system. With the GeForce GTX 295 and its competitive pricing, NVIDIA finds itself in a difficult position, facing direct competition from ATI's now competitively priced Radeon HD 4870 X2. On the one hand, escalating manufacturing costs, driven by intense competition in the sub-$300 graphics card market, are making it difficult for NVIDIA to keep up with GTX 295 stocks; on the other, the repercussions, including bad press and losses from being unable to meet demand, have pushed NVIDIA to rethink how it makes the GeForce GTX 295.

Enter innovation. The company is reportedly redesigning the GeForce GTX 295 on a single PCB, the approach ATI has been using for its dual-GPU accelerators. Both GPU systems of the GTX 295 will be placed on one PCB, which is expected to bring manufacturing costs down significantly, allowing the company to keep up with demand and competitive pricing. Expreview sourced drawings of one of the prototypes, which show a long single-PCB card with a central fan. You will also observe that a back-plate is in place: a number of memory chips will be populated on the back, and both GPU systems on the front. It will be an engineering challenge to fit five major heat-producing components (two G200b GPUs, two NVIO2 processors, and one BR-03 bridge chip), 28 GDDR3 memory chips, and the VRM area to power it all. The redesigned card may surface internally in April and may enter production by May.
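As an aside, the quoted chip count lines up with the GTX 295's 448-bit-per-GPU memory interface. A back-of-the-envelope sketch, assuming (this is not stated in the article) 512 Mbit (64 MB) GDDR3 chips on 32-bit channels, as commonly used on G200b boards:

```python
# Back-of-the-envelope: 28 GDDR3 chips split across two GPUs.
# Assumption: 512 Mbit (64 MB) chips, each on a 32-bit channel.
chips_total = 28
chips_per_gpu = chips_total // 2       # 14 chips per GPU
bus_width = chips_per_gpu * 32         # 14 x 32-bit = 448-bit bus
vram_per_gpu_mb = chips_per_gpu * 64   # 14 x 64 MB = 896 MB per GPU

print(bus_width, vram_per_gpu_mb)  # 448 896
```

That matches the shipping GTX 295's 896 MB per GPU, so the 28-chip figure is internally consistent.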
Could explain part of the shortages. It's also going to be interesting to see the performance difference between the two.
 
The shortages are explained by NVIDIA taking a loss on each card sold, so they aren't making as many as they could. This board will be a way for them to make money and serve as a proof of concept for their G300 chips.
 
Let's see how they pull it off. By the time it comes out, the 5870X2 may already be on the market.
 
Hey look, nVidia spending MORE time making rehash cards.

They're still going to be remaking the GTX 260/280 cards in 2011.

GTX 380 (2 GTX 280s in 1 card, 32nm process! But gimped cooling so it still runs at 95C)

Followed shortly by the G92 process GTX 350!
 
Looks promising, but I'd rather have Nvidia try to come up with something new.
 
My 9800GX2 performance is MEH. I'm not saying it's a bad card, but multi-GPU is inefficient. Just look at the usage of the second GPU versus the first. Yeah, I'm not an engineer so I'm not saying I could do better, but as a consumer I feel like these "sandwich" cards don't deliver the value for the money I spend; I regret the purchase. I should have waited for the GTX280.
 
Enter innovation. The company is reportedly redesigning the GeForce GTX 295 on a single PCB, the approach ATI has been using for its dual-GPU accelerators.

What they call innovation I call finally catching up with the competition. I'd consider getting this single PCB card if the price was right. I'll never own a dual GPU set up that isn't integrated into a single PCB, anything less is poor engineering.
 
It still needs more RAM per GPU imo, at least 1GB, I would even welcome more with more memory bandwidth as well.
 
It still needs more RAM per GPU imo, at least 1GB, I would even welcome more with more memory bandwidth as well.
Yeah, but you know that ain't happening with the 448-bit bus. The next step up would be 1792MB per GPU, 3584MB total. That's an expensive way to get 1792MB of usable RAM. Too bad they didn't go with 512-bit on the GTX 295.
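For context, the capacity step described here follows directly from the bus width: a 448-bit bus means 14 chips per GPU, so moving from 512 Mbit to 1 Gbit chips (the next standard density, an assumption about the available parts) doubles capacity in one jump. A rough sketch:

```python
# Assumption: 32-bit-wide GDDR3 chips; densities go 512 Mbit (64 MB)
# today, 1 Gbit (128 MB) at the next standard step.
chips_per_gpu = 448 // 32            # 14 chips fill a 448-bit bus
current_mb = chips_per_gpu * 64      # 896 MB per GPU today
next_step_mb = chips_per_gpu * 128   # 1792 MB per GPU with 1 Gbit chips
total_mb = next_step_mb * 2          # 3584 MB across both GPUs

print(current_mb, next_step_mb, total_mb)  # 896 1792 3584
```

So there is no intermediate option between 896MB and 1792MB without changing the bus width itself, which is the poster's point.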
 
Yeah, but you know that ain't happening with the 448-bit bus. The next step up would be 1792MB per GPU, 3584MB total. That's an expensive way to get 1792MB of usable RAM. Too bad they didn't go with 512-bit on the GTX 295.
just wait for the GTX299.5X2 Xtreme.
 
Yes, because the performance of their cards with GDDR3 is extremely low...:rolleyes:

Well, imagine 448-bit or 512-bit GDDR5. It would offer roughly twice the bandwidth, at lower power cost. Of course, the parts themselves would be somewhat more expensive. Now imagine NVIDIA taking the AMD route and cutting the bus down, say to 384-bit (8800GTX) with GDDR5. Still more bandwidth than AMD could offer, with lower cost and less complex PCBs than they currently have to deal with.
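To put numbers on that: peak memory bandwidth is just bus width times effective transfer rate. A hedged sketch with illustrative data rates, not official specs (GDDR3 around 2000 MT/s effective, in line with the GTX 295; GDDR5 around 4000 MT/s):

```python
def bandwidth_gb_s(bus_bits: int, transfers_per_s: float) -> float:
    """Peak memory bandwidth in GB/s for a given bus width and data rate."""
    return bus_bits / 8 * transfers_per_s / 1e9

# Approximate, illustrative data rates (not official specs):
gddr3 = bandwidth_gb_s(448, 2.0e9)      # ~112 GB/s, near the GTX 295's per-GPU figure
gddr5 = bandwidth_gb_s(448, 4.0e9)      # ~224 GB/s on the same bus
gddr5_384 = bandwidth_gb_s(384, 4.0e9)  # ~192 GB/s on a narrower, cheaper bus

print(gddr3, gddr5, gddr5_384)
```

Exact figures depend on real clocks, but the roughly 2x ratio holds for any matched pair, and the 384-bit GDDR5 case shows how a narrower bus can still beat 448-bit GDDR3.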
 
Yes, because the performance of their cards with GDDR3 is extremely low...:rolleyes:

I fail to see your point. All I'm saying is that I'm actually more interested in seeing an actual PERFORMANCE increase rather than just the same card again, and GDDR5 could deliver that (it did for ATi). Of course, if you're happy with Nvidia churning out the same card over and over again... then to each his own, I guess :rolleyes:.
 
NVIDIA needs to introduce a new card, not keep making refreshes and pointless rebrandings dressed up to look "new". They have been doing this for a while now; what's the point? I guess they want to milk more money from unsuspecting customers.
 
I agree that this is a pointless exercise in the eyes of the consumer, but my guess is that it's simply a trial run for nV because they have never done this before. The sandwich design is cack, so they are abandoning it in favour of actually applying some engineering know-how like ATi has been doing for some time. If they pull it off, I expect we'll see this technique applied to future multi-GPU cards from them. I just wonder how long this card is going to be if/when it sees the light of day!
 
Hey look, nVidia spending MORE time making rehash cards.

They're still going to be remaking the GTX 260/280 cards in 2011.

GTX 380 (2 GTX 280s in 1 card, 32nm process! But gimped cooling so it still runs at 95C)

Followed shortly by the G92 process GTX 350!

What a troll
 
Why do people complain about NVIDIA rehashing products? If the product still meets the market demand, what's the problem? Why are you pissed that NVIDIA was ahead of the curve with the introduction of the G92 8800GT/8800GTS?

Anyway, regarding the OP, I'm not much of a fan of the new design. I like the sandwich designs because they are GREAT for water cooling (extremely efficient). While a single PCB might make it possible to incorporate larger heatsinks for each GPU while still retaining a two-slot design, the air flowing from one heatsink to the next will mean the card requires much more airflow and, unfortunately, probably a louder fan. Simply put, you need a much larger volume of air per minute to take advantage of such a design so that the air doesn't get saturated by the first heatsink before it reaches the second. Anyway, we will see.
 
Why do people complain about NVIDIA rehashing products? If the product still meets the market demand, what's the problem? Why are you pissed that NVIDIA was ahead of the curve with the introduction of the G92 8800GT/8800GTS?
Oh geez, here we go again.
Really there's already several multi-page debates about this exact issue. Everyone who has an opinion has already voiced it. Just look up the other threads and you'll find exact answers to that question.
 
Why do people complain about NVIDIA rehashing products? If the product still meets the market demand, what's the problem? Why are you pissed that NVIDIA was ahead of the curve with the introduction of the G92 8800GT/8800GTS?

Because nVidia keeps spending development time and resources on rehash cards; if I wanted a GTS 250, I'd buy a 9800GTX+. People want nVidia to make a NEW card, one that advances the state of the art, rather than more high-end single-GPU cards that get outperformed by dual-GPU ones. People want to see a single-GPU card that pushes the boundaries, kind of like the move up to the 8 series; the 8800 cards were groundbreaking. People want that advancement to continue, rather than rehashes of old cards.
 
They're having a hard time making a good card that can compete with ATI. I hate sandwich cards dumping heat into the case.
 
If the product still meets the market demand

I demand new architectures and real performance increases, not shuffled designs and unnoticeable 1-2 FPS changes. The entire point of video cards (for the end user) is better performance in a cooler, smaller, cheaper, and more reliable package. You can compromise on cool, small, and cheap for great performance, but doing the reverse--sacrificing performance for cheaper designs and greater profitability--only really translates to the HTPC market, and hurts your standing with the people for whom new video cards really matter (and who make the R&D behind new cards profitable).

That's why I don't really like the revision-happy way things are going. A cheaper, more stable old card is still an old card. Until the performance is significantly better than what I have, and I have the reasonable expectation that it won't die within a year, I have no reason to buy.
 
do people who own 30" monitors buy gtx295's? IIRC it choked at that res because it didn't have enough memory for AA
 
do people who own 30" monitors buy gtx295's? IIRC it choked at that res because it didn't have enough memory for AA
It does have enough for most games. Far Cry 2 certainly won't allow 8x AA, though, as even 1024MB is not enough at 2560x1600. It's hilarious to see people running two GTX 295s at 2560, because that really is a waste since you don't have enough memory to increase AA in some games.
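A crude model shows why VRAM runs out fast at 2560x1600 with MSAA: the render targets alone can eat a large slice of 896MB before a single texture is loaded. This is a simplified sketch (32-bit color and depth, no compression, no driver overhead), not an exact accounting:

```python
def render_targets_mb(width: int, height: int, aa_samples: int) -> float:
    """Rough MSAA render-target footprint: 32-bit color + 32-bit depth,
    both stored per sample, plus one resolved back buffer. Ignores
    textures, shadow maps, compression, and driver overhead."""
    pixels = width * height
    msaa_color = pixels * 4 * aa_samples   # color samples
    msaa_depth = pixels * 4 * aa_samples   # depth/stencil samples
    resolved = pixels * 4                  # resolved back buffer
    return (msaa_color + msaa_depth + resolved) / 2**20

print(round(render_targets_mb(2560, 1600, 8)))  # 266 (MB), before any textures
```

Even this simplified 8xAA case consumes over a quarter of an 896MB frame buffer, so heavy textures on top of it can push a game past the limit.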
 
I demand new architectures and real performance increases, not shuffled designs and unnoticeable 1-2 FPS changes. The entire point of video cards (for the end user) is better performance in a cooler, smaller, cheaper, and more reliable package. You can compromise on cool, small, and cheap for great performance, but doing the reverse--sacrificing performance for cheaper designs and greater profitability--only really translates to the HTPC market, and hurts your standing with the people for whom new video cards really matter (and who make the R&D behind new cards profitable).

That's why I don't really like the revision-happy way things are going. A cheaper, more stable old card is still an old card. Until the performance is significantly better than what I have, and I have the reasonable expectation that it won't die within a year, I have no reason to buy.
And that's exactly what you should do, speak with your wallet. As it stands, NVIDIA is still offering a product that matches the competition. Not buying any new cards will tell them you want something more.
do people who own 30" monitors buy gtx295's? IIRC it choked at that res because it didn't have enough memory for AA
It's a load of crap, don't know where that rumor started. Having 128MB less VRAM makes little difference, the drivers and speed of the RAM have a much higher impact on the performance.
 
And that's exactly what you should do, speak with your wallet. As it stands, NVIDIA is still offering a product that matches the competition. Not buying any new cards will tell them you want something more.

It's a load of crap, don't know where that rumor started. Having 128MB less VRAM makes little difference, the drivers and speed of the RAM have a much higher impact on the performance.
Having 128MB makes very little difference? Well, in general, yeah, but in some cases, if you go just over 896MB during a game, then having more RAM would be helpful. It's like being at the store and coming up a dollar short at the checkout.

also where did he say 896mb vs 1024mb in the first place? I think he just made a general statement about 896mb not being enough at 2560 for high levels of AA in SOME games which is true.
 
Why? My 295 will still work just as well as it does today when the new revision comes out.

Because this redesign will ultimately be a better product than the squished cards. It will be interesting, though, to see whether a 295 that dies would be replaced with a newly designed card or an older one.
 
Because this redesign will ultimately be a better product than the squished cards. It will be interesting, though, to see whether a 295 that dies would be replaced with a newly designed card or an older one.
they probably are testing this out for future multi gpu setups. we may end up with similar high end setups from both Nvidia and ATI the next round. by that I mean the high end from both camps will be a single pcb with two gpus.
 
they probably are testing this out for future multi gpu setups. we may end up with similar high end setups from both Nvidia and ATI the next round. by that I mean the high end from both camps will be a single pcb with two gpus.

Most likely. I'm sure this is a dry run for the GT300 series coming out later this year.
 
Having 128MB makes very little difference? Well, in general, yeah, but in some cases, if you go just over 896MB during a game, then having more RAM would be helpful. It's like being at the store and coming up a dollar short at the checkout.
The problem with that analogy is that RAM isn't stagnant like change. If you had nickels, dimes, and quarters rapidly flowing in and out of your pockets and all around, it's very easy to grab that extra $1, even if larger pants pockets would have made it easier in the first place (hmm... kind of felt like House there :p).

also where did he say 896mb vs 1024mb in the first place? I think he just made a general statement about 896mb not being enough at 2560 for high levels of AA in SOME games which is true.
His exact words:
do people who own 30" monitors buy gtx295's? IIRC it choked at that res because it didn't have enough memory for AA
And that rumor is a load of crap. I run 4x-16xAA in all my games with no problems, so the GTX 295 has more than enough RAM for AA. Now high AA is a different story (and it brings in a ton of variables, including specific game engines, drivers, and core design), but that's not what he said. Make sure you read what he said and not what you're thinking :cool:.
 
I fail to see your point. All I'm saying is that I'm actually more interested in seeing an actual PERFORMANCE increase rather than just the same card again, and GDDR5 could deliver that (it did for ATi). Of course, if you're happy with Nvidia churning out the same card over and over again... then to each his own, I guess :rolleyes:.

The point is simple. The performance of their cards can't be in question, because even with GDDR3 they are on par with, or faster than, ATI's cards in the same price range.

GDDR5 is expensive, and they won't use it in the GT200-based cards because they already have enough memory bandwidth, given their bus size.
 
Oh geez, here we go again.
Really there's already several multi-page debates about this exact issue. Everyone who has an opinion has already voiced it. Just look up the other threads and you'll find exact answers to that question.

Which is also precisely why we don't need more people complaining about what they call "rehashed" products...which is happening in this thread...
 