BFGTech GeForce GTX 295 @ [H]

nice subtle attempt at trying to start trouble

I'm not trying to start trouble; it's just that the review says in plain text that there is no noticeable difference in gameplay between the $500 and $400 cards, and two GTX 260s are still cheaper. The GTX 295 is like a scam for "enthusiasts" who lack common sense. Sure, an enthusiast card is supposed to be the pricey, top-of-the-line card that is still awesome. This isn't an enthusiast's card; this is just an idiot buy.
 
I'm not trying to start trouble; it's just that the review says in plain text that there is no noticeable difference in gameplay between the $500 and $400 cards, and two GTX 260s are still cheaper. The GTX 295 is like a scam for "enthusiasts" who lack common sense. Sure, an enthusiast card is supposed to be the pricey, top-of-the-line card that is still awesome. This isn't an enthusiast's card; this is just an idiot buy.
It's an option for those who don't have SLI boards, but you probably didn't think of that. Also, prices will drop, because ATI has said it will be dropping its 4870X2 price.
 
The dual-PCB cards interest me, as I am looking to build an mATX rig in a couple of months. They seem like a nice way to get the power I need to run new games without having to get an SLI mATX mobo (the GPUs take up all the card slots, so no sound card).

One game I think would be interesting to include is WoW: WotLK, as the graphics engine got some significant upgrades and many have noticed performance drops. Dalaran = slideshow.
 
Take some of the current benchmarks with a grain of salt; the GTX 295 is said to be running immature drivers at the moment, so I'd wait a few weeks to see what performance gains can be squeezed out with driver updates.
 
Some of us "enthusiasts" got a screaming deal on a 280 thanks to Live cashback and rebates and already have $450 to trade up.
 
I didn't miss anything; the following is what I was referring to. The total difference between the GTX 295 and a GTX 260 216SP SLI setup is actually 48 shaders, not 24.

Does the review clarify that it's a GTX 260 Core 216 SLI setup? Because from my understanding, it looks like a GTX 260 Core 216 SLI setup is even with the GTX 295.

Because if it does, I must've missed it.

Edit: Never mind, it looks like the GTX 260 Core 216 was cut from the review.
 
It's an option for those who don't have SLI boards, but you probably didn't think of that. Also, prices will drop, because ATI has said it will be dropping its 4870X2 price.

Stop right there, he's not going to get the logic you're trying to bring into his head.
 
I hate to be so cynical, but won't this thing be EOL in three months, just like the other GX2 cards? It's not rewarding for the customers who buy these expensive cards when you introduce a new architecture so soon after releasing a very expensive product. Isn't Nvidia bringing in a new architecture in the spring/early summer? As a long-time Nvidia owner, I would grab a 4870X2 over this.
 
I am happy with my purchase of 4870X2s in CF. I've had them since release, and it's nice to see that they compete so well. Catalyst 9.1 is said to introduce something nVIDIA has had and ATi hasn't (quad-core CPU support in the drivers). This should make the 4870X2 perform nearly identically to the GTX 295 in CPU-limited scenarios (lower resolutions), where it currently loses badly.
 
This should make the 4870X2 perform nearly identically to the GTX 295 in CPU-limited scenarios (lower resolutions), where it currently loses badly.

Performance at lower resolutions should not worry those who buy ultra high-end video cards like the GTX295 and the 4870X2.
I am far from being impressed by the GTX 295 and I will stick with my 4870X2 for now.
 
:confused: ... I don't know what's going on here, but the 4870X2 gets mopped up by the GTX 295 in every other review (not preview) I've read.
 
Okay, because I just bought a 4870X2 to go with my new XHD3000, and now I'm not sure if I should send it back for the GTX 295.
 
The only difference between GTX 260 SLI and the GTX 295 is that the GTX 295 has more stream processors per GPU (times two GPUs) than GTX 260s in SLI.

GTX 295 = 240 + 240 = 480
GTX 260 SLI = 216 + 216 = 432

Correct me if I'm wrong.
 
The only difference between GTX 260 SLI and the GTX 295 is that the GTX 295 has more stream processors per GPU (times two GPUs) than GTX 260s in SLI.

GTX 295 = 240 + 240 = 480
GTX 260 SLI = 216 + 216 = 432

Correct me if I'm wrong.
That's correct, plus the corresponding TMUs that go with those stream processors.
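
Just to spell the arithmetic out (a trivial sketch; the per-GPU stream processor counts are the published specs):

Code:
# Per-GPU stream processor counts (published specs)
gtx295_per_gpu = 240       # GTX 295 uses two full 240-SP GPUs
gtx260_216_per_gpu = 216   # GTX 260 Core 216

gtx295_total = 2 * gtx295_per_gpu           # 480
gtx260_sli_total = 2 * gtx260_216_per_gpu   # 432
print(gtx295_total - gtx260_sli_total)      # 48 shaders total, i.e. 24 per GPU

Which is the 48-not-24 point made earlier in the thread.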
 
Even though we just got a big batch of new games this past fall, it seems as though, once again, we are left wanting for games to actually challenge this high-end hardware.

It is difficult to be enthusiastic for a product that will essentially give you just another notch of AA. I suppose folders may find more value in it, but it is a gaming video card first, and for that purpose, it is decidedly ho-hum.

Well, there are Crysis Warhead, Far Cry 2, and STALKER Clear Sky, where this card is basically a necessity for even the basic 4xAA/16xAF at high resolution with max settings.
 
I hate to be so cynical, but won't this thing be EOL in three months, just like the other GX2 cards? It's not rewarding for the customers who buy these expensive cards when you introduce a new architecture so soon after releasing a very expensive product. Isn't Nvidia bringing in a new architecture in the spring/early summer? As a long-time Nvidia owner, I would grab a 4870X2 over this.

You would grab a worse card because the better card is going to be outdated?
That makes no sense.

I think both architectures are on their way out anyway, so it kind of cancels out.
 
Everything has been said, so I won't repeat.

But geez, for such a monster card, who decided to crap out on the design of the shell? LOL
 
That's correct, plus the corresponding TMUs that go with those stream processors.

Two things need to be understood about the GT200-based cards (and the G80/G92-based ones, for that matter).

GT200 is said to have 80 TMUs, which is true... unless you enable anisotropic filtering. As soon as you do, the effective rate drops to 40 TMUs (the same goes for floating-point texturing such as HDR and bloom effects).

And then there's the RBE (Render Back End): the GT200-based cards possess a lot of brute force in their ROPs, but this is nullified when true 8xAA (8 samples) is used, due to inferior Z/stencil performance. Therefore, with 8xAA (not CSAA), a GT200-based card will generally lag behind an RV770-based card at the same price point.

This is amplified even further when AF is used, as it nullifies the GT200's biggest strength (its texturing power).

Shader calculation power is 1.5x to 2x higher on RV770 than on GT200 (for shaders not limited by texturing, that is).
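
For what it's worth, here is a rough back-of-the-envelope on peak shader throughput (just a sketch in Python, using the commonly quoted reference clocks and unit counts, and leaving out GT200's hard-to-use MUL co-issue, which is an assumption on my part):

Code:
# Peak programmable-shader throughput from reference specs
def gflops(alus, shader_clock_ghz, flops_per_alu_per_clock):
    return alus * shader_clock_ghz * flops_per_alu_per_clock

gt200 = gflops(240, 1.296, 2)   # GTX 280: 240 SPs @ ~1.296 GHz, MAD only -> ~622 GFLOPS
rv770 = gflops(800, 0.750, 2)   # HD 4870: 800 ALUs @ 750 MHz, MAD        -> ~1200 GFLOPS

print(f"GT200 ~{gt200:.0f} GFLOPS, RV770 ~{rv770:.0f} GFLOPS, ratio ~{rv770 / gt200:.2f}x")

On paper that works out to roughly 1.9x; in practice RV770's 5-wide units rarely reach full utilization, which is why something closer to 1.5x is the more realistic end of the range.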

So, all in all, GT200 is a stopgap architecture. nVIDIA's next fully new architecture (GT300) should resemble RV770 more so than GT200/GT212, because DX11 brings hardware tessellation and the need for pure ALU performance over texturing performance.

For those who purchased RV770-based cards: you will be able to enjoy newer games for a bit longer, though not by much. Much like X19x0 vs. 79x0, where the X19x0 cards outperformed the 79x0 cards as time went on and newer titles were released.

RV870 vs. GT300 is where it's at.
 
Interesting points you have there, Eimols, but as usual I'm sure they will pretty much get pushed aside as irrelevant by some here.
 
Is there a review coming up for the GTX 285?

I'd like to see how it stacks up.

But I loved the review; it just sucks that we seem to get the best of the hardware at the end.

Are we done with the days when we get a card that is the champ for over a year, like the 8800 GTX/Ultra?

I'm probably going to wait to jump from my 280, but is it wrong to hope that the GT300 will be that good?
 
Is there a review coming up for the GTX 285?

I'd like to see how it stacks up.

But I loved the review; it just sucks that we seem to get the best of the hardware at the end.

Are we done with the days when we get a card that is the champ for over a year, like the 8800 GTX/Ultra?

I'm probably going to wait to jump from my 280, but is it wrong to hope that the GT300 will be that good?
The GTX 285 doesn't come out until next week, on the 15th, and the GT300 could be 6-9 months away. Don't get on the never-ending waiting merry-go-round.
 
Interesting points you have there, Eimols, but as usual I'm sure they will pretty much get pushed aside as irrelevant by some here.

It would appear so.

I still find it interesting how ATi went for a 4:1 ALU:TEX ratio while nVIDIA initially went 2:1, only to switch to 3:1 with GT200.
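
Here is where those ratios come from (a quick sketch; the unit counts are the commonly quoted ones, and counting ATi's 5-wide VLIW blocks as single shader units is my assumption about how the ratio is usually reckoned):

Code:
# ALU:TEX ratios from published unit counts
configs = {
    "G80/G92 (8800 GTX)": (128, 64),  # 128 SPs, 64 texture filtering units
    "GT200 (GTX 280)":    (240, 80),  # 240 SPs, 80 TMUs
    "RV770 (HD 4870)":    (160, 40),  # 800 ALUs / 5 = 160 shader units, 40 TMUs
}
for name, (alu, tex) in configs.items():
    print(f"{name}: {alu}:{tex} = {alu // tex}:1")

Which prints 2:1, 3:1, and 4:1 respectively.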

This indicates that even nVIDIA is aware that texturing is losing its significance and that ALU power will dictate future gaming performance.

I am really, really curious to see what nVIDIA has in store for us with GT300. GT212 looks to be a product that we ought to all ignore, seeing as its usefulness will be short-lived.

On the other end, RV870 seems to be a continuation of R600/RV670/RV770, but with added DirectX 11 bells and whistles and some more ALU performance. If this is the case, then the GT300/RV870 battle will be one of ingenuity on one end (nVIDIA) vs. experience on the other (AMD/ATi). Finally, something interesting, because I was getting bored.
 
Kyle, or anyone who has a GTX 295: what is the folding performance of this card (per GPU and for the card as a whole)? The power draw on these also seems a lot higher than on the 9800 GX2 cards: 700W for just two cards (GTX 295s) in the system.
 
I have dual 8800 Ultras right now, which I bought 1.5 years ago. Would getting one of these GTX 295s be a significant improvement for me at 1920x1200?

Rest of my specs:

680i SLI A1
QX6850 @ 3.33GHz
4GB PC-1066 RAM
1000W power supply
 
Has it been said why Nvidia doesn't support DX10.1 on the two newly released cards? I guess they could add support later?
 
Has it been said why Nvidia doesn't support DX10.1 on the two newly released cards? I guess they could add support later?

The usefulness of DX10.1 is pretty limited. To my knowledge, STALKER Clear Sky and Assassin's Creed are the only two games to support it. AMD claims that there are several more games forthcoming that will support it, but it has been less than crystal clear about what advantages DX10.1 is going to bring to those games. And the problem with relying on future games to justify a feature is that 80% of started game projects don't finish. So there is no guarantee that the games AMD promised would take advantage of DX10.1 will ever see the light of day.

I'm with NVIDIA on this one. All it ever was, in my opinion, was an advertising bullet point. And now that Windows 7 and DX11 are within sight, there is even less impetus to support it.

Edit: DX10.1 does seem to be providing a tangible advantage to ATI users in STALKER Clear Sky. And that's great, if you are still playing that game. If not, you may find its usefulness somewhat less than stellar.
 
The usefulness of DX10.1 is pretty limited. To my knowledge, STALKER Clear Sky and Assassin's Creed are the only two games to support it. AMD claims that there are several more games forthcoming that will support it, but it has been less than crystal clear about what advantages DX10.1 is going to bring to those games.

I'm with NVIDIA on this one. All it ever was, in my opinion, was an advertising bullet point. And now that Windows 7 and DX11 are within sight, there is even less impetus to support it.

IMO DX10.1 did bring quite a few benefits; still, NVIDIA did right to skip it and go for DX11 instead.

Reminds me of the different flavors of DX8 and DX9 that were not supported either.
 
It would be nice to see an image comparing 16xCSAA on the GTX 295 to 8xMSAA on the 4870X2 in Fallout 3. I was never convinced by the CSAA quality difference over MSAA...
 
While the official launch date for the 285 is still a little bit away, the cards can easily be ordered from distributors (I'll have three in on Monday). What I would like to see is a matchup with a single card of each of the following:

4870x2
GTX280OC edition (such as the BFG OC2 or OCX)
GTX295
GTX285

That, I think, would be the most useful to the average reader; then, for fun, do a test with:

SLI 2x GTX295
SLI 2x GTX285
SLI 2x GTX280OC
Crossfire 2x 4870x2

I guess I could try running some tests myself, as come Monday I will have just enough cards on hand for system builds to try all of those combinations, but I happen to stink at doing benchmarks, so it's best to let somebody else do that part.
 
The usefulness of DX10.1 is pretty limited. To my knowledge, STALKER Clear Sky and Assassin's Creed are the only two games to support it. AMD claims that there are several more games forthcoming that will support it, but it has been less than crystal clear about what advantages DX10.1 is going to bring to those games. And the problem with relying on future games to justify a feature is that 80% of started game projects don't finish. So there is no guarantee that the games AMD promised would take advantage of DX10.1 will ever see the light of day.

I'm with NVIDIA on this one. All it ever was, in my opinion, was an advertising bullet point. And now that Windows 7 and DX11 are within sight, there is even less impetus to support it.

Edit: DX10.1 does seem to be providing a tangible advantage to ATI users in STALKER Clear Sky. And that's great, if you are still playing that game. If not, you may find its usefulness somewhat less than stellar.

nVIDIA's lack of DX10.1 support is a political move. They don't have the architecture to support it. Their GT200-based cards, with their 240 SPs, lack the mathematical ALU power to perform hardware tessellation (just look at their lackluster dynamic branching, vertex shading, or geometry shading performance as an indicator).

If nVIDIA moved to DX10.1, games would replace conventional MSAA with hardware tessellation to smooth out jaggies. What would then occur is near-free AA on AMD RV770-based cards and no need for GT200's powerful RBE (ROP performance).

DX10.1 gives RV770-based cards a clear advantage over GT200.
 
This card honestly doesn't look bad at all, especially for those who cannot run an SLI setup or who want higher performance with lower power usage. It appears to edge out the 4870X2 by a sizable margin in most other benchmarks. Granted, at $100 less, the 4870X2 is probably the better deal.
 
It would be nice to see an image comparing 16xCSAA on the GTX 295 to 8xMSAA on the 4870X2 in Fallout 3. I was never convinced by the CSAA quality difference over MSAA...

Well, it actually is lower quality than MSAA, because it uses fewer color/Z samples than the MSAA equivalent. It is arguable whether the difference is noticeable, though, at the same number of coverage samples.
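
For reference, here are the commonly quoted sample counts for the two modes being compared (my understanding of NVIDIA's G80+ CSAA modes, so treat the exact figures as an assumption):

Code:
# Sample counts per pixel for the modes discussed above
modes = {
    "8xMSAA (8xQ)": {"color_z": 8, "coverage": 8},
    "16xCSAA":      {"color_z": 4, "coverage": 16},
}
for mode, s in modes.items():
    print(f"{mode}: {s['color_z']} color/Z samples, {s['coverage']} coverage samples")

So 16xCSAA trades half the color/Z samples of true 8xMSAA for twice the coverage samples, which is exactly the "fewer color/Z samples" point above.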
 
You would grab a worse card because the better card is going to be outdated?
That makes no sense.

I think both architectures are on their way out anyway, so it kind of cancels out.

No, it doesn't cancel out. The 4870X2 is cheaper and was released months ago; AMD isn't releasing a new architecture three months after the launch of a $500 card. Plus, this card looks slapped together (like the last few GX2s) and doesn't warrant the higher price for a slight increase in performance. ATI will be dropping the price even lower, too. Once again ATI will win the price/performance battle while Nvidia sticks to its expensive monolithic design with a 448-bit bus. ATI has Nvidia in a panic, and this card is proof they are being reactive instead of proactive. I will await Nvidia's next architecture with great anticipation.

Will they continue down the same path of 512-bit memory buses and gigantic dies, or will they smarten up and realize that if they continue this way they will hemorrhage the last four years' worth of profits?
 