8600GTS 3DMark06 claim

But we are going from this generation's midrange to next generation's midrange. I would not consider a card in the range of a 7900 GS to be high end.

You may not, but it is nonetheless; it is basically the bridge point between the 7600 and 7900 lines, the lowest of the 7900 line, with performance no stock 7600 can achieve.

Regardless, a 7900 GS will still be outperformed by the 8600 GTS, just as the 6800 GS was outperformed by the 7600 GT. Not 2x as fast, mind you, which would be unrealistic, but a comfortable boost.
 
Maybe so, but the 6800 and 7600 were very close architecturally. And it looks like any fears about 8600GTS overclocking are unwarranted, based on what we've heard so far. I would be very surprised if the 8600GTS doesn't stomp all over the 7900GS at settings that are actually playable on both cards. The efficiency of the core and the single-cycle 4xAA of G80 could well make up for the 128-bit bus deficit. Not to mention that the 8600GTS probably has more than 2x the texture filtering power of G70, minus the texture crawling and filtering optimizations. I think you will get higher performance, but you're obviously paying for the other improvements as well. There will be certain things the 7900GS simply cannot do that the 8600GTS will.

From what I've seen so far, you guys seem to be overestimating what 128-bit can do. A 256-bit bus provides better performance, and by more than you may think. The 6800GS and the 7600GT are architecturally similar, but the 6800GS can play games at the same settings as the 7600GT more smoothly at higher resolutions. Apparently half of you haven't used a 7600GT and a 6800GS side by side. The difference is apparent, and you can tell.

Overclocking your graphics card is about as easy as it gets. Buying a 6800GS and not overclocking it is like buying a Pentium D 930 with a high-end motherboard and not overclocking that either... I think it's foolish to buy a 6800GS and leave it stock. The 6800GS has more overclocking headroom than the 7600GT, and I'd pick up a 6800GS over my 7600GT any day. Goes to show what high clocks on top of a wide memory interface can do...

However many of you are in denial, that 128-bit bus will hurt the card, no matter what clocks it runs at. It has almost changed my mind about whether to get an 8600GTS or an 8800GTS, and I'm seriously considering putting up a little more money for a 320-bit card. I've had it with 128-bit. Hell, even my X800 had 256-bit...
 

Shader power increases performance a lot more than widening the bus would on mainstream video cards, and it's not unusual for newer 128-bit cards to overpower older 256-bit cards.

http://techreport.com/reviews/2006q1/geforce-7600-7900/index.x?pg=8

Your experience isn't borne out by any of the review sites, which show the 7600 GT as the quicker card at most resolutions people play at. By the time memory bandwidth actually becomes an issue, you're running at FPS levels that are pretty much unplayable anyway.

That's you; for the majority who don't overclock, the 7600 GT is the better choice, as it is faster out of the box and runs much cooler than the 6800 GS.

There is also, again, the memory speed difference: the 8600 GTS with 2 GHz memory will have about 70% of the bandwidth of the 7900 GS. What matters is shader power, and the 8600 increases that tremendously. Memory bandwidth matters somewhat less, and at 1280x1024 it won't run into bottlenecks, which is the resolution this card was designed for.

If I were buying a video card to overclock, the 6800 GS might be the better choice, but for most people it is not.

Any deficiency from the roughly 30% reduction in bandwidth will be compensated for by the increase in shader power, which has a larger impact on performance than memory bandwidth does.
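That bandwidth figure is easy to sanity-check: peak bandwidth is just bus width in bytes times the effective transfer rate. A minimal sketch, where the 2 GT/s 8600 GTS figure comes from this thread and the ~1.32 GT/s effective rate for a stock 7900 GS is my assumption:

```python
def bandwidth_gbps(bus_bits, effective_gtps):
    """Peak memory bandwidth in GB/s: bytes per transfer times transfer rate."""
    return bus_bits / 8 * effective_gtps

gts_8600 = bandwidth_gbps(128, 2.00)  # 128-bit at 2.0 GT/s -> 32.0 GB/s
gs_7900 = bandwidth_gbps(256, 1.32)   # 256-bit at ~1.32 GT/s -> ~42.2 GB/s
print(f"ratio: {gts_8600 / gs_7900:.0%}")  # roughly 76%, in line with the ~70% quoted
```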

If you want to game at high resolutions, you need an 8800 Series card. Nvidia never designed the 8600 with high-resolution gaming in mind, but rather with adequate performance at the midrange levels that were the realm of the high end last generation. If you want high-resolution gaming, you go to the 8800 Series, with its wider memory interfaces.
 
From what I've seen so far, you guys seem to be overestimating what 128-bit can do. A 256-bit bus provides better performance, and by more than you may think. The 6800GS and the 7600GT are architecturally similar, but the 6800GS can play games at the same settings as the 7600GT more smoothly at higher resolutions. Apparently half of you haven't used a 7600GT and a 6800GS side by side. The difference is apparent, and you can tell.

Overclocking your graphics card is about as easy as it gets. Buying a 6800GS and not overclocking it is like buying a Pentium D 930 with a high-end motherboard and not overclocking that either... I think it's foolish to buy a 6800GS and leave it stock. The 6800GS has more overclocking headroom than the 7600GT, and I'd pick up a 6800GS over my 7600GT any day. Goes to show what high clocks on top of a wide memory interface can do...

However many of you are in denial, that 128-bit bus will hurt the card, no matter what clocks it runs at. It has almost changed my mind about whether to get an 8600GTS or an 8800GTS, and I'm seriously considering putting up a little more money for a 320-bit card. I've had it with 128-bit. Hell, even my X800 had 256-bit...

I was on the "architecture is more important than memory bus" side of the fence, but your well-made points based on hands-on experience have definitely shifted my perspective. Not that I'll quit hoping that the 8600s are little giant-killers, but good post.

And like you say, one solution is to get both better architecture and wider bus by moving up the ladder a little. It's just such a shame that the 8800GTS 320 has all the power of the big boys but can't quite use it because of the sub-512MB frame buffer. Although, IIRC, there are some new "lower high-end" cards coming with more memory--8900GS or GT or something?
 

When you do that, though, you raise the price you pay in the end, so having your cake and eating it too isn't free. There isn't enough data yet on whether the 8900 Series even exists or when it will arrive; Nvidia may just skip to the GeForce 9 Series, a la the 6800 to 7800.

To really game at high resolution with minimal bottlenecks, you need to shell out the cash for the 8800 GTS 640.

You also have to keep in mind that it won't be that often that the 8800 GTS 320 outperforms the 8600 GTS at more standard resolutions, and certainly not by the 40-50% gap their MSRPs would suggest.
 

The MSRP of a 256MB 8600GTS is around $200, I think. I can find an 8800GTS with rebates for $240-250. I think $50 more for 320-bit over 128-bit is a pretty good tradeoff. Plus, an 8800GTS scores 9000 in 3DMark06 at stock:

http://www.xsreviews.co.uk/reviews/graphics-cards/msi-8800gts-320mb/5/

and the 8600GTS seems to score around 6000 in 3DMark06:

http://www.xtremesystems.org/forums/showpost.php?p=2086617&postcount=29

Quite a gap for $50 extra, wouldn't you say?
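One way to frame that gap, using only the round numbers quoted above (both the scores and the street prices are approximate):

```python
# 3DMark06 points per dollar, from the rough figures quoted above:
# ~9000 marks at ~$250 after rebates vs ~6000 marks at ~$200 MSRP.
cards = {
    "8800 GTS 320": (9000, 250),
    "8600 GTS": (6000, 200),
}
for name, (score, price) in cards.items():
    print(f"{name}: {score / price:.0f} points per dollar")
```

By this crude measure the 8800 GTS 320 works out to 36 points per dollar against 30 for the 8600 GTS, on top of the absolute performance gap.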

No offense, but looking at the system in your signature, do you have extensive experience with high-end or midrange systems and graphics cards?
 
From what I've seen so far, you guys seem to be overestimating what 128-bit can do. A 256-bit bus provides better performance, and by more than you may think.

Of course a 256-bit bus would provide better performance in certain scenarios. But I think you are also overestimating the advantage a wider bus would bring to a part like the 8600GTS. It's about balance: how much faster on average would the card be after doubling the bus?

There's also the cost argument, of course. Just because aging 256-bit chips are cheap at end-of-life doesn't mean that new architectures can roll out 256-bit chips at the same low price point. Those EOL architectures have already paid for themselves, so a little inventory clearing at the end is fine. I think the $200 segment is going to stay at 128-bit until one IHV bites the bullet and goes wider because it is forced to. From what I can see, a 128-bit bus serves well at the resolutions where these cards belong.
 
The MSRP of a 256MB 8600GTS is around $200, I think. I can find an 8800GTS with rebates for $240-250. I think $50 more for 320-bit over 128-bit is a pretty good tradeoff. Plus, an 8800GTS scores 9000 in 3DMark06 at stock:

http://www.xsreviews.co.uk/reviews/graphics-cards/msi-8800gts-320mb/5/

and the 8600GTS seems to score around 6000 in 3DMark06:

http://www.xtremesystems.org/forums/showpost.php?p=2086617&postcount=29

Quite a gap for $50 extra, wouldn't you say?

No offense, but looking at the system in your signature, do you have extensive experience with high-end or midrange systems and graphics cards?

Nice try, but poking fun at the system in my signature only weakens your argument. It's an invalid comparison, to say the least: the MSRP of the 8800 GTS is $299, and the MSRP of the 8600 GTS is $199. You will have to wait for the two cards' actual retail pricing to normalize before you can compare.

It's also exactly where I expected it to be: 7950 GT-level performance at mainstream prices. Considering the MSRP of the 7950 GT was $299, this is just about right.

Let me explain why this is the case: the 6800 Ultra was released back in April 2004, while the new midrange performer that outperformed it at an MSRP of only $199 came out in March 2006, roughly 23 months later.

It is now almost April 2007, almost 11 months since the initial launch of the 7900 GTX and about 22 months after the launch of the 7800 GTX 256. The new midrange performer, launching within the next 3 months give or take, should outperform the 7800 GTX 256, and given its 7950 GT performance levels, it indeed does.
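The cadence math in the two paragraphs above checks out; the 6800 Ultra (April 2004) and 7600 GT (March 2006) dates are from this thread, while June 2005 for the 7800 GTX 256 launch is my assumption:

```python
def months_between(y1, m1, y2, m2):
    """Whole calendar months from (y1, m1) to (y2, m2)."""
    return (y2 - y1) * 12 + (m2 - m1)

print(months_between(2004, 4, 2006, 3))  # 23 months: 6800 Ultra -> 7600 GT
print(months_between(2005, 6, 2007, 4))  # 22 months: 7800 GTX 256 -> ~April 2007
```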
 
Of course a 256-bit bus would provide better performance in certain scenarios. But I think you are also overestimating the advantage a wider bus would bring to a part like the 8600GTS. It's about balance: how much faster on average would the card be after doubling the bus?

There's also the cost argument, of course. Just because aging 256-bit chips are cheap at end-of-life doesn't mean that new architectures can roll out 256-bit chips at the same low price point. Those EOL architectures have already paid for themselves, so a little inventory clearing at the end is fine. I think the $200 segment is going to stay at 128-bit until one IHV bites the bullet and goes wider because it is forced to. From what I can see, a 128-bit bus serves well at the resolutions where these cards belong.

Sometimes you get cut-down high-end cards at the $199 price point, like the X1800 GTO, the X800 GT, and the 7900 GS.

But for the mainstream core, I agree they will stick to a 128-bit interface, while a performance-mainstream variant, like the NV42 or the RV670, will likely remain the kind of card that gets the wider memory interface.
 
Nice try, but poking fun at the system in my signature only weakens your argument. It's an invalid comparison, to say the least: the MSRP of the 8800 GTS is $299, and the MSRP of the 8600 GTS is $199. You will have to wait for the two cards' actual retail pricing to normalize before you can compare.

It's also exactly where I expected it to be: 7950 GT-level performance at mainstream prices. Considering the MSRP of the 7950 GT was $299, this is just about right.

Let me explain why this is the case: the 6800 Ultra was released back in April 2004, while the new midrange performer that outperformed it at an MSRP of only $199 came out in March 2006, roughly 23 months later.

It is now almost April 2007, almost 11 months since the initial launch of the 7900 GTX and about 22 months after the launch of the 7800 GTX 256. The new midrange performer, launching within the next 3 months give or take, should outperform the 7800 GTX 256, and given its 7950 GT performance levels, it indeed does.

I mentioned your system because you do not seem to have first-hand experience using 128-bit cards versus 256-bit. Unless you can tell me first hand how 128-bit cards handle in normal games compared to cards with more bandwidth, I don't see how you can even argue this. A 7600GT has high clocks, but the 128-bit limit still hurts its performance by a decent amount, not just the small amount you may be thinking of.

Ah yes, it is a MIR, and the price before it is $280-300, as you have noted. That is correct; I won't try to hide it. But when I look for graphics cards, personally, I am willing to fork out more dollars if I know there is more potential in a card and that it is not hindered by silly bottlenecks such as a 128-bit bus. By your own argument, though, when Nvidia releases the rest of the 8 series on April 17th (or whenever they release it), it's a pretty good bet that 8800GTS prices will drop as well. So I guess we'll just have to wait, huh? :)

Also, you must remember the X1600 fiasco. The card benched really well and beat a 7600GT in 3DMark05, but under real-world conditions the 7600GT slaughtered the X1600 in every other benchmark. That 128-bit bus may bench well, but let's see the real-world performance. I am already feeling the effects with my 7600GT: it scores around 7k in 3DMark05, but I still get intermittent lag and have to run lower resolutions than I would with a 6800GS, which can run the same settings at a higher resolution with smoother gameplay. The 7950GT has more memory bandwidth, allowing you to play at higher resolutions while maintaining the same settings, and it has a decent number of pipes. The 8600GTS will have higher clocks but be held back by the 128-bit bus, which is detrimental to anyone who wants to play at higher resolutions.

By the way, your MSRP style of determining older cards' price-to-performance ratios is off. You can purchase 7950GTs for around $220 without rebates or anything. eVGA still rates its 6800GS at an MSRP of around $230, but I can find them for $100-130. Close to MSRP, right?
 
So an 8600GTS will perform similarly to a 7900GT, but with slightly better graphics and a bit lower power consumption?
Is that a correct conclusion?
 
Thing is, at the common widescreen resolution there won't be an upgrade, due to the memory bus restrictions. Essentially we would be paying just for DX10, not for an upgrade in performance.
If you want an upgrade in performance, spend the money. If you want DX10, get one of these and spend less.

Either way, you are essentially asking for something for nothing, which just doesn't happen.
And I'm not sure what you're complaining about. Nobody expects the 8600GT to destroy a 7900GT. That's the job of the G80, not the G84/G86 or whatever it is.
 

So if someone expects an upgrade when they go from a previous-gen midrange card to a current-gen midrange card, they are wrong?
 
I own both an eVGA 6800 GS and a PNY 7600 GT. I will say that the 6800 GS certainly feels smoother at higher resolutions with AA. I was running FEAR at 1280x1024, and the 6800 GS pumps out a steady, smooth frame rate. The 7600 GT will often give higher frame rates, but with lots of action on screen the frames plummet as low as 15 (something my 6800 didn't do). I don't know how much 256-bit will affect the new-gen cards, but the difference can definitely be felt between the 7600 GT and the 6800 GS.

BTW, I was planning to upgrade to the 8600 GTS, but I don't want to deal with this 128-bit BS, so I think I may just go 8800 GTS 320.
 
I don't know how you guys are coming to the conclusion that the 6800GS beats the 7600GT. I currently run a 7600GT and previously had a 6800GS unlocked to 16 pipes and overclocked to Ultra levels, so basically we're comparing a 6800 Ultra against a 7600 GT overclocked to levels below it.

Benchmarks at many reputable sites have the 7600GT doing better than the 6800 Ultra, never mind a stock 6800 GS.

I run the 7600GT at 1680x1050 in games, compared to 1280x1024 for the 6800GS, at much smoother framerates (I am addicted to the Fraps framerate corner).
 


Your 6800 GS was AGP, right? The AGP 6800 GS is vastly slower than the PCIe version, which I have. I believe the AGP ones have a 350 MHz core clock while the PCIe has 500-550. Since I have both cards, I can confirm that when overclocked to the max (no voltmod) the 7600 GT outruns the GS by a small amount, but what I am talking about is the smoothness of the gameplay. Due to the 128-bit bus, the 7600 GT struggles at higher resolutions (stuttering, random slowdowns, etc.); the 6800 GS is simply smoother. I am just worried the 8600 GTS will suffer from this same problem.

BTW, it is very easy to OC a PCIe 6800 GS well past 6800 Ultra speeds. When voltmodded, they can score 8500+ in 3DMark05.
 
I am just worried the 8600 GTS will suffer from this same problem.
There is no point in worrying about such a thing; no one knows yet. Can someone just give me a simple yes-or-no answer to my question in post #52?
 

You answered your own question. We are debating whether the 128-bit bus will hold the card back, and you're questioning two people who have experience with exactly this. I don't understand why you guys don't believe it. I've got a 7600GT voltmodded to 750/1800 now, and it's not very spectacular; I still get random stuttering like what Cupholder2.0 describes. No matter how high the clocks are, the 128-bit bus will still bottleneck the card. That is a fact you can't deny, unless you're foolish enough to do so.
 
It's true, I'm asking you because you have experience, but you don't give me a satisfying answer, and I cannot see where I answered myself. I must be too stupid. :)

The 8600GTS will perform similarly to a 7900GT but with slightly better graphics and a bit lower power consumption???
Yes or no, please.. :D

Yes or no, yes or no, yes or no, yes or no.
 

Depends on your resolution.
 
The temperature next week will be higher than today--yes or no, yes or no, yes or no?

Prediction of unknown factors is a fool's game--rough estimation is the best anyone can achieve. Wait to see benchmarks when the hardware ships, or if you prefer, place absolute faith in the relevance (not to mention accuracy) of the leaked benchmarks. No one here can offer you any more than that.
 
Hmm.
This 8600GTS would be the ideal card for my HTPC... (currently have an onboard X1250... it sucks...)
Low power consumption, DX10, PureVideo HD, HDMI, possibly a passive cooling solution.

But as my HTPC is currently my only rig in the house, I have to ask this:
Can I assume an 8600GTS would do just fine with current/next-gen titles at 1280x1024, maxed out, with low AA/AF? (I don't care so much for AA/AF... coming from a PCIe 6800NU, I never used it anyway.) Would the 512MB version be useful? Or could it be restricted by the 128-bit memory bandwidth? Games I would play are Oblivion with Qarl's Texture Pack 3 and STALKER.

Otherwise I have to look into fitting an 8800GTS 320 (only 320MB then, though :( ) into my Antec Fusion..

Thank you.

BTW, I will still wait for actual benches on 17/04 with matured drivers etc... I just want an idea of what to expect :)
 
Well dang, that makes me unhappy. I was hoping for higher scores because I am going to be playing games at 1680x1050 and was really looking forward to getting that card. Now it seems I'll have to get the 8900GTX (or whatever you all think it's going to be called).
 
It's true, I'm asking you because you have experience, but you don't give me a satisfying answer, and I cannot see where I answered myself. I must be too stupid. :)

The 8600GTS will perform similarly to a 7900GT but with slightly better graphics and a bit lower power consumption???
Yes or no, please.. :D

Yes or no, yes or no, yes or no, yes or no.

You start pushing resolutions up, and that 8600GTS will start to hurt. If it's equal to a 7900GT in performance, then at 1280x1024 and above the 7900GT will start overtaking you. You may get random spurts of higher FPS, but the 7900GT will be smoother.

Also, you said none of us know for sure and we only work from assumptions; thus, if you wanted an accurate answer, you answered your own question.
 
I would say the 8600 GTS will be closer to the 7950 GT in terms of performance. Of course, this is only based on some 3DMark06 scores I have seen.
 
Sounds about right. Midrange cards should be on about the same performance level as the high-end cards of the previous generation.

Not quite on par; I'm getting 8000 3DMark06 marks in Vista with my previous-generation 7950GX2 :)
 
The temperature next week will be higher than today--yes or no, yes or no, yes or no?

Prediction of unknown factors is a fool's game--rough estimation is the best anyone can achieve. Wait to see benchmarks when the hardware ships, or if you prefer, place absolute faith in the relevance (not to mention accuracy) of the leaked benchmarks. No one here can offer you any more than that.

With some experience and education, you find that predicting things becomes more accurate. We predict the weather on a daily basis; I wouldn't call that a fool's game at all.

You might not be able to gauge an entirely accurate view of the hardware's performance, but you don't need it exact; a rough idea of performance relative to other hardware is enough.
 

We predict the weather on a daily basis and we are wrong roughly half the time, even with satellite tracking and computer models. Don't believe me? Try matching the forecast to the actual results every day for a week or so and see how well they do. You'll be surprised.

As I said in my own post, "a rough idea of relative performance" is possible--you and I are in agreement there. Tommten had already been given that, and he was insisting on more specifics, which is impossible.

For those who wish to expand their minds further on the subject of prediction vs. human fallibility, I recommend the following presentation by Michael Crichton, Harvard-educated scientist and author of Jurassic Park, etc.:

http://www.crichton-official.com/speeches/speeches_quote07.html

I also highly recommend the other articles available on his site.
 
The temperature next week will be higher than today--yes or no, yes or no, yes or no?

Prediction of unknown factors is a fool's game--rough estimation is the best anyone can achieve. Wait to see benchmarks when the hardware ships, or if you prefer, place absolute faith in the relevance (not to mention accuracy) of the leaked benchmarks. No one here can offer you any more than that.

http://www.xtremesystems.org/forums/showpost.php?p=2086617&postcount=29

You know, cooleraler is one of the most respected guys on XS, and he gets a lot of ES stuff before it hits the market.
 
We predict the weather on a daily basis and we are wrong roughly half the time, even with satellite tracking and computer models. Don't believe me? Try matching the forecast to the actual results every day for a week or so and see how well they do. You'll be surprised.

With only 50% accuracy, I believe we should fire all of the weathermen, as they are liars half the time and can't predict anything! Get rid of them. This quote has enlightened me, and I must now start a petition to get rid of all of these blatant liars. In fact, let's shut down the Inquirer, MSNBC, CBS, and Fox along with the weathermen. :p lol

We could say that the temperature diodes are reading incorrectly. It is true that these devices have tolerances; otherwise production yields would be vastly reduced by the near-perfect quality standard they would have to meet. There's plenty more BS where that came from, and you can keep questioning the validity of everything over and over again... but from what we've seen so far, we can make a pretty good assumption of what's to come. We've seen 128-bit cards in action against 256-bit and 320-bit, so we have enough information to make a good assumption about the card's performance across resolutions. From the 7600GT/6800GS comparison, we can see the effects of higher clocks with reduced bandwidth. We have tried unified shaders with the 8800s, so we can predict a bit of performance gain from the change in architecture. Given that Nvidia will release this GPU as midrange/high end, we can place it at around 7900GS to 7950GT performance. After looking at cooleraler's testing and overclocking, I'd say that's about right.

You're looking at this from an extremely pessimistic view. Glass half empty, right?
 

Once again, I said "rough estimation is the best anyone can achieve." That doesn't mean I think rough estimation is worthless; I just think we need to be realistic about its limits--realistic, not pessimistic.

See the post by Tommten that I was speaking to in my post--he wanted someone to give him a specific answer on exactly which 79XX cards the 8600GTS will outperform, and under what scenarios. That's impossible, and he good-naturedly admitted as much in his reply to my post.

Re: your comments about MSNBC, CBS, and Fox: try reading the article I linked with your mind open and your thinking cap on; then you can LOL.
 
I mentioned your system because you do not seem to have first hand experience between using 128-bit bus cards and 256-bit. Unless you can tell me first hand how 128-bit cards handle in normal games compared to cards with larger bandwidths, I don't see how you can even be arguing this. A 7600GT has high clocks, but being limited by 128-bit still hurts the performance by a decent amount, not just the small amount you may be referring to.

Ah yes, it is a MIR, and the price before it is 280 - 300, as you have noted. That is correct. I will not try to hide the fact. But when I look for graphic cards, personally, I am willing to fork out more dollars if I know that there is more potential in a card, and that it is not hindered by silly bottlenecks such as the 128-bit bus. But according to your argument, when nvidia releases the rest of the 8 series on April 17th (or whenever they are going to release it), it's a pretty good bet that the 8800GTS prices will drop as well. So I guess we'll just have to wait, huh? :)

Also, you must remember the X1600 fiasco. That card benched really well and beat a 7600GT in 3DMark05, but under real-world conditions the 7600GT slaughtered the X1600 in every other benchmark. A 128-bit bus may bench well, but let's see the real-world performance. I am already feeling the effects with my 7600GT: it scores around 7k in 3DMark05, but I still get intermittent lag and have to run lower resolutions than I would with a 6800GS, which can run the same settings at a higher resolution with smoother gameplay. The 7950GT will have larger memory bandwidth, letting you play at higher resolutions while maintaining the same settings, plus a decent number of pipes. The 8600GTS will have higher clocks but be held back by the 128-bit bus, which is detrimental to anyone who wants to play at higher resolutions.

By the way, your MSRP-style way of determining older cards' price-to-performance ratios is off. You can purchase 7950GTs for around 220 without rebates or anything. eVGA still rates its 6800GS at an MSRP of around 230, but I can find them for around 100 - 130 dollars. Close to MSRP, right?
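To put that depreciation in numbers, here's a trivial sketch (the $115 figure is just the midpoint of the $100 - 130 range quoted above, an assumption rather than a sourced price):

```python
# Street price as a percentage of MSRP, using figures quoted in this thread.
def pct_of_msrp(street_price, msrp):
    """Return street price as a percentage of the card's MSRP."""
    return 100.0 * street_price / msrp

# 6800GS: eVGA's listed MSRP ~$230, street ~$100-130 (midpoint $115 assumed)
print(f"6800GS sells at {pct_of_msrp(115, 230):.0f}% of its ~$230 MSRP")
```

In other words, the 6800GS is selling at roughly half its listed MSRP, which is why comparing cards by MSRP alone can mislead.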

First-hand experience of yours cannot be considered unless you can find a site online that verifies your results; I do not see anything that supports your data so far. By the time the memory bandwidth issue comes into play, you run into scenarios where you don't get playable settings, or you're outside the design resolutions for this card to begin with. The 128-bit interface has some limiting impact, but only enough to keep the card from touching the high end of this generation. I don't allow the use of information unless both of us have direct access to it (such as an independent website) and neither of us has any control over it.

You could potentially test the impact of the extra memory bandwidth alone if you could downclock something like a 7900 GS to 336 MHz or so (at that clock its 20 pipes would match the fill rate of the 7600 GT's 12 pipes at 560 MHz, isolating the bandwidth difference). In typical situations, though, the 7600 GT isn't hindered significantly, if at all, by the 128-bit bus. It is not realistic to expect this card to perform supremely well at higher resolutions, where memory bandwidth starts to have a more significant impact.
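For reference, the raw bandwidth numbers behind this back-and-forth can be sketched quickly. The clocks below are the commonly quoted reference specs (effective, i.e. double-pumped GDDR3 rates) and may differ on factory-overclocked boards:

```python
# Theoretical peak memory bandwidth = bus width (bits) / 8 * effective memory clock.
def bandwidth_gb_s(bus_bits, effective_mem_mhz):
    """Peak bandwidth in GB/s (decimal GB, as in marketing figures)."""
    return bus_bits / 8 * effective_mem_mhz / 1000

cards = {
    "6800GS  (256-bit, 1000 MHz eff.)": (256, 1000),
    "7600GT  (128-bit, 1400 MHz eff.)": (128, 1400),
    "7950GT  (256-bit, 1400 MHz eff.)": (256, 1400),
    "8600GTS (128-bit, 2000 MHz eff.)": (128, 2000),
}
for name, (bus, clk) in cards.items():
    print(f"{name}: {bandwidth_gb_s(bus, clk):.1f} GB/s")
```

Note that even with 2 GHz effective GDDR3, the 8600 GTS's 128-bit bus only brings it level with the old 6800 GS's bandwidth, which is exactly why the bus-width argument keeps resurfacing in this thread.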

The X1600 series was a much different issue: it was TMU-limited rather than shader-limited, and the 128-bit interface wasn't much of the problem there. They should have created a card that was 1/2 of the R520, not 1/4 of the R580 as the RV530 was; that is basically what killed it. The RV560, which is basically 1/2 of the R580, has great performance compared to the 7600 GT, but IMO it came out far too late.

In older games where you're not shader-limited, the scenario you presented may occur, and you get higher performance at higher resolutions where bandwidth does come into play. But considering current titles, the 8600 GTS is designed to run newer games at mainstream settings at higher performance, versus the somewhat diminished performance possible at higher resolutions. Of course, the thinking here is that neither the 7950 GT nor the 8600 GTS will get sufficient FPS for comfortable gameplay at those resolutions anyway, so it's a relative non-issue.

You also have to keep in mind there is more to the 7950GT vs 8600 GTS comparison than just the bandwidth issue; you still have to contend with the inferior AF implementation of the GeForce 7 series.

My MSRP style is just fine, thank you. :) The 7600 GT vs 6800 GS comparison uses normalized market-style pricing for both cards, so they can indeed be directly compared. You also have to keep in mind that while the 7950GT has fallen to ~230 or so, the 8600 GTS is going to be introduced at $199 USD, which means it has the potential to fall further in price than the 7950 GT does, due to reduced production costs on Nvidia's side. The 8600 GTS won't match the 8800 GTS 320, but that is expected; going by how far its score is from the 8800 GTS, I believe this core has only 48 unified shader processors to keep costs down, given its clock rates and performance levels. I would be surprised if it indeed had 64 shader units. I also don't expect the 8600 launch to have much impact on the 8800 series, as they target different segments.
 
So an 8600GTS will perform similarly to a 7900GT, but with slightly better graphics and a bit lower power consumption?
Is that a correct conclusion?

No, as the TDP target for the 8600 GTS is larger than that of the performance-mainstream G71 series.

The 8600 GTS will have higher power consumption than the 7900 GT, as DX10 functionality places a lot of additional core logic on the die, and the 80nm process cannot completely negate the power consumption increase. It should outperform the 7900 GT overall, though, except where memory bandwidth issues potentially come into play. We're talking 7950GT performance at standard resolutions, as long as the memory subsystem isn't too heavily saturated.

As someone has pointed out in this thread, it will also depend on the settings you use the card at, as well as the game.
 
Not quite on par; I'm getting 8000 3DMark06 marks in Vista with my previous-generation 7950GX2 :)

I don't think anyone is expecting the mid-range GeForce 8 to perform at that level; that would be far, far too close to the 8800 GTS 320.
 
I don't think anyone is expecting the mid-range GeForce 8 to perform at that level; that would be far, far too close to the 8800 GTS 320.

Yeah, I think you have to take the GX2 as a special case, especially where synthetic benchmarks are concerned. If the rumored 8600GTS score of 7000 is close to the regular 7900GTX's (I don't remember where that falls), that in itself qualifies as quite a generational leap, IMHO.
 