MSI High-End Video Card Comparo @ [H] Enthusiast

Status
Not open for further replies.
Tweakin said:
Great review, very in-depth. Sadly it makes me want to upgrade again and grab a wide screen monitor... :)

My only complaint (not really with the review) is the HDR topics hit in the Oblivion section of the review. I like the HDR side-by-side shots because they really hit home for me how much I don't like HDR in Oblivion. Am I really the only person? I can walk outside today, it's super bright and sunny, find a large rock in the yard and look at it. I am reasonably certain that rock will not be blinding to look at! For the limited good HDR seems to do for backgrounds in Oblivion, it absolutely destroys any type of hard surface for me. It's entirely personal preference though; I heard similar things about HDR in Lost Coast, but I really liked it.
Nice to know that I'm not the only one! I think they really need to do some tweaking with HDR yet. It brings out the colors by oversaturating them, and it blows out light sources to the point that even ambient light off rough rock is glaring. That's all fine if you want that effect for stylistic purposes, but not if the intent is greater realism (except in special cases like fog, misty nights, player has a bad hangover, or cataracts, ya know.)
 
Brent_Justice said:
FYI it took me 3 weeks to complete that evaluation :eek:


1680x1050 plz, I'd say more people have 20.1" or 20" monitors than 24"+ monitors, so that would be a more common widescreen res. :)
 
Chris_B said:
1680x1050 plz, I'd say more people have 20.1" or 20" monitors than 24"+ monitors, so that would be a more common widescreen res. :)

Honestly there isn't going to be much perf difference between 1680x1050 and 1600x1200; in fact, 1600x1200 has more pixels than 1680x1050 and is thus the more intense resolution. So if it is playable at 1600x1200, it will be at 1680x1050 as well.
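Brent's pixel-count point is easy to sanity-check with some quick arithmetic (a back-of-the-envelope sketch, not anything from the review itself):

```python
# Total pixels rendered per frame at each resolution.
widescreen = 1680 * 1050  # typical 20"/20.1" widescreen panels
uxga = 1600 * 1200        # the 4:3 resolution used in the review

print(widescreen)  # 1764000
print(uxga)        # 1920000

# 1600x1200 pushes roughly 9% more pixels per frame than 1680x1050,
# so anything playable at 1600x1200 should be at least as playable
# at the widescreen resolution.
print(uxga / widescreen)
```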
 
LittleMe said:
I'm a little upset that there was no mention of the 7900 in the conclusion. The X1900 XTX is stated as the "absolute best gaming performance" card out today, but you're comparing ATI's X1800 refresh to the non-refreshed 7800 series. At the least there should be some mention of the 7900 to make it a fair review. In no way am I saying the 7900 is going to blow the X1900 out of the water or that it's even faster, but you're not comparing apples to apples.


Would you like to buy Brent a 7900GTX for the review? It's not like they can go out and plunk down $500-600 for a flagship GPU so you can see how it compares in a couple of benchmarks!

If you want to see a 7900 series represented, send them yours so they can benchmark it.
 
I do not wish to sound mean, but I am very disappointed that the 7900 GTX was not included in the review. I am in the market for a new video card, and the top of the line from both companies should have been included.

If the product was not available the review should have been delayed until the hardware was ready. As it stands, this review is incomplete.
 
Agent_N said:
Would you like to buy Brent a 7900GTX for the review? It's not like they can go out and plunk down $500-600 for a flagship GPU so you can see how it compares in a couple of benchmarks!

If you want to see a 7900 series represented, send them yours so they can benchmark it.

That is just silly. The other cards reviewed are not cheap. If you cannot do a full review then hold off until you can.

That would be like reviewing AMD's top CPU against a PIII just because a P4 costs too much.
 
Coolmanluke said:
And what was I wrong about? The XTX and Fear? This is quoted from the MSI evaluation.

This is what you were wrong about
Perhaps they should have waited, but the 7900 doesn't beat the XTX in Oblivion or Fear.

The MSI evaluation did not include a 7900, so you cannot use that article to make claims about the 7900 and Fear. The top link shows the 7900 has the better performance in Fear.
 
Agent_N said:
Would you like to buy Brent a 7900GTX for the review? It's not like they can go out and plunk down $500-600 for a flagship GPU so you can see how it compares in a couple of benchmarks!

If you want to see a 7900 series represented, send them yours so they can benchmark it.

I'm sorry, I find it hard to believe they bought the cards they reviewed. Brent, please correct me if I'm wrong, but since they are all MSI cards, I'd see it as MSI sending them the cards. And besides, Agent_N, my three 7900 GTXs are eVGA, not MSI, so they wouldn't fit in with the MSI theme in this review.

Brent, I think overall, it was a great review, but I still believe it would have been better to have waited to include the 7900GTX.
 
Brent_Justice said:
Honestly there isn't going to be much perf difference between 1680x1050 and 1600x1200, in fact 1600x1200 has more pixels than 1680x1050 and thus is the more intense resolution. So if it is playable at 1600x1200 it will be at 1680x1050 as well.


You're just saying that because you're a slacker ;)
 
I just read through the review, and it was very good overall. However, just one thing bugged me. The self-shadows in Oblivion are a game engine bug, are they not? I'm pretty sure I've seen this problem on all cards, whether ATI or Nvidia. I understand that you probably didn't notice this on the 7800GTX because you didn't have soft shadows on, but could you confirm this and, if it appears on both cards, modify your review so it doesn't appear specifically as an "ATI issue"?
 
Thanks for the review, Brent

Nice to see a big review out there confirming the fact that Oblivion absolutely batters your graphics card. It amused me to see some people (I say people, I mean bullshit artists) saying they were getting 60-80fps in exterior locations at 12*10 with settings on max on a 7900GTX.

As an owner of a 7900GTX (690/1750) and a self-confessed Oblivion addict, I can confirm that you have no hope of that. I get 30fps most of the time, and it often drops to 20, and having seen a friend's XTX, his PC does just run it better :(

WTT 7900GTX
 
rinaldo00 said:
This is what you were wrong about


The MSI evaluation did not include a 7900, so you cannot use that article to make claims about the 7900 and Fear. The top link shows the 7900 has the better performance in Fear.

Anyone can find a benchmark or two showing the GTX beating the XTX in Fear. It took me all of two minutes to find this firingsquad review.

http://www.firingsquad.com/hardware/nvidia_geforce_7900_gt_gtx_performance/page14.asp

The overall consensus in most of the reviews is that the XTX outperforms the GTX in Fear. It's just one game, deal with it.
 
Ardrid said:


Did you read the links you posted? I'll summarize for you:

HardOCP results: XTX had HQ AF enabled while GTX only had plain AF enabled.
Anandtech: In every resolution above 1280x1024 the GTX loses. Not to mention they didn't even use any AF.
X-bit Labs: Same as anandtech the GTX loses at every resolution above 1280x1024. In fact Xbit's results show that 7900 GTX SLI gets slaughtered when AA goes above 4x.

Here are a few more reviews to add to the mix:
http://www.behardware.com/articles/612-3/nvidia-geforce-7900-gtx-gt.html
http://www.bit-tech.net/hardware/2006/03/16/asus_bfg_msi_geforce_7900_gtx_roundup/10.html (uses a standard GTX and more expensive OC'd version)
http://firingsquad.com/hardware/sapphire_blizzard_radeon_x1900_xtx/page10.asp
 
Since this thread has been thoroughly derailed from its initial attempt at discussing an incomplete (granted, it was incomplete due to the sponsor's lack of an up-to-date product line) at best comparison of MSI video cards, allow me to offer this: when comparing the 7900 GTX and the X1900 XTX, the results will vary because drivers and patches are released so often. As such, "Red vs. Green" results will flip-flop and each camp will claim victory. Both of these cards are top end and capable of really amazing graphics and frame rates. That being said, the end result is dual cards = fastest/best possible performance, right? This, I believe, is a one-sided event. Let me back out of the way as I submit this next link, as it may create a "Link-a-lanche". Watch your toes...



http://www.gamespot.com/features/6145814/index.html

<Point Green> ;)
 
T3ch said:
Since this thread has been thoroughly derailed from its initial attempt at discussing an incomplete (granted, it was incomplete due to the sponsor's lack of an up-to-date product line) at best comparison of MSI video cards, allow me to offer this: when comparing the 7900 GTX and the X1900 XTX, the results will vary because drivers and patches are released so often. As such, "Red vs. Green" results will flip-flop and each camp will claim victory. Both of these cards are top end and capable of really amazing graphics and frame rates. That being said, the end result is dual cards = fastest/best possible performance, right? This, I believe, is a one-sided event. Let me back out of the way as I submit this next link, as it may create a "Link-a-lanche". Watch your toes...



http://www.gamespot.com/features/6145814/index.html

<Point Green> ;)


Nah it's Gamespot, who reads their website for hardware reviews? If you want to gauge SLI/Crossfire comparison, go check out X-Bit's reviews, particularly when AA is raised above 4X. nVidia performance tanks while Crossfire continues to hum along at high framerates. If I were to go multicard, I'd at least want 6xAA or higher so for that reason, SLI is useless.
 
The 7900GTX barely leads in Oblivion. In fact, the margin by which it leads is so infinitesimally small that I don't know why you guys even bother making a fuss about it. The 7900GTX leads by a smaller margin than the 7800GTX trails behind the X1900XTX.

Quit your bickering and whining and go play Oblivion!

http://www.xbitlabs.com/articles/video/display/geforce7900gtx_13.html

Edit: quote from link above: "Note: Radeon X1900 XTX provides a much higher minimum performance rate than GeForce 7900 GTX."
 
5150Joker said:
Nah it's Gamespot, who reads their website for hardware reviews? If you want to gauge SLI/Crossfire comparison, go check out X-Bit's reviews, particularly when AA is raised above 4X. nVidia performance tanks while Crossfire continues to hum along at high framerates. If I were to go multicard, I'd at least want 6xAA or higher so for that reason, SLI is useless.

What good is 6xAA if the quality isn't there? ATI may have the superior AF ATM, but Nvidia's AA is superior in comparison. Maybe that's why Nvidia tanks with high AA? Because it does it better? Or completely. I get the feeling ATI is taking considerable shortcuts to keep the frames up. I have seen quite a few people complain about distance AA on ATI hardware. I wish I had the cash to do a side-by-side comparo, as I would really like to investigate this for myself. Not being argumentative here, but I just wanted to insert this particular variable. Xbit says there doesn't seem to be any IQ difference between the two, but I wonder how deep they are looking. I remember comparing a screenshot in FarCry with another member over at AT who used an X1800XL and another using an X800XL. We weren't going for speed here, just IQ. We all maxed our AA and AF, and the X800XL was inferior to the X1800XL of course, but neither was superior to the 7800GTX shot.

I think we were disputing a review that showed 7800GTX vs. X1800XT image quality. And the GTX looked HORRIBLE in the review's screenie. One guy was saying how crappy the GTX looked, so I took a screenie of my own at the same point in FarCry and it looked phenomenal. Better than the ATI card...
 
Terra said:
Give it a rest, will you?
If HardOCP does a review and nVidia looks good, the reds will scream:
"HardOCP is NVIDIA biased!"
If HardOCP does a review and ATI looks good, the greens will scream:
"HardOCP is ATI biased!"

Since they can't be both, I think they evaluate on the facts, but that will always leave one "side" feeling left out...

Terra - The review is fine...move along....
Damn you Terra, why did you go and post something I agree with? ;)

Fact is the review was good, even if ATI shined like a star.
 
ATI may have the superior AF ATM, but Nvidia's AA is superior in comparison

Wow, that's the first time anyone's said that... got a link showing NV's AA beating out ATi's? If you are going to measure with alphas included, well, that has always been a split road, with some people liking AAA and others liking TR. To tell you the truth, I haven't seen a lick of difference between their MSAA methods, and even AAA or TR doesn't get rid of all the jaggies.
 
R1ckCa1n said:
Damn you Terra, why did you go and post something I agree with? ;)

Sorry, I will try and keep that to a minimum ;)

Terra - Wouldn't want to give you a "meltdown" ;) *L*
 
keysplayr said:
What good is 6xAA if the quality isn't there? ATI may have the superior AF ATM, but Nvidia's AA is superior in comparison. Maybe that's why Nvidia tanks with high AA? Because it does it better? Or completely. I get the feeling ATI is taking considerable shortcuts to keep the frames up. I have seen quite a few people complain about distance AA on ATI hardware. I wish I had the cash to do a side-by-side comparo, as I would really like to investigate this for myself. Not being argumentative here, but I just wanted to insert this particular variable. Xbit says there doesn't seem to be any IQ difference between the two, but I wonder how deep they are looking. I remember comparing a screenshot in FarCry with another member over at AT who used an X1800XL and another using an X800XL. We weren't going for speed here, just IQ. We all maxed our AA and AF, and the X800XL was inferior to the X1800XL of course, but neither was superior to the 7800GTX shot.

I think we were disputing a review that showed 7800GTX vs. X1800XT image quality. And the GTX looked HORRIBLE in the review's screenie. One guy was saying how crappy the GTX looked, so I took a screenie of my own at the same point in FarCry and it looked phenomenal. Better than the ATI card...


Do you have any credible proof that nVidia's AA is better than ATi's? The only time nVidia has better AA is when you are using 8xSS. When comparing Super 14xAA to nVidia's 16xAA mode (both use supersampling), they look about equivalent, although HotHardware had this to say:
If you compare the image quality of NVIDIA's SLI AA modes to ATI's Super-AA modes, you'll see that they each produce very similar images. Overall though, we'd give an edge to ATI, but it's a darn close call.

Link: http://www.hothardware.com/viewarticle.aspx


There will always be subtle differences in how each card renders a scene; one scene may look better on one card and another on the other. Overall though, there isn't much of a visible distinction between nVidia and ATi when it comes to AA. However, the key difference is that nVidia's performance tanks at high resolution with SLI 16xAA while ATi's at 14xAA does not:

http://www.xbitlabs.com/articles/video/display/geforce7900gtx_16.html
http://www.xbitlabs.com/articles/video/display/geforce7900gtx_17.html

Obviously the nVidia cards are doing more work but the performance penalty is huge in some cases.
 
LittleMe said:
There are a few percentage points of improvement there, enough to put them on par. Besides, hasn't nvidia stopped making 7800 cores, and don't the 7900s cost less than what they replaced? It would have just been a more balanced review if they'd included the 7900 series or at least mentioned it in the conclusion.

I agree, reviews here have not been as good as history would lead us to expect. This is the site that without hesitation recommended the 7900GTX over the X1900XTX, so if their opinion is changing, let's see a return of head-to-head.
 
My past experience w/ MSI is average at best because they only offer a 1 yr. warranty.

How come their Nvidia card looks just like the XFX one?
 
LittleMe said:
I'm a little upset that there was no mention of the 7900 in the conclusion. The X1900 XTX is stated as the "absolute best gaming performance" card out today, but you're comparing ATI's X1800 refresh to the non-refreshed 7800 series. At the least there should be some mention of the 7900 to make it a fair review. In no way am I saying the 7900 is going to blow the X1900 out of the water or that it's even faster, but you're not comparing apples to apples.

Firing Squad did an Oblivion review with high-end cards, including the 7900GTX. The bottom line came down to the 7900GTX being faster at 1024x768 and lower resolutions with little or no AA/AF and HDR, but as soon as the eye candy was on, the X1900 series cards took the lead. In fact, there were instances of the X1800XT beating the 7900GTX when the resolution broke into the 1280x1024+ range and the eye candy was on. As such, I would say the [H] review was still quite accurate.

http://www.firingsquad.com/hardware/oblivion_high-end_performance/page2.asp
 
Happy Hopping said:
My past experience w/ MSI is average at best because they only offer a 1 yr. warranty.

How come their Nvidia card looks just like the XFX one?

All the cards are made by nvidia; they just flash a different BIOS onto them to make speed changes and such.
 
Un4given said:
Firing Squad did an Oblivion review with high-end cards, including the 7900GTX. The bottom line came down to the 7900GTX being faster at 1024x768 and lower resolutions with little or no AA/AF and HDR, but as soon as the eye candy was on, the X1900 series cards took the lead. In fact, there were instances of the X1800XT beating the 7900GTX when the resolution broke into the 1280x1024+ range and the eye candy was on. As such, I would say the [H] review was still quite accurate.

http://www.firingsquad.com/hardware/oblivion_high-end_performance/page2.asp

I'm not bothered in the least by the 7900 getting put down by the x1900, I just believe it would have been more balanced to present everything.
 
FrizzleFried said:
...fanATIc comment alert...

How is that a fanATIc comment? It's the truth. Brutal, honest truth. A fanATIc comment would have been something like:

OMG ATI OWNERS nVIDIA's Boxers KK!

The X1900XTX is the better card. It does have its drawbacks though: it's louder and uses a tad more power. But for all of the benefits it brings over the 7x00 series... that doesn't bother most people.

Those buying the 7900 GTX are either doing so because they're nVIDIA fans and won't try anything that's not nVIDIA (!!!!!!s) or don't like the loud stock cooler and feel uneasy changing to an aftermarket cooling solution.

In the areas of drivers, features and performance... the X1900 series wins. Now of course this is in the uber high-end market. Where things get a tad confusing is the high-end $300 market, where the 7900GT and X1800XT are currently battling it out. Each has its own highs and lows. The X1900GT should help make this a more interesting battle at the $300 mark though. Only time will tell.
 
LittleMe said:
All the cards are made by nvidia; they just flash a different BIOS onto them to make speed changes and such.

Erhm? :confused:
NVIDIA doesn't make cards?
They design the GPU core and the PCB reference design, but they don't build anything?

Terra...
 
Terra said:
Erhm? :confused:
NVIDIA doesn't make cards?
They design the GPU core and the PCB reference design, but they don't build anything?

Terra...

Ok, let me correct myself: they're all nvidia reference designs, aka nvidia doesn't physically produce the cards, but they make the reference design which almost everybody uses; thus, technically, nvidia makes the cards.
Oh, and Terra, I feel left out, where's the quick-witted response at the bottom of that post?
 
LittleMe said:
Ok, let me correct myself: they're all nvidia reference designs, aka nvidia doesn't physically produce the cards, but they make the reference design which almost everybody uses; thus, technically, nvidia makes the cards.
Oh, and Terra, I feel left out, where's the quick-witted response at the bottom of that post?

If I made a bike, and you copied the design but built it yourself, did I still technically build your bike? No. Nvidia does not manufacture cards for the mass market.
 
LittleMe said:
Ok, let me correct myself: they're all nvidia reference designs, aka nvidia doesn't physically produce the cards, but they make the reference design which almost everybody uses; thus, technically, nvidia makes the cards.

Again, they design the reference PCB, they don't produce any cards.
Is that so hard to understand?
What about my card?
7800GS Bliss 512.
Point me to the reference design that NVIDIA made for this card?!
Or are you stating that NVIDIA and not Gainward made my card? :rolleyes:

Oh, and Terra, I feel left out, where's the quick-witted response at the bottom of that post?

Cookie?

Terra...
 
Terra said:
Again, they design the reference PCB, they don't produce any cards.
Is that so hard to understand?
What about my card?
7800GS Bliss 512.
Point me to the reference design that NVIDIA made for this card?!
Or are you stating that NVIDIA and not Gainward made my card? :rolleyes:



Cookie?

Terra...

I'm sorry if I'm wrong, but I could have sworn I read somewhere that Nvidia was having the cards produced to control quality. That's why companies like BFG put those stickers on the anti-static bag to reseal it, because the card came sealed from somewhere else. If I'm wrong, I'm wrong and apologize.
 
Un4given said:
Firing Squad did an Oblivion review with high-end cards, including the 7900GTX. The bottom line came down to the 7900GTX being faster at 1024x768 and lower resolutions with little or no AA/AF and HDR, but as soon as the eye candy was on, the X1900 series cards took the lead. In fact, there were instances of the X1800XT beating the 7900GTX when the resolution broke into the 1280x1024+ range and the eye candy was on. As such, I would say the [H] review was still quite accurate.

http://www.firingsquad.com/hardware/oblivion_high-end_performance/page2.asp

The problem is many of us will likely never even play Oblivion as we're not all RPGers.

How the cards play HL2 and Quake4 is much more important to me, for example, as those are the two games I'm playing now.

Some people who post on boards like this are all too eager to use isolated games or benches to make their "points", when in actuality nVidia and ATI cards each have games they beat the other in by good margins, and their image quality is very similar for the most part.
 
5150Joker said:
Do you have any credible proof that nVidia's AA is better than ATi's? The only time nVIdia has better AA is when you are using 8xSS. When comparing Super 14xAA to nVidia's 16xAA mode (both use supersampling) they look about equivalent although hothardware had this to say:

Etc.

5150, no one can take away from the fact that ATI's Super AA with Crossfire is a great achievement.

The problem is that Crossfire itself still has a long way to go, and not many people even want it. (yourself included)

1. Power- very few PSUs on the planet can power two X1900s
2. Noise- without aftermarket cooling*, two of those little dustbusters would probably be louder than the 5800U (*voids warranty)
3. Heat- while they exhaust out the back, you can't tell me having two cards dissipating 120W at peak doesn't heat up your case
4. No one has Crossfire motherboards; we're all among the millions who have SLI motherboards. Speaking of Crossfire motherboards: Southbridge=meh
5. Drivers- absolutely no control of the render method; it defaults to non-geometry-scaling tiling. Even the ATI faithful at B3D hate this.
6. Dongles- no one likes them.
7. Flashing and render errors- reported here and elsewhere

Dude, you want to say X1900s are arguably better than 7900s, you'll get no argument from me. Each is a good solution with pros and cons, and that is an understandable viewpoint.
Crossfire is nowhere NEAR as good as SLI yet, and having great AA doesn't outweigh the above. Very, very few people have Crossfire- many, many people have high end SLI. There's a reason for that, and a reason you don't own it either when you can easily afford it? ;)
 
Rollo said:
5150, no one can take away from the fact that ATI's Super AA with Crossfire is a great achievement.

The problem is that Crossfire itself still has a long way to go, and not many people even want it. (yourself included)

1. Power- very few PSUs on the planet can power two X1900s
2. Noise- without aftermarket cooling*, two of those little dustbusters would probably be louder than the 5800U (*voids warranty)
3. Heat- while they exhaust out the back, you can't tell me having two cards dissipating 120W at peak doesn't heat up your case
4. No one has Crossfire motherboards; we're all among the millions who have SLI motherboards. Speaking of Crossfire motherboards: Southbridge=meh
5. Drivers- absolutely no control of the render method; it defaults to non-geometry-scaling tiling. Even the ATI faithful at B3D hate this.
6. Dongles- no one likes them.
7. Flashing and render errors- reported here and elsewhere

Dude, you want to say X1900s are arguably better than 7900s, you'll get no argument from me. Each is a good solution with pros and cons, and that is an understandable viewpoint.
Crossfire is nowhere NEAR as good as SLI yet, and having great AA doesn't outweigh the above. Very, very few people have Crossfire- many, many people have high end SLI. There's a reason for that, and a reason you don't own it either when you can easily afford it? ;)

Crossfire

I won't argue against the fact that few have Crossfire, but many have SLI?? Both are novelty setups. Dual-card computers represent a tiny percentage of gaming rigs. That's the way it should always be, in my opinion. Although I prefer gaming on a computer, spending $1,000+ just on video cards alone is dumb considering you can get an Xbox for a fraction of the cost.

Oh, and by the way, if it were the other way around, with the 7900 offering the best performance in Oblivion, you would go on and on about it like a broken record. :eek:
 
We're all taking this thread too far and are waaay off topic. This thread is supposed to discuss the review of the MSI cards. I myself have gotten way off topic in here; maybe we should stop before it gets locked.
 