LOL! Matrox G450s still in stock!?!? .99 after MIR!

I'm almost embarrassed to say this, but I actually bought 2 of these for my job so I can run 2 monitors on the ancient hardware I have to use here.

Now, I'm the envy of the antique office.

QM
 
sethmo said:
"Seems that page you have requested is either out of date or no longer exists.
We want to help you find exactly what you are looking for. Below, you may search our site directly or visit one of our categories, conveniently located on the left side of this page. Thank you for visiting TigerDirect.com."

Dead link? I'm interested in seeing this haha :)

[Edit]
Does this link work:
http://www.tigerdirect.com/applications/searchtools/item-Details.asp?EdpNo=1755113&sku=TC3H-1041

You got it.

It was that damn truncation that vBulletin does. I edited it and left the "..." where part of the URL was supposed to go. :p
 
quasimodem said:
I'm almost embarrassed to say this, but I actually bought 2 of these for my job so I can run 2 monitors on the ancient hardware I have to use here.

Now, I'm the envy of the antique office.

QM

I bought one too :(
 
Oh man!

I'm out of town next week, otherwise I would jump on it. If they're still in stock when I get back, I'm going to get one.

Bookmarked!
 
annaconda said:
Umm, interesting. Only 16MB? You probably cannot even run Windows XP.
These have excellent driver support under XP. I'm running a G250 with 8 megs of RAM right now in an older workstation. The performance of these is just fine for Windows apps, though you quite obviously won't be running any gaming titles on them.
 
I used to have one of these in a F@H rig. Came in handy when my sister's onboard video fritzed out in her comp.
 
annaconda said:
Umm, interesting. Only 16MB? You probably cannot even run Windows XP.

I don't see why not. The card can handle resolutions over 1600x1200. The only reason for more memory is 3D. When it comes to 2D graphics, there's been no need for more memory in years (since this card came out), and I suspect the G450 is one of the best 2D solutions available, because that was (is?) Matrox's focus, while Nvidia and ATI are all about 3D performance.
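
A quick back-of-the-envelope supports that. These are just my own illustrative numbers, assuming 32-bit color, not anything from Matrox's specs:

    # Framebuffer memory needed for a 2D desktop (illustrative sketch)
    width, height = 1600, 1200
    bytes_per_pixel = 4  # 32-bit color
    frame = width * height * bytes_per_pixel
    print(frame / 2**20)      # ~7.3 MiB for one screen
    print(2 * frame / 2**20)  # ~14.6 MiB for both heads, still under 16MB

So even driving both heads at 1600x1200, the 2D desktop fits comfortably in 16MB. The extra memory on modern cards is for 3D textures and buffers.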
 
nilepez said:
I don't see why not. The card can handle resolutions over 1600x1200. The only reason for more memory is 3D. When it comes to 2D graphics, there's been no need for more memory in years (since this card came out), and I suspect the G450 is one of the best 2D solutions available, because that was (is?) Matrox's focus, while Nvidia and ATI are all about 3D performance.
I'd agree with everything you said, except the last part. A DVI port delivers a much higher quality signal than any analog VGA port. If you are talking about analog only, then this is probably one of the highest quality solutions available. Otherwise DVI rules.
 
dekard said:
I'd agree with everything you said, except the last part. A DVI port delivers a much higher quality signal than any analog VGA port. If you are talking about analog only, then this is probably one of the highest quality solutions available. Otherwise DVI rules.
The difference between DVI and D-sub is negligible, unless you are comparing high-end DVI-input monitors with low-end, low-cost D-sub video cards.
 
Richteralan said:
The difference between DVI and D-sub is negligible, unless you are comparing high-end DVI-input monitors with low-end, low-cost D-sub video cards.
On any recent LCD with DVI, the picture quality improvement over D-sub (analog VGA) is very obvious. You must be using some small or old monitors to not have noticed this difference. On anything I've looked at, 17" and larger, you can easily see the difference.
 
dekard said:
On any recent LCD with DVI, the picture quality improvement over D-sub (analog VGA) is very obvious. You must be using some small or old monitors to not have noticed this difference. On anything I've looked at, 17" and larger, you can easily see the difference.

Most people I know in imaging/2D design/3D rendering don't use LCDs.
 
Nomad said:
What is that power cord doing feeding under the passive heatsink?

It's ground, not power. Keeps the heatsink grounded. The white connector on the card is likely for a variation of this card that had an active heatsink (PWR, GND).

I used to have one of these for a development job I had. At the time, 2 monitors from one card was still a big deal. This one allows 1600x1200 across 2 monitors; the old G400 only allowed 1280x1024.

Nice card for 2D.
 
Spectre said:
Most people I know in imaging/2D design/3D rendering don't use LCDs.
Agreed, there are a lot of people still using CRTs in some industries. Having said that, even on a CRT you'd notice all sorts of visual artifacts from pushing a high-resolution signal down an analog cable. DVI as a signal interface has better quality than analog D-sub.
 
I am currently running my third monitor on an ATI Rage! 4MB of glory. Too bad these aren't PCI. I could use one to replace this Rage; the image quality is sh!t.
 
dekard said:
Agreed, there are a lot of people still using CRTs in some industries. Having said that, even on a CRT you'd notice all sorts of visual artifacts from pushing a high-resolution signal down an analog cable. DVI as a signal interface has better quality than analog D-sub.

I notice no difference between VGA and DVI on my Viewsonic N2750W LCD TV, and it's a 27" screen. I think the DVI to VGA comparison is very overblown for the most part.

All the research you will find will tell you that it is theoretically better to use DVI with an LCD if possible, but you also will not find substantial evidence saying that DVI is better. Converters built into LCD monitors are so good nowadays that most people could not tell a VGA from a DVI connection on any particular monitor. The main advantage that DVI has, if any, is the fact that it is a "pure digital signal", so it is far less susceptible to electromagnetic interference.

And that's the facts.
 
So how does this compare to the Nvidia TNT2? I think the 3-year-old Sony PC needs a new GFX card.
 
NoZoL said:
I notice no difference between VGA and DVI on my Viewsonic N2750W LCD TV.

The main advantage that DVI has, if any, is the fact that it is a "pure digital signal", so it is far less susceptible to electromagnetic interference.

And that's the facts.
1: Your TV has a lower resolution than what I'm referring to. I found it painfully obvious at anything more than 1280x1024; 1600x1200 or 1680x1050 shows the EMI clearly.

2: Whatever the reason, there's a difference in signal quality. DVI = better quality on the desktop for most users.
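
To put a rough number on why the high resolutions show it: the analog signal's pixel clock gets very fast up there, so any noise on the cable lands right in the picture. A quick sketch (the timing totals are the standard VESA numbers for 1600x1200@60Hz; the code itself is just my illustration):

    # Approximate analog pixel clock at 1600x1200@60Hz (illustrative sketch)
    h_total, v_total = 2160, 1250  # active pixels plus blanking, per the VESA mode
    refresh_hz = 60
    pixel_clock = h_total * v_total * refresh_hz
    print(pixel_clock / 1e6)  # ~162 MHz -- every pixel is an analog voltage swing at that rate

DVI sends the same pixels as digital bits, so small interference that would smear an analog level just gets thresholded away.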
 
dekard said:
I'd agree with everything you said, except the last part. A DVI port delivers a much higher quality signal than any analog VGA port. If you are talking about analog only, then this is probably one of the highest quality solutions available. Otherwise DVI rules.

Anyone that had the 450 and was serious about their video quality didn't use a VGA cable. I don't remember what they were called, but it had 5 or 6 separate connectors that went into the monitor, and it was a noticeable difference in quality. Can't comment on DVI, since the only LCD with a wide gamut, AFAIK, is the EIZO, and I'm not spending a few grand on a monitor.
 
nilepez said:
Anyone that had the 450 and was serious about their video quality didn't use a VGA cable. I don't remember what they were called, but it had 5 or 6 separate connectors that went into the monitor, and it was a noticeable difference in quality. Can't comment on DVI, since the only LCD with a wide gamut, AFAIK, is the EIZO, and I'm not spending a few grand on a monitor.
Isn't that cable related to composite video?
 
Azkarr said:
So how does this compare to the Nvidia TNT2? I think the 3-year-old Sony PC needs a new GFX card.

For 2D it's better, but for 3D I believe the TNT2 was better, though I don't actually recall which card came first.
 
dekard said:
Isn't that cable related to composite video?

That may be correct. My old Viewsonic could use it, but when it died, I ended up with a refurbed E540, which only takes VGA. Not a great monitor, but it'll work until LCDs with gamuts as wide as CRTs become mainstream... and with a little luck they'll be relatively fast too.
 