My Gaming Rig....OLD?? Already??

SAW

[H]ard|Gawd
Joined
Nov 13, 2005
Messages
1,467
Out of curiosity, I started looking thru Newegg at the different computer accessories....you know, just browsing around.

I was scoping out the vid cards when I decided to check mine out. To my surprise, it's not even offered anymore and cards that are similar are DIRT FREAKIN CHEAP!!!!

So, I started lookin at all the other components I have and again, ALL are just absurdly cheap.

It hasn't even been a full year yet since I built my rig and already, it seems that my computer is on the low end of the spectrum.

I know that with computers, "better and faster" is always around the corner, but DAMN!!! It just seems so quick.

I was able to run BioShock at full settings without any problems....Aside from games that "require" DX10, just how long will it really be before I can't use my current rig for gaming? I feel as though I've jumped computer-ship when land is still in sight.

Is my gaming rig really that "old/bad" already?
 
I'm in the same boat as you. I may upgrade my mobo/cpu/ram next year, but for now my PC will have to do.
 
It will probably be 2009 at the earliest for any games to require DX10. Aside from Crysis (and maybe WiC, if you haven't already tried), you probably won't even have any trouble with upcoming titles at acceptable detail levels considering your monitor. Your card came out ~1.5 years ago IIRC; I don't think "Already??" is really appropriate.
 
If it does what you want then why does it matter if it is old? You'll probably have to upgrade it in November if you wish to have all the eyecandy but I would like to think it would still be capable of medium settings.
 
Well, it's not so much about the eyecandy...per se.

This is my very first build....ever. I spent a few months gathering the parts, so yeah, "already" doesn't quite fit, depending on perspective.

From my perspective, it was just !!!shocking!!! to see what was, at the time, a "near" top of the line graphics card. I believe I paid near $350 for it and was damn proud to have it.

Here it is, nearly a year later (give or take) and I can get a 7900 series for around $99 (give or take).

For lack of a better word....It's a slap in the face....kinda.

A 3700 San Diego....That was a damn good CPU (not saying it isn't now, just sayin, heh). Now...I can't find a 3700. The only other s939 I can see is a 4000+. So I'm thinking...And to top that off, the 4000+ is at around $50-$60...WTF!!!!! Man, I paid about $185 for mine....Heh, again.....a slap in the face.

This being my first custom rig, I guess I wasn't quite prepared for just how fast things change in this "game". As I said earlier, I'm fully aware computers evolve and get faster, but DAMNIT.....It was just soo sooo quick.....Gahhh

I guess I'm in shock!!!
 
You paid $350 for a 7950GT KO? Seems like a lot to me. I got a GTX for that price over a year ago on eBay.
 
Well, the good news is you're not alone, so really it's just the norm. OMFG, I had to pay a shitload in taxes this year! Well, not to be an ass, but...you see where I'm going.

It sucks balls. You really did get kicked in the nuts. You were at the bottom of the last-gen curve. It happens.
 
Never buy the newest card; cards will drop to an acceptable price a few months after release. The newest GTX or w/e can just show the world how cool you are for a couple months until ppl get it for half the price.
 
Never buy the newest card; cards will drop to an acceptable price a few months after release. The newest GTX or w/e can just show the world how cool you are for a couple months until ppl get it for half the price.

You would not make a good day trader...

When you bought your 7950GT (G71) it was already obsolete, and I really quite loathe that card.

It's an excellent card, to be sure: 24 pipes, 16 ROPs and 8 vertex units, all clocked at a cool 550MHz core.

But for $350, it was unacceptable. It was getting its backside handed to it on a silver platter by the much more competitively priced X1900 XT 256MB (R580), and later, even more so, by the X1950 XT (R580+), which went for $70-$120 less. The X1950 XT's shader count (48, 3 strapped to each of its 16 pipes) means it will stand the test of time. I'm still recommending/selling the X1950s because they're as cheap as the 8600GTS and outperform it by a fair margin.

But ATI's offerings just didn't seem to sell! Every person I saw who was a candidate for either of those cards went with the 7950GT. I really wish ATI had the same marketing war machine Nvidia did. When they were benchmarked against each other, it was the ol' D3D for ATi and OpenGL for nV. The difference was always < 5 FPS, with the exception of Oblivion (which heavily favoured ATi due to its heavy use of shaders) and Lock On: Modern Air Combat (due to immensely high-res textures). HDR took way more of a toll on the G71 than it did on the R580(+), and considering more and more games are making HDR a big part of the game, the ability to run HDR without any toll was becoming important. To add to that, the G71 can't run 128-bit accurate HDR with antialiasing.
 
I just upgraded my ATI X700 Pro to an XFX 7900GS about 3 months ago because the price of the 7900GS came under $100, and that is the only time I upgrade! haha. It runs everything at native resolution on my 2405FPW, and I get 40 fps in CS:S with everything turned up.

The next upgrade I'll be doing is around Xmas: I'll be adding another gig stick of RAM to give me a total of 2 gigs with my P4 3GHz OC'ed to 3.8GHz.

After that I will not upgrade again, I'll just build a new computer.
 
Found my receipts from Newegg...

I guess I exaggerated the price I paid for my 7900 (well, it has been a year, bound to forget the exact dollar).

My 7900GT KO was $289...with tax it came to $293.65....Date purchased was 5/13/06.

My CPU cost $212 (tax included) plus $0.99 for shipping. Date purchased was 4/8/06.

Just to clear up a few things :D

I still can't believe it though. I can get a 4000+ CPU for about $60 now and $112 for a 7900 series....It's just amazing how quickly the prices go down (well, quick to me anyways).

I'm going to say, all said and done, I spent somewhere around $1000 to $1200 on my rig....at the time....It's a damn good rig too, IMO anyways. But now, heh, I can go out and buy all of the exact same components for my rig, make a mirror copy of what I have, and probably spend less than $500 for it......
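For the curious, here's a quick back-of-the-envelope sketch of the depreciation those receipts imply. The prices come from this thread; the ~1.33-year interval between purchase and "now" is an assumption, and the constant-rate model is just an illustration, not how component prices actually move:

```python
# Rough yearly depreciation implied by the prices quoted in this thread.
# Assumes ~16 months (1.33 years) between purchase and today's prices,
# and a constant exponential decay (a simplification).

def annual_depreciation(old_price, new_price, years):
    """Implied constant yearly depreciation rate, as a fraction."""
    return 1 - (new_price / old_price) ** (1 / years)

parts = {
    "7900GT KO": (289.0, 112.0),       # bought 5/13/06; ~$112 for a 7900 now
    "3700+ San Diego": (212.0, 60.0),  # bought 4/8/06; ~$60 for a 4000+ now
}

for name, (old, new) in parts.items():
    rate = annual_depreciation(old, new, years=1.33)
    print(f"{name}: roughly {rate:.0%} of its value lost per year")
```

Both parts work out to losing over half their value per year at that rate, which lines up with the "buy the whole rig again for under $500" observation.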

In my eyes....heh, it just seems like a slap in the face.....I'm not really complaining.....no, really, I'm not....I'm just in total SHOCK!!!!
 
Just a thought, but why not be a year behind all of the hardware/games?

What I mean is instead of buying all of the hardware right now necessary to run Crysis, why not wait a year or a year and a half and then buy the game/hardware for half price?

Hell you could even be two years behind and get everything for dirt cheap.
 
Well, it's not so much about the eyecandy...per se.

This is my very first build....ever. I spent a few months gathering the parts, so yeah, "already" doesn't quite fit, depending on perspective.

From my perspective, it was just !!!shocking!!! to see what was, at the time, a "near" top of the line graphics card. I believe I paid near $350 for it and was damn proud to have it.

Here it is, nearly a year later (give or take) and I can get a 7900 series for around $99 (give or take).

For lack of a better word....It's a slap in the face....kinda.

A 3700 San Diego....That was a damn good CPU (not saying it isn't now, just sayin, heh). Now...I can't find a 3700. The only other s939 I can see is a 4000+. So I'm thinking...And to top that off, the 4000+ is at around $50-$60...WTF!!!!! Man, I paid about $185 for mine....Heh, again.....a slap in the face.

This being my first custom rig, I guess I wasn't quite prepared for just how fast things change in this "game". As I said earlier, I'm fully aware computers evolve and get faster, but DAMNIT.....It was just soo sooo quick.....Gahhh

I guess I'm in shock!!!

If you stay up to date on when new architectures are coming out, you won't have this problem.

You see, when you bought this stuff about 1.5-2 years ago you paid a decent price. About 1 year ago Conroe came out and rewrote all the performance benchmarks; thus AMD and Intel went into a price war, hence why everything is so cheap now.

Now, AMD is releasing Barcelona/Phenom and early results aren't looking that good: ~5% behind C2D at the same clock speed, overall. However, we have yet to see further testing and the scaling capabilities of Barcelona/Phenom.

This should lead to further price drops: because both C2D and Barcelona/Phenom are nearly equal in performance, the only other avenue to compete on is price.

Given the current overall performance, with AMD's next platform in Q4 2008, AM3/Shanghai, we could see another 5-10%+ boost from switching to DDR3, 45nm, scaling, and other optimizations (a guess). We won't see a really big boost in performance from AMD until Fusion aka Bulldozer in 2009.

However, on the other side of things, Intel already has a 5% performance edge over AMD, with early results from Penryn/Yorkfield showing another 5% performance gain (not including scaling, which could make this total higher) when released in early 2008 for desktops. All in all, Intel and AMD should be on equal standing on performance in 2008, just at different times. In 2009 we should see another big performance jump when Intel releases their Nehalem architecture.

So in summary, both AMD and Intel have minor revisions coming out in 2008 with minor performance gains, while in 2009 we should see big jumps in performance again from both companies. Which will be better? Only time will tell. Overall, prices will drop, most likely until Fusion and Nehalem arrive, at which point the one with the bigger performance gain could charge a premium for that increase in performance.
 
17" Acer LCD, SB2 Audigy 5.1,
Gigabyte Pro SLI Mobo, 3700 SanDiego CPU,
eVGA 7900gt ko, 2g of Patriots

A year ago those weren't high-end specs, so they're sure to be outdated by now.
On a 17" you should be fine for a little while, since you shouldn't even be playing at very high resolutions. As far as DX10 is concerned, it's not like MS releases a new version every year; it just so happened that a new OS came out. Graphics cards are probably the fastest to get outdated; heck, the 8 series has been out for about half a year now and people are already waiting for the 9 series to come out (myself included). When you have games like Crysis that actually take advantage of the hardware, then why not upgrade? :)
 
Never buy the newest card, cards will drop to an acceptable price few months after release, newest GTX or w/e can just show the world how cool you are for couple months until ppl get it with half the price

So why aren't 8800GTX cards going for $250?
 
17" Acer LCD, SB2 Audigy 5.1,
Gigabyte Pro SLI Mobo, 3700 SanDiego CPU,
eVGA 7900gt ko, 2g of Patriots

A year ago those weren't high-end specs, so they're sure to be outdated by now.
On a 17" you should be fine for a little while, since you shouldn't even be playing at very high resolutions. As far as DX10 is concerned, it's not like MS releases a new version every year; it just so happened that a new OS came out. Graphics cards are probably the fastest to get outdated; heck, the 8 series has been out for about half a year now and people are already waiting for the 9 series to come out (myself included). When you have games like Crysis that actually take advantage of the hardware, then why not upgrade? :)

No, the 8800 came out over a year ago, and GL waiting while I hug my card :)
 
Just a thought, but why not be a year behind all of the hardware/games?

What I mean is instead of buying all of the hardware right now necessary to run Crysis, why not wait a year or a year and a half and then buy the game/hardware for half price?

Hell you could even be two years behind and get everything for dirt cheap.

I agree with ya on the one-year bit, not so much the two-year comment. I actually try to be about 6 months behind: not only do you benefit from price drops, but the drivers have matured a bit, and you can really pick and choose what hardware you want based on 6 months of run time, not just a review of hardware that's not on the shelf yet.
 
Just a thought, but why not be a year behind all of the hardware/games?

What I mean is instead of buying all of the hardware right now necessary to run Crysis, why not wait a year or a year and a half and then buy the game/hardware for half price?

Hell you could even be two years behind and get everything for dirt cheap.

What's the point if you want to play all of your games at max settings NOW? Personally, I don't want to wait a year or more to play Crysis at max settings when I can go out now and buy the parts that can do that.
 
I believe I have given the wrong impression here, heh.

I'm not looking to upgrade or anything....I was just browsing around.

The focus/point of my post, basically, was how shocking it was to me to see how cheap these components got so quickly.

I'm behind the curve on what's hot and what's the latest greatest thing. Up till a week ago, I rarely had time to come in here and read what's going on. I hate being behind the curve, trying to play catch-up. I built my rig to last me a few years before having to upgrade anything. I figure I have at least till 2009 before actually requiring an upgrade.

But, at least now, I know: just because it costs $60 doesn't mean it's a POS. Usually, anyways, heh.

Well, actually, I was toying with the notion of getting a bigger monitor; those things are wayyyyy cheaper than they were a year ago....Anyone have a good suggestion for a larger monitor?
 
At least your rig still has quite some upgrade room left in it; pity the people who were still buying AGP/Pentium 4 systems last year!

You should be able to easily throw a dual-core Socket 939 chip in that before they finally disappear completely, and a video card upgrade would be easy, too. And those are solid upgrades: it'll be at least another year (or more) before the Athlon64 FX-60/Opteron 185 are actually feeling 'long in the tooth', and we will have PCI-E 16x cards coming out for a while to come.
 
Yea, my system is starting to feel its age, but my wallet is still hurting.

Main reason why I'm starting to shift from PC gaming to console gaming. Still use my PC for CS 1.6 and CS:S, maybe a few other games, but for the most part, I just can't afford to keep upgrading the damn thing.

Guess I'm going back to my roots, back to console gaming *what I grew up on*
 
I'm already hoping for a new flagship card from nvidia in November. I waited on building a system last year until November when the 8800GTX and 680i boards were released. Now the 8800GTX just isn't cutting it... already....and SLI is just too dodgy in Vista-64 to even contemplate. It doesn't even work in a lot of the games that are worth playing.

It's been less than a year and the flagship card has been struggling for a few months, especially in the DX10 titles (which look especially good). I don't want to be playing the CoH expansion in DX10 at 1680x1050 come November on a 30" screen that goes up to 2560x1600.

I agree that it all seems to be moving faster than it was a few years ago.
 
I'm already hoping for a new flagship card from nvidia in November. I waited on building a system last year until November when the 8800GTX and 680i boards were released. Now the 8800GTX just isn't cutting it... already....and SLI is just too dodgy in Vista-64 to even contemplate. It doesn't even work in a lot of the games that are worth playing.

It's been less than a year and the flagship card has been struggling for a few months, especially in the DX10 titles (which look especially good). I don't want to be playing the CoH expansion in DX10 at 1680x1050 come November on a 30" screen that goes up to 2560x1600.

I agree that it all seems to be moving faster than it was a few years ago.

A few years ago, 1600x1200 was considered a high resolution. Now you're playing at 2560x1600. Don't really think you can complain that you have to upgrade your graphics card every year. This is why I refuse to get a high-resolution monitor. 1280x1024: perfect resolution for my eyes and wallet :D
 
A few years ago, 1600x1200 was considered a high resolution. Now you're playing at 2560x1600. Don't really think you can complain that you have to upgrade your graphics card every year. This is why I refuse to get a high-resolution monitor. 1280x1024: perfect resolution for my eyes and wallet :D

It's not that I have to upgrade regularly that bothers me. I knew that when I pulled the trigger on a 30" screen and got used to playing at 2560x1600. It's the fact that I have to downgrade the resolution before the next cards are even available. It's also worth noting that consoles are running stuff at 1920x1080 now, so the resolution bar has been shoved up significantly in the space of a year or two.
 
Well, actually, I was toying with the notion of getting a bigger monitor; those things are wayyyyy cheaper than they were a year ago....Anyone have a good suggestion for a larger monitor?

I know a number of friends who own a 22" Sceptre because they got it cheap; you might want to look at those.
 
A few years ago, 1600x1200 was considered a high resolution. Now you're playing at 2560x1600. Don't really think you can complain that you have to upgrade your graphics card every year. This is why I refuse to get a high-resolution monitor. 1280x1024: perfect resolution for my eyes and wallet :D

LOL, I have a 21" CRT monitor and run my games at 1280x1024 and see no problem with that. It's actually the best res, it seems; my monitor can do 1600x1200, but most OSD text gets too small to read and the screen darkens a lil bit.

My system can be had for CHEAP; I didn't build it with all new parts (exception of one GFX card, HDDs, DVD burner, case, power supplies). But the processor's price has been slashed by more than half in the last year since S939 died. I have no reason to upgrade; this system is still pretty good at everything I need.

The thing is, you have to learn to build when items can be had cheap; almost all my PC games are bought second-hand for cheap after they've been out for a while. Since I don't try to always have the newest games, it lets me spend a while playing the older games with cheaper older/used hardware. After 10+ years of PC gaming you learn to build smart, not build new, unless you're super rich :rolleyes:

EDIT: I have tried BioShock and it plays great on the system I'm using.
 
After about 17 years of PC gaming, I'm finally getting sick of it.
I feel similar, but all that's changed for me, at least, is that the need to upgrade comes sooner because the native res of LCDs requires more horsepower. Can't just run at 800x600 or 1024x768 any longer :( The way games look in widescreen is almost worth the trouble though.

Luckily, holding off on picking up new games until better hardware deals come along pays off too, because games eventually find their way into bargain bins.
 
LoL...I bought my 7800GTX ($450 I think) on launch month and my 3700+ at the same time for about $350. It is still a solid machine. I might even overclock it one of these days. :p
 
I feel similar, but all that's changed for me, at least, is that the need to upgrade comes sooner because the native res of LCDs requires more horsepower. Can't just run at 800x600 or 1024x768 any longer :( The way games look in widescreen is almost worth the trouble though.

Luckily, holding off on picking up new games until better hardware deals come along pays off too, because games eventually find their way into bargain bins.

You know, 1440x900 is widescreen, too. :D

FWIW, I stick to the 'smaller' LCD screens for that reason. I DO want to play the 'latest and greatest' games with all in-game settings maxed at native res...but to do that on a 1680x1050 panel (or worse; I can't imagine what 1920x1200 users must sacrifice to get that res to work!) would require too expensive an upgrade each time a new game comes out.

At 1280x1024, or 1440x900, you only have to buy the 'midrange' parts to crank the game's in-game settings and still run at native res.

...and that's kinda nice.
 
A few years ago, 1600x1200 was considered a high resolution. Now you're playing at 2560x1600. Don't really think you can complain that you have to upgrade your graphics card every year. This is why I refuse to get a high-resolution monitor. 1280x1024: perfect resolution for my eyes and wallet :D

That's the highest my monitor will go and, like you, heh, I'm still droolin over the graphics I see. IMO, when it comes to games, graphics are definitely nice to look at, but I didn't buy a game to look at it, I bought it to play. That being said, when I'm playing a game and in the heat of the action, I'm not stopping to count the individual leaves (although I could, heh). I wouldn't buy a bigger monitor to raise the resolution; I would buy it simply for a larger picture. But then, I guess, that's the catch: the larger the screen, the higher the res you'd want to use. I'd say a 22" would be plenty for me.
 
It's also worth noting that consoles are running stuff at 1920x1080 now so the resolution bar has been shoved up significantly in the space of a year or two.

About that....for "today's" console gaming to really be worth it, you need one of them fancy-schmancy TVs that can hang on a wall, heh.

A buddy of mine rants and raves about his 360 (not saying they're bad in the least, let's not go there :eek:), but the guy knows nothing about computers. One of the few people I know who still does not own one. He plays his games on a huge 51" TV....you know, the kind that breaks your back getting it up the stairs and is so large and heavy it sits on the floor, and gawd forbid you're trying to watch the thing from a side view.

I keep tellin the guy, "Guy, you need a better TV to really appreciate the games you play." He just scoffs at me and says he will own that sucker in 4 more payments. Heh, he went to a rent-to-own store.....

Ummm....nevermind...
 
You know, 1440x900 is widescreen, too. :D

FWIW, I stick to the 'smaller' LCD screens for that reason. I DO want to play the 'latest and greatest' games with all in-game settings maxed at native res...but to do that on a 1680x1050 panel (or worse; I can't imagine what 1920x1200 users must sacrifice to get that res to work!) would require too expensive an upgrade each time a new game comes out.

At 1280x1024, or 1440x900, you only have to buy the 'midrange' parts to crank the game's in-game settings and still run at native res.

...and that's kinda nice.
Most certainly, and it does look all right for being scaled, but 1440x900 still runs BioShock and MOH:A like a flat tire (and I assume the torrent of UE3-based games; I won't even talk about Crysis, it makes my video card cry). To make a noticeable difference, I would need an 8800GTS 320 minimum, which is still near its Q1, pre-R600 price. If the 8700GTS holds up well in shader-heavy games with its 64 stream processors and drops below $200 for the 512MB, I may consider it. For now it's HL2 and Far Cry for me.
 
You know, 1440x900 is widescreen, too. :D

FWIW, I stick to the 'smaller' LCD screens for that reason. I DO want to play the 'latest and greatest' games with all in-game settings maxed at native res...but to do that on a 1680x1050 panel (or worse; I can't imagine what 1920x1200 users must sacrifice to get that res to work!) would require too expensive an upgrade each time a new game comes out.

At 1280x1024, or 1440x900, you only have to buy the 'midrange' parts to crank the game's in-game settings and still run at native res.

...and that's kinda nice.

1920x1200 isn't quite as insane to run as people make out. System is in sig; I've managed to play all my games at native res with most of the settings cranked high (apart from AA, I can't see the difference).

Sure, I haven't tried most of the newer games (BioShock, MOH: Airborne), but the ET:QW beta ran fine, and VT3, Prey and HL2:EP1 all run at 50-60 fps when cranked full.

Don't think I'm going to be able to play Crysis like this for one minute though :p
 
Just a thought, but why not be a year behind all of the hardware/games?

What I mean is instead of buying all of the hardware right now necessary to run Crysis, why not wait a year or a year and a half and then buy the game/hardware for half price?

Hell you could even be two years behind and get everything for dirt cheap.
Best advice going, well said. I was always on the latest and greatest, but not since Vista + GPU + DX10...it's totally stopped me from gaming.
 
G'ßöö said:
Best advice going, well said. I was always on the latest and greatest, but not since Vista + GPU + DX10...it's totally stopped me from gaming.

You don't even have to go a year and a half....

But this is what I do. I'm still playing HL2 and some older games from that age; I'll be upgrading in November, depending on the price drops of GPUs, to something better, along with memory and a mobo etc.

PC gaming is quite a bit cheaper than people realize if you avoid the bleeding edge. I know people with cheapass eMachines that play Medal of Honor.... hahaha; for all they seem to care, they love it.
 
I know people with cheapass eMachines that play Medal of Honor.... hahaha; for all they seem to care, they love it.

And that's all that really matters.

It's been said time and time again in this thread that you don't have to have the bleeding edge to be able to play games that are presently in release. You really don't need bleeding edge for next gen unless you want a ridiculously high resolution with all the eye candy.

Until my current build, I NEVER turned on the eye candy because I wanted to get as much performance out of my system as I could. I guess that goes back to when I played CS and Quake 3 competitively and you dumbed down and tweaked the config to the point where the graphics were grainy, and in the case of Quake 3 your opponents were bright green and pink blotches of squares, LOL. You also played at either 640x480 or 800x600. I've just now moved my resolution up to 1024x768 in CS:S.

I've only recently upgraded to the X2 6000 because of the big recent price drops, and even then $169 is the most I've ever paid for a processor. Everyone else is screaming about how cheap the quad cores are; heh, I'll wait and buy one when they're around $150. Not like I even utilize half of what my current CPU can handle, since I'm still ingrained in the single-core mindset of being patient and avoiding multitasking.

I'll buy an 8800GTS or GTX off the forums here when the new series comes out in November and the last-gen price drops start. My current card plays everything I play fine. No need to waste money and regret it in 3 months when the new cards roll out.
 
OP, I'm running an OC'd C2D E6300 at 3.15GHz with a Radeon X1800 XT PE video card and 2 gigs of RAM, and MY system is on the lower end of the higher-end spectrum, relatively speaking........ I've ONLY got 2 cores, that's so 2006/early 2007 :D But guess what? It does everything I want it to (except run DiRT at high res with full eye candy, better upgrade my vidcard) perfectly fine! And it's faster than running a quad-core at a lower clockspeed, at least at most things............. These things are obsolete before ya even build 'em, it seems, but whatcha gonna do?

EDIT: not that you said you wanted to, but if you at least wanted to go dual-core, you could get this:

http://www.newegg.com/Product/Product.aspx?Item=N82E16819103053

not a bad upgrade for $72.50 shipped

EDIT2: just looked at my order history; my A64 3200+ Clawhammer was $282 on 1/30/04 (that was the old s754 chip, the first A64 out). My San Diego 3700+ was $267.00 on 10/15/05, and my 4400+ X2 was $497.00 on 11/25/05........ and all of those have gone by the wayside.........
 