Any way to force a GeForce FX to run HL2 in DX9 mode?

dderidex

In "Far Cry" there were parameters you could use, but I don't know of anything for HL2.

From FiringSquad's article:
DirectX 7 path
Half-Life 2’s DirectX 7 path covers NVIDIA’s GeForce 256, GeForce2 series, GeForce2/4 MX, and nForce, as well as the RADEON 7xxx series (with the RADEON 9100 IGP and MOBILITY RADEON 9000/9100 defaulting to the DirectX 7 path as well).
  • Screen-space effects are really simple.
  • No model decals
  • No detail props
  • No refractive water
  • Reduced decal visibility distance
  • No bumpmaps
  • Reduced model LODs
  • Reduced material mip level

DirectX 8 path
Among the cards included in the DirectX 8 path are the GeForce4 Titanium series (including GeForce4 Go), GeForce FX 5200/5600/5700, and GeForce FX Go 5600/5700 series.
  • On some cards with poor fillrate, bumpmaps may be turned off in some scenes that use a lot of bumpmaps. At the moment this is true for the GeForce4 Ti 4200, but we've worked with NVIDIA to come up with a solution to reactivate bumpmaps on the 4200.
  • Water by default is refractive but does not have local specular.
  • Water has a hard edge where it meets the shore; volumetric fog is used for this. Per-vertex screen-space effects are better than in DirectX 7, but still simple.
  • Shadows are render-to-texture but are not supersampled to make them look softer.

DirectX 8.1 path
Cards that use the DirectX 8.1 path include the RADEON 8500/9100, RADEON 9000/9200, and GeForce FX 5800/5900.
  • Water by default is refractive but does not have local reflections. If you activate local reflections on these cards, they occur in one pass as opposed to two passes on DX 8.0, which makes them cheaper from a fillrate perspective.
  • As in DX 8.0, water has a hard edge where it meets the shore; volumetric fog is used for water. Per-vertex screen-space effects are better than in DX7, but still simple.
  • Shadows are render-to-texture and are supersampled to make them look softer.

DirectX 9 path
Graphics cards that use the DX9 path include the GeForce 6800 series, GeForce 6600 series, RADEON 9500/9600, RADEON X300/X600 series, RADEON X800 series, RADEON 9700/9800, MOBILITY RADEON 9600/9700, and MOBILITY RADEON 9800.
  • Water by default is refractive with local reflections from world geometry.
  • In DX9, water refraction realistically refracts the geometry beneath the water (when looking into the water) based on the depth of that geometry.
  • There is a special water rendering feature that smooths out shorelines by reducing water refraction in areas of shallow water.
  • There is a gradual blend from water to shore. Volumetric fog is used for water, and per-pixel screen-space effects (post effects) are more complex.
  • Shadows are render-to-texture and are supersampled to make them look softer.
  • Certain displacements use blended bumpmaps instead of a single bumpmap (for example, displacements that blend between sand and rocks).
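
(A side note from me, not from the article, and based on my understanding of the Source console: the engine exposes a mat_dxlevel console variable, so typing mat_dxlevel in the developer console with no arguments should print back which path the engine actually picked for your card.)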

I'm running a GeForce FX 5900xt overclocked to 5950 Ultra speeds for the time being, which, according to this list, will be using the DX8.1 path.

Errr....I'd rather not? What can you do to force the DX9 path? What kind of performance penalty would be involved?
 
Anand mentioned doing that in his review today (with very poor results). FiringSquad also did it in CS:S. I'm not sure how to do it, but it's possible.

anandtech review said:
To give you a little preview of what is to come, in DirectX 9 mode, the GeForce 5900 Ultra offers about 1/3 of the performance of the slowest card in this test. If you’re unfortunate enough to have purchased an NV3x-based graphics card, you’re out of luck with running Half-Life 2 using the DX9 codepath (at any reasonable frame rates).
 
OK, here ya go: right-click on HL2 in the "Play Games" list and select "create desktop icon" to get a shortcut on your desktop. Then right-click that icon, go to Properties, and add -dxlevel 90 at the end of the Target line. That will force DirectX 9.0b.
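
For illustration, the Target line ends up looking something like this (the install path and the -applaunch app ID here are just a guess at a typical Steam setup, so adjust them to match yours):

  "C:\Program Files\Valve\Steam\steam.exe" -applaunch 220 -dxlevel 90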


But as for the outcome: you WILL lose about 50% performance, and it will probably kill your gaming experience unless you lower the settings or go down a resolution. I do not recommend doing this, though.
 
pxc said:
Anand mentioned doing that in his review today (with very poor results). FiringSquad also did it in CS:S. I'm not sure how to do it, but it's possible.

That's a little unbelievable somehow.

I mean, the 5950 Ultras do *so well* in Doom3, and are certainly quite excellent in...well, hell, virtually everything else out there right now. I mean, in Doom3 they handle 'High Quality' settings with 1024x768, 4xFSAA, and 8xAniso and still keep the average over 30fps.

And 3dMark03 (while everyone proclaims it's not a "valid benchmark") was certainly the heaviest shader workload to abuse the GeForce FX line to date, yet my 5900xt @ 5950 is pushing 7k in it.

Hard to believe a card that can do both those things can't manage playable framerates in HL2 using NO anti-aliasing with the DX9 path.
 
dderidex said:
That's a little unbelievable somehow.

I mean, the 5950 Ultras do *so well* in Doom3, and are certainly quite excellent in...well, hell, virtually everything else out there right now.
The difference: Gabe Newell only works at Valve. :p But really, he's been trash-talking the nv3x series for a long time.

And anyway, CS:S looked fine in DX8 mode, and HL2 should look pretty decent, too.
 
aZn_plyR said:
But as for the outcome: you WILL lose about 50% performance, and it will probably kill your gaming experience unless you lower the settings or go down a resolution. I do not recommend doing this, though.
Got a benchmark for the performance you are getting there?

I checked Anand's, FiringSquad's, and [H]'s reviews (all the ones I could find), and none benchmarked any GeForce FX cards.

FWIW, this is all a thought experiment for me - I do not have Half-Life 2 myself, nor do I plan on getting it. Single-player games just hold no interest for me; I'm quite happy with Planetside for now.

Still, it's a curious thing, and I'd like to see more info on the situation.
 
FiringSquad did benchmark CS:S in DX9 mode. I read the review last month. That should give you an idea, since it's the same engine. But HL2 is a lot more stressful than CS:S.
 
There might be a way... but why would you want to do that, since your card does not support DX9?

You will gain no benefit.
 
geekcomputing said:
There might be a way... but why would you want to do that, since your card does not support DX9?

You will gain no benefit.
Ummm....yeah, the GeForce FX cards *do* most definitely support DX9. Heck, I've been playing the Tribes: Vengeance demo with PS2.0 shaders cranked up...little hard to do that on DX8 hardware, no?

And a score of 7k in 3dMark03 is ONLY doable with DX9 hardware - you can't even run the last test without it.

GF-FX cards (at least, the 5900xt and up) handle DX9 *just fine* in every game out there EXCEPT Half-Life 2, apparently.
 
dderidex said:
GF-FX cards (at least, the 5900xt and up) handle DX9 *just fine* in every game out there EXCEPT Half-Life 2, apparently.
LOL Gabe Newell

Yeah, pretty much every other DX9 game at least runs acceptably. I upgraded from a 5900 (non-XT) @ 5950U that I used for about 10 months before I bought a 6800GT and a 6600GT, so I've also played many games on it.
 
Are you saying my 5900xt won't play HL2 well even with the settings turned down at 1024x768?
 
ratfood said:
Are you saying my 5900xt won't play HL2 well even with the settings turned down at 1024x768?
It will play fine in the default rendering path (DX8.1).

What Anand said is that when it is in DX9 mode, "the GeForce 5900 Ultra offers about 1/3 of the performance of the slowest card in this test." The slowest card was the 9700 Pro, getting between 85 and 121 fps in different benchmarks at 1024x768 with no AA or AF. That implies the 5900 Ultra would run at about 35 fps in DX9 mode at 1024x768 with AA and AF off.
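Spelling out the arithmetic: one third of 85 fps is roughly 28 fps, and one third of 121 fps is roughly 40 fps, so mid-30s is a fair ballpark for the average.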

Whether nvidia can "fix" this remains to be seen. I have a feeling that future drivers will have much better performance in DX9 mode if it's at all possible.
 
The 5xxx cards do well in Doom3 due to shader replacement. This is a fact; Carmack said so himself, and in DX9 games they're no doubt doing shader replacement too.
In EQ2, the lower-end cards run the game in DX8.1 mode. Not sure about the high-end stuff, but I know for a fact the 57xx and lower do it in DX8.1 mode.
Do you guys really think the NV3X was a decent product?
It's a well-known fact nvidia designed the card around DX8.1 with a little DX9; they didn't expect the R300...
 
The FX series was fine, except for DirectX 9 bugs like CS:S and HL2. Otherwise they were great, and IMO they beat the 9800 series in DirectX 8 games; what killed them was the buggy DirectX 9 support. And besides, even if you play in DX8.1, the only thing you will notice that is different is the water. There was a trick somewhere that let you run DirectX 9 water in DX8.1 mode with very little performance hit, but I don't know where :(


Oh yeah, and here are the benchmarks of the 5xxx series running in DirectX 9.0b:


http://firingsquad.com/hardware/geforce_fx_half-life2/

Yeah, over a 50% performance hit most of the time... sadly.
 
NV3x cores cannot do DX9 well. This is not new stuff. Sure, they can run the occasional pixel here and there, but overall it's simply worthless for a DX9 title. Just compile your own stuff and test it yourself :|
 
Cali3350 said:
NV3x cores cannot do DX9 well. This is not new stuff. Sure, they can run the occasional pixel here and there, but overall it's simply worthless for a DX9 title. Just compile your own stuff and test it yourself :|
Curious how those numbers look in HL2, though.

After all, a 5950 is still showing perfectly playable performance at 1024x768 with 4xAA and 8xAniso.
 
"Ummm....yeah, the GeForce FX cards *do* most definately support DX9"

ill be damned.. they do support dx9. i looked it up on nvidias site. i thought the fx only went to dx8.1

o well.. good call man.
 
aZn_plyR said:
The FX series was fine, except for DirectX 9 bugs like CS:S and HL2. Otherwise they were great, and IMO they beat the 9800 series in DirectX 8 games; what killed them was the buggy DirectX 9 support. And besides, even if you play in DX8.1, the only thing you will notice that is different is the water. There was a trick somewhere that let you run DirectX 9 water in DX8.1 mode with very little performance hit, but I don't know where :(


Oh yeah, and here are the benchmarks of the 5xxx series running in DirectX 9.0b:


http://firingsquad.com/hardware/geforce_fx_half-life2/

Yeah, over a 50% performance hit most of the time... sadly.
Besides the lackluster SM2 performance, the FSAA and filtering were shit.
You'd have to be an nvidia PR wannabe to think the FX series was anything but shit.
If you want to play DX8 games, get a DX8 card; if you want to play DX9 games, get a DX9 card.
Simple stuff, but the FX series can't run DX9 stuff at any decent rate.
 
dderidex said:
Got a benchmark for the performance you are getting there?

I checked Anand's, FiringSquad's, and [H]'s reviews (all the ones I could find), and none benchmarked any GeForce FX cards.

FWIW, this is all a thought experiment for me - I do not have Half-Life 2 myself, nor do I plan on getting it. Single-player games just hold no interest for me; I'm quite happy with Planetside for now.

Still, it's a curious thing, and I'd like to see more info on the situation.

"single player games" yes I consider CS:S a single player game, as well as HL death match :rolleyes: . ;) :p

~Adam
 
CleanSlate said:
"single player games" yes I consider CS:S a single player game, as well as HL death match :rolleyes: . ;) :p

Hmmm....I could answer:

Actually, so do I! When 90% of the people playing are cheating, you might as well be playing by yourself!

Alternatively, I could go with:

Actually, so do I! When you have less than 400 people in a single battle at a time, you might as well be playing by yourself!

(Sorry, pretty happy with Planetside right now - everything else just seems kinda...boring...in comparison. Plus, 'pay to play' keeps a rather enormous number of cheaters away, since getting caught involves actual money on the line for them.)
 
Prove me wrong; you should be name-raped to "NV3X lover", since no one in their right mind thinks it's anything but a glorified DX8.1 card.
All the facts point to it: games forcing it to run in 8.1 code, shader replacement, etc.
Name one good thing the NV3X has going for it.
 
dderidex said:
Hmmm....I could answer:

Actually, so do I! When 90% of the people playing are cheating, you might as well be playing by yourself!

Alternatively, I could go with:

Actually, so do I! When you have less than 400 people in a single battle at a time, you might as well be playing by yourself!

(Sorry, pretty happy with Planetside right now - everything else just seems kinda...boring...in comparison. Plus, 'pay to play' keeps a rather enormous number of cheaters away, since getting caught involves actual money on the line for them.)

If you think 90% cheat, then you must be new to CS. I've been around since beta 5 and I do, at times, get accused of cheating. Sure, I can be BS, but I'm not BS all the time.

90% of the players don't hack; maybe 15% of them do.

~Adam
 
I don't know why you can't believe the 59xx series cards will take a huge hit in HL2. They have been slower in every DX9 game I have seen, especially in Far Cry with the latest patch that fixes the graphical "bugs". The 9800XT is much, much faster than the 5950U.

CleanSlate said:
90% of the players don't hack; maybe 15% of them do.

~Adam

Sorry, that was just funny. I think you're 5% off there. :)
 
fallguy said:
I don't know why you can't believe the 59xx series cards will take a huge hit in HL2. They have been slower in every DX9 game I have seen, especially in Far Cry with the latest patch that fixes the graphical "bugs". The 9800XT is much, much faster than the 5950U.
I never said I didn't expect them to take a huge hit - it's common knowledge the shaders on them aren't EXACTLY 'state of the art'. Still, to get 7K in 3dMark03, 100+ fps in UT2k4 and 40fps in Doom 3 at High Quality with 4xAA....it's obviously still a very powerful card.

To get unplayable framerates in HL2 with no FSAA just smacks of inconsistency with the other tests we've seen.
 
Moloch said:
Prove me wrong; you should be name-raped to "NV3X lover", since no one in their right mind thinks it's anything but a glorified DX8.1 card.
You failed to even prove what you originally said:

Moloch said:
Simple stuff, but the FX series can't run DX9 stuff at any decent rate.
Yes, it can. It's obvious you have never owned an NV35/NV38 (5900/5950U), but I have. It played all the DX9 games I had just fine (Far Cry, Halo, TR:AoD) at very decent framerates. Of course other cards were faster, but what you said right above is completely and totally wrong.

How about you prove otherwise? :rolleyes: You can't because you're just a ranting basher. :90k: :D
 
dderidex said:
I never said I didn't expect them to take a huge hit - it's common knowledge the shaders on them aren't EXACTLY 'state of the art'. Still, to get 7K in 3dMark03, 100+ fps in UT2k4 and 40fps in Doom 3 at High Quality with 4xAA....it's obviously still a very powerful card.

To get unplayable framerates in HL2 with no FSAA just smacks of inconsistency with the other tests we've seen.

Well... UT2k4 isn't even a DirectX 9 game, so I don't know why you're bringing that one up. Neither is Doom3. That leaves you with 3dMark03. Don't you remember the huge flap last year about how poorly the nv3x did in that when the benchmark first came out? nvidia said the benchmark was flawed, but many said 3dMark03 was just ahead of its time, showing how little shader power the nv3x had. Of course you get above 7k now, after nvidia has spent years optimizing for it.
I can't believe someone here has not yet pointed out that you're trying to use 3dmark to predict game performance. How can you be on this forum and not have heard at least once that "3dmark does not predict real-world game performance"?

In any case, I wouldn't worry too much. Supposedly DirectX 8.1 looks almost as good as DirectX 9 in HL2.
 
pxc said:
You failed to even prove what you originally said:

Yes, it can. It's obvious you have never owned an NV35/NV38 (5900/5950U), but I have. It played all the DX9 games I had just fine (Far Cry, Halo, TR:AoD) at very decent framerates. Of course other cards were faster, but what you said right above is completely and totally wrong.

How about you prove otherwise? :rolleyes: You can't because you're just a ranting basher. :90k: :D

Moloch, you do BS like this all the time. Stop it, ya flaming... insert words with negative connotations here.

~Adam
 
I had a 5900XT for a little while. I was very surprised how well it did in certain situations. Everyone told me it sucked, but I was getting higher performance in Doom3 than a 9800XT. I was also getting high performance in games like Neverwinter, Morrowind, Wizardry, Painkiller, UT2k4. I don't know why everyone badmouths this card so much. I paid $180, and it overclocked to 5950 levels. This put me right up there with $300 cards.

I think a lot of people look at 1 or 2 worst case scenarios, and judge the card based on that. For the most part, the card was very nice.

I also think people feel the need to badmouth it because of the whole 5800 series fiasco. People get confused and think the 5800 series isn't any different from the 5900 series, and that all FX cards are the same. Well, there is a vast difference. The 5800 series was total crap, and the 5900 series isn't. It's as simple as that.

I own a 6800OC now, but only because my 5900XT died on me. If that hadn't happened, I'd still be happily gaming on my 5900XT. And no, I didn't notice a very large difference going from the 5900XT to the 6800OC. There was a difference for sure, but it wasn't huge.
 
dderidex said:
I never said I didn't expect them to take a huge hit - it's common knowledge the shaders on them aren't EXACTLY 'state of the art'. Still, to get 7K in 3dMark03, 100+ fps in UT2k4 and 40fps in Doom 3 at High Quality with 4xAA....it's obviously still a very powerful card.

To get unplayable framerate in HL2 with no FSAA just smacks of inconsistency with the other tests we've seen.

No, it doesn't. Doom3 isn't DX9, UT2k4 isn't DX9, and 3dMark03 scores can be inflated.

Name another heavy PS2.0 game that the 59xx series does well in. Far Cry is the only other game out that uses DX9 heavily, and it gets stomped by the 9800 series in heavy PS2.0 usage.

The 59xx cards are good at a lot of things, and very poor at a lot of things. It's only going to get worse when more games come out that use a lot of PS2.0.
 
Jonsey said:
...

In any case, I wouldn't worry too much. Supposedly DirectX 8.1 looks almost as good as DirectX 9 in HL2.

It looks just as good, just slightly different. Most people couldn't pick out which images were which if they were unlabeled.


Jicks said:
I had a 5900XT for a little while. I was very surprised how well it did in certain situations. Everyone told me it sucked, but I was getting higher performance in Doom3 than a 9800XT. I was also getting high performance in games like Neverwinter, Morrowind, Wizardry, Painkiller, UT2k4. I don't know why everyone badmouths this card so much. ...
I think a lot of people look at 1 or 2 worst case scenarios, and judge the card based on that. For the most part, the card was very nice...

It was/is a fad that took off, badmouthing the 59xx series, mainly by those that never used the cards.
 
pxc said:
You failed to even prove what you originally said:

Yes, it can. It's obvious you have never owned an NV35/NV38 (5900/5950U), but I have. It played all the DX9 games I had just fine (Far Cry, Halo, TR:AoD) at very decent framerates. Of course other cards were faster, but what you said right above is completely and totally wrong.

How about you prove otherwise? :rolleyes: You can't because you're just a ranting basher. :90k: :D
Why would I own such a piece of trash? Are you insane? It has shitty FSAA and AF; why would anyone in their right mind own one?
You're killing me here (laughing so hard).
The only way it plays DX9 games decently is shader replacement. Get that through your thick skull!
And Doom3 runs OK on the FX series because Carmack coded it to run well on cards that lack serious math power.
It has better FSAA performance because its FSAA is, uh... horrible.
Why wouldn't I bash such a poor product? You claim it's obvious I never owned one, but why would I? I'm not some nvidia PR wannabe who thinks the NV3X series was decent. The NV35 may have improved DX9 performance, but a 9600XT still runs Source-based games better. Funny, a mid-range card crushing a high-end card. :D
 
fallguy said:
No, it doesn't. Doom3 isn't DX9, UT2k4 isn't DX9, and 3dMark03 scores can be inflated.

Doom 3 isn't DX9, no, but AFAIK the ARB2 path makes extensive use of shaders similar to DX9 shaders.

And while 3dMark scores CAN be inflated, there is no evidence they have been. Hell, the last driver release from nVidia (66.93?) just got approved by Futuremark as containing no cheats or invalid optimizations, so....

Name another heavy PS2.0 game that the 59xx series does well in. Far Cry is the only other game out that uses DX9 heavily, and it gets stomped by the 9800 series in heavy PS2.0 usage.
But the point is that it's still quite playable.

And it does great in Tomb Raider, Painkiller, Halo, Far Cry, etc.

Again, my point is not that this should be faster than ATI's cards or anything - of course I'm not suggesting that.

However, to say that this game can run fine on a Radeon 9600 Pro in DX9 mode while a GeForce FX 5950 provides unplayable framerates in DX9 mode just boggles the mind. Hell - the 5950 beats the 9600 XT even in Far Cry! I mean, look at those charts I linked - the 5950 is always faster than the 9600 XT - always faster, every game, regardless of WHAT shaders are used. And for Valve to then come along and say the 9600 Pro works fine in DX9 mode but the 5950 does not.....that just seems impossible.

The 59xx cards are good at a lot of things, and very poor at a lot of things. It's only going to get worse when more games come out that use a lot of PS2.0.
I disagree. Properly coded PS2.0 games can work just fine on the 5950.
 
Moloch said:
Why would I own such a piece of trash? Are you insane?
Nothing you post is accurate or even worth reading. Bye.
 
Moloch said:
And Doom3 runs OK on the FX series because Carmack coded it to run well on cards that lack serious math power.
Ah, that must explain why the 5950 wins over the 9800 Pro in ALL these games, and frequently beats the 9800xt too, right? (Notice that I'm pulling from 3 different review sites to eliminate claims of reviewer bias.) I could keep going, if you want?

I mean, it sure looks like the FX 5950 is 'crap' to me, doesn't it? It only pwns the 9800 Pro in every one of those games, and beats the 9800xt in the bulk of them. If that's 'crap', what would you call the 9800?

Now, just to be clear, I'm not some blind fan-boy (unlike some others posting in this thread *cough* not naming names or anything, just saying *cough*) - I fully realize that the 9800 is a great card and beats the 5950 in just as many games as it loses in.

But that's the point - they are equivalent cards. One is not 'crap' compared to the other. Depending on what games you play, the 5950 may well give you better performance than the 9800 - or, the opposite may be true. They ARE equivalent cards, like it or not.
 
pxc said:
Nothing you post is accurate or even worth reading. Bye.
So you think Valve, FM, and the guy who makes ShaderMark are all in a plot to make the otherwise excellent FX series look like shit?
dderidex said:
Ah, that must explain why the 5950 wins over the 9800 Pro in ALL these games, and frequently beats the 9800xt too, right? (Notice that I'm pulling from 3 different review sites to eliminate claims of reviewer bias.) I could keep going, if you want?

I mean, it sure looks like the FX 5950 is 'crap' to me, doesn't it? It only pwns the 9800 Pro in every one of those games, and beats the 9800xt in the bulk of them. If that's 'crap', what would you call the 9800?

Now, just to be clear, I'm not some blind fan-boy (unlike some others posting in this thread *cough* not naming names or anything, just saying *cough*) - I fully realize that the 9800 is a great card and beats the 5950 in just as many games as it loses in.

But that's the point - they are equivalent cards. One is not 'crap' compared to the other. Depending on what games you play, the 5950 may well give you better performance than the 9800 - or, the opposite may be true. They ARE equivalent cards, like it or not.
Why are you only showing 1024? :confused:
3dMark03? Are you joking?
The 5950 will win in games that are either OpenGL or don't require a lot of DX9 power; btw, Far Cry isn't a heavy DX9 title.
The FX series has laughable FSAA, btw, so you can't compare ATI's 4x FSAA to nvidia's 4x; ATI's 2x should be used.
Load up some DX9 games and let's see who's winning.
You nvidia guys are in total denial. You are using DX8.1 games and/or nvidia-sanctioned games to prove your point, and most of the games have such high framerates (at the lowly 1024 res) that the shitty texture filtering and FSAA of the FX is the dealbreaker for the FX series.
 
There's no question that the FX series was horrible in DX9. Alright, yes, we get it; we got it when Brent said it for the thousandth time as well.

However, all DD is saying is that the 5900 series is decent enough to play any game out there with full DX9, and I agree, but the framerates are extremely lacking IMO.

~Adam
 
Jicks said:
I had a 5900XT for a little while. I was very surprised how well it did in certain situations. Everyone told me it sucked, but I was getting higher performance in Doom3 than a 9800XT. I was also getting high performance in games like Neverwinter, Morrowind, Wizardry, Painkiller, UT2k4. I don't know why everyone badmouths this card so much. I paid $180, and it overclocked to 5950 levels. This put me right up there with $300 cards.

I think a lot of people look at 1 or 2 worst case scenarios, and judge the card based on that. For the most part, the card was very nice.

And no, I didn't notice a very large difference going from the 5900XT to the 6800OC. There was a difference for sure, but it wasn't huge.

Again, none of those games you mentioned were DirectX 9 (maybe Painkiller, but I don't think so). You're right, the R3xx cards were only vastly better than the NV3x cards in a couple of cases, because there were only a few games that had DirectX 9 support. And if you didn't notice a big difference going from the 5900XT to the 6800, well, congratulations! You're easy to please!

Before, the hoopla about the nv3x not being good at DirectX 9 didn't mean much... because there weren't that many games that needed it. Now that DirectX 9 is being used more, the card suffers. Why complain? You got a year or two of good use out of the card, and even then you knew it wasn't future-proof. A year is a long time in the video card industry. :cool:
 