Lost Planet: Post your Snow & Cave results

Eastcoasthandle ([H]ard|Gawd, joined Jan 27, 2006, 1,041 messages)
Side note before you play this game on XP:
If you get an xinput1_3.dll error, download it here and install it in C:\windows\system
If you get a d3dx9_33.dll error, download it here and install it in C:\windows\system32
When configuring your keys, do not use the mouse button to confirm them, as it may throw you back to the desktop. Use the Enter key instead (I used the Enter key on the keypad, if that makes a difference...)
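Not part of the original tip, but here is a minimal Python sketch (the C:\Windows\system32 path is an assumption based on a default install; adjust for your own setup) that simply checks whether the two DLLs the demo complains about are already present before you go hunting for them:

    # Minimal sketch: check whether the DLLs the Lost Planet demo wants are present.
    # Assumes a default C:\Windows install; adjust the folder for your own system.
    import os

    system32 = os.path.join(os.environ.get("SystemRoot", r"C:\Windows"), "system32")

    for dll in ("xinput1_3.dll", "d3dx9_33.dll"):
        if os.path.isfile(os.path.join(system32, dll)):
            print(dll, "is already there")
        else:
            print(dll, "is missing - grab it or install the DirectX runtime update")

If either one comes up missing, the April 2007 DirectX runtime update mentioned below installs both properly.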
:D

 
Or you can just install the DX9 April 07 runtime update.

I have an older system: Opty [email protected], 2GB of RAM, X1900XT at stock clocks. Mipmap set to highest, and HQ AF on.

Stock game settings other than 4xAA/16xAF. 1280x720.

Cave: 37
Snow: 34
 
The 2900XT running the 8.37.4 video drivers won't render smoke, fire, or the falling snow. Not sure how websites have done their benchmarks with Lost Planet DX10 and called it a valid test, unless they are running the 8.38 drivers, which the regular Joe doesn't have access to yet.
 
I doubt optimization will result in a 100% performance increase.

Indeed - and even if it did, it'd be behind the GTX. Seriously, why even bother taking a chance on ATi when the Nvidia cards can be seen to work perfectly?
 
I am sure that when the drivers are mature we will see a nice bump in performance. 100% is not a requirement LOL

LOL, because the HD2900XT was never intended to be a flagship card competing against the GTX. Neither was the 1900XT/1950Pro ever a GTX contender; that's common sense.
 
LOL, no matter what was originally intended, it IS ATi's flagship card right now, just as the Ultra is Nvidia's. The Ultra stomps the XT, as does Nvidia's second-best card, the GTX, and to a lesser extent their third, the GTS 640. Their fourth-best card, the GTS 320, is beaten by ATi's number one in some tests, maybe you could say most, but that's really nothing for the red team to be proud of.

The 1900XT was almost exactly on a par with the 7900GTX; I don't know where you're getting the information that it wasn't. Besides, just about any X1900XT could clock to XTX speeds, I have two doing just that. The Pro of course wasn't as fast, but that's a different card altogether.
 
LOL, I don't see that written on the official website. Where are you getting your information from besides hearsay?
 
Is it just me, or do they have the Zalman on that ATI system oriented so it's directly sucking the heat off of the video card? Wouldn't that kinda heat up your processor? I mean, especially with the oven the 2900XT produces.
 
Actually, the GTS 320 OC cards beat the 640 cards sometimes too!
 
I'm not sure what your point is here. The fact that the 2900XT is ATi's flagship card is self-evident. If your view is that they have a better card to be their flagship product, please provide a link where I can buy one.
 
LOL, oh really? It's evident you understand exactly what my previous post says. But you stick to what's evident to you if you like. In the absence of truth, it's very apparent that your self-evidence is nothing more than just an opinion, not a fact.
Anyone else have any FPS results from this game?
 
Well, if you look at the preliminary Call of Juarez benches, they showed the GTX with abysmal FPS compared to playable FPS on the 2900XT.
Now if you look at the latest CoJ benches with better Nvidia drivers, they show the GTX winning against the 2900XT. These weren't 'merely' 100% increases from better-optimized Nvidia drivers, but more like 4000% increases.
It just goes to show how these inaugural benches with a bias really mislead all consumers, not just detractors but supporters too.
Taken with a grain of salt, anyone? These games aren't even out yet; give it time. All these advance, uniquely-optimized-for-specific-hardware demonstrations are the equivalent of muckraking in the world of journalism.
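Just to put the scale of those driver jumps in concrete terms, here is a tiny Python sketch; the percent_increase helper and the before/after FPS numbers are made up for illustration, since the actual CoJ figures aren't quoted in this thread:

    # Percent increase from an old FPS figure to a new one.
    # The sample numbers are hypothetical, purely to show the scale of the claims above.
    def percent_increase(old_fps, new_fps):
        return (new_fps - old_fps) / old_fps * 100

    print(percent_increase(20, 40))  # 100.0  -> a "merely" 100% driver bump
    print(percent_increase(1, 41))   # 4000.0 -> the kind of jump described above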
 
To the results... all I can say is WOW.

NVidia dominates ATi in those tests... and NO driver update is going to bring a card up 40-60 FPS.

So it is confirmed... R600 = FX5900.

It doesn't matter though, as (hopefully) the refresh from ATi will be out before major DX10 titles hit.
 
Yeah, saw this last night. Pretty embarrassing performance from ATi at the moment. Given their PR man's preamble, they obviously got caught with their pants down here. The same thing happened to Nvidia on the CoJ preliminary bench, but once they had time to react and tailor better driver support, it looked a lot better.

However, and not that this should save face for ATi's weak-sauce FPS, just looking at those screens shows that the cards from both camps aren't even trying to do the same things. I've played Lost Planet on the 360 and the motion blurring is quite dense; it even irritates my eyes unless I back up far enough from the 57" HDTV. The 2900XT screenshots convey this effect much more accurately than the GTX. In fact, it doesn't look like any motion blurring is going on at all in the GTX shot, does it?
Don't get me wrong here, motion blur is obnoxious and I PREFER the GTX screen... but when it comes to specifically displaying intense effects designed to tax a top-of-the-line GPU, the comparison doesn't seem fair. It's like timing two athletes in the 40-yard dash with one of them running waist-deep in a pool. Gee, I wonder who would win, the dry guy or the soggy guy.
 
IF you read the NVidia part, you would see that there is indeed motion blurring...

AND if you read FURTHER, for the cave part they compared ATi with and without motion blur, and the improvement was only 5 or so FPS.
 
I DID read. However, is it not plain to see that the effect is radically different just by using your eyes? I mean, I guess if I have to break my point down into yet another analogy, it's like using someone else's glasses with a completely different prescription.

Anywho, all I'm saying is I don't really expect ATi to fix this and entirely flip the script performance-wise and come out in front, but this is totally reminding me of some early CoJ benches.

Look:
http://bp2.blogger.com/_BabjUDZIqPw/RjL0iGGObII/AAAAAAAAACs/omHwk7C2K8c/s1600-h/004.jpg

When all other benches show that the GTS is in fact comparable to the 2900XT [which itself is comparable to a GTX in some games] ad nauseam[!], you can tell when something is bananas.
 
No, I understand that completely. I was, however, stating that blur or no blur, ATi didn't improve much and was still 60 or so FPS behind.

I for one think that it's a graphics glitch for ATi, or they are way overdoing the blur effect (it was on low...). I actually like NVidia's high blur... the blur seems much more realistic, and NVidia still holds a solid 80 FPS.
 
The linked article clearly states that the ATi card is not rendering several things. It is most assuredly a glitch, not a choice.
 
Man, this game makes my 1900XTX cry for mercy, and the fan spins at 90% just to keep it under 75C.

Anyways:

Snow: 42
Cave: 34

System specs below.
 
The ATI implementation of the motion blur is over-exaggerated. I've seen both in person, and I can tell you that the ATI one is borked, while the GTX/GTS motion blurs only when relevant (at the right times). The ATI picture LOOKS like "oh, there's motion blur," but trust me, it's going the whole time, whether you're moving or not, lol. It's corruption.
 
There seems to be a CF bug:

X1900 Crossfire - Snow 26, Cave 29
X1900 Solo - Snow 28, Cave 32

Both with Opty 170 @ 2.4GHz, 2GB RAM, XP x64, X1900s @ 655/725, 4xQAAA, 16xHQAF, others at default
 
Did you enable multiple GPUs in the benchmark settings?
 
x1800xt 512 @ P.E. Clocks (default in-game settings):

Snow: 25 FPS
Cave: 25 FPS

Man this game kicks my computer's ass.
 
lol, this game runs like crap on my machine... got like ~18 FPS
at 1360x1016 resolution (even though I changed it to 1680x1050 in the config??)

Windows XP Professional SP2
AMD 4200 X2
X1800GTO (7.4 driver)
2GB DDR400 RAM
 
Interesting. Perhaps there is a bug.

I don't think it's the demo; I think Cat 7.4 is fubared. I've just benched Doom 3, FEAR, and Dungeon Siege at 1600x1200 with everything on, and none of them show any difference with Crossfire on. To be honest, it might have been broken for a bit; I don't often play games that actually need both cards. I'll bench with Cat 7.3 and post back.
 
Cat 7.4 is indeed useless, at least the x64 version is. With 7.4 I get a max of 63 FPS in FEAR with one card or both; with 7.3 I get 170 with one card and 210 with two. I can't run the LP demo on Cat 7.3 though, I get a corrupt screen. It runs on 7.4, but not in true CF mode. At this point I'm ready to believe the 2900XT IS just limited by crap drivers - the ones for the X1900 are rubbish and that card's been out a year!

Edit - single-card Cat 7.3 results: 37/35 - that's a 50% improvement on Snow. If I can actually get CF working, I should be able to get a smooth 60 FPS in both tests... stupid ATi drivers!
 