2900XT and BioShock.

You have Overdrive enabled... Every time I try to enable that feature I get a huge performance drop.
 
Here's the other screen you posted at the top of the elevator.
boishock4.jpg

I used the Overdrive feature "run automated clock configuration utility" and it set it to GPU 858MHz / memory 898MHz. That's up from GPU 743MHz / memory 828MHz.
 
I ran the utility before and it maxed out the settings, but when I ran 3DMark06 and Call of Juarez I noticed that my performance would lag, and that's with no AA or AF.
 
AMD Athlon 64 X2 6000+ *XP-90*
ASUS M2N32 SLI Deluxe
OCZ Platinum Revision2 (2x1GB)4-4-4-12 T1 DDR2 800
ATI Radeon HD 2900XT 512MB
Seagate Barracuda 7200.8 200gb Serial ATA
Sound Blaster Audigy 2 ZS
OCZ GameXStream 600W
Alienware Case (old Chieftec style)
Klipsch ProMedia ultra 5.1
NEC Multisync FE2111sb

All stock speeds except the Overdrive. Also, I had 5.1 sound with all the Creative EAX audio on when taking these screens, if that matters.
 
I know Nvidia also optimises its drivers for individual games, and there's nothing wrong with that, but with ATI it seems that it's almost unplayable before getting some kind of fix or tweak in the drivers.
Methinks in this instance it might have something to do with prior access to the game. Or I assume such, since the demo actually has an NVidia splash screen in it. ;) But what does it matter if the hotfix is out the day of, or even the week of, release? *shrug* Do you really think that NVidia didn't tweak as well?

P.S. I know bandwidth is a consideration, but I'd like to see these screenshots in PNG. JPEG does mess with things. Take for example that first 8800 screenie coming out of the elevator: I very much doubt that those halos of distortion appear around the head of the pipe wrench in the original. For my part, I'll see if I can put up a comparable screen from my 2900XT later today or tomorrow. I just have to figure out a spot where I won't immediately kill my bandwidth. How many views do people find they get around these parts? So I can budget for posting pics.
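Those halos are classic lossy-compression artifacts. As a minimal sketch of the difference (toy bytes standing in for pixels; PNG really is built on zlib/DEFLATE, while the quantization below is only a crude stand-in for what JPEG does):

```python
import zlib

# Toy "image" data: pixel intensities with sharp edges, like the
# outline of a pipe wrench against a wall.
pixels = bytes([10, 12, 11, 200, 198, 201, 50, 52, 51] * 8)

# PNG-style lossless compression (PNG uses DEFLATE, as zlib does):
# every byte survives the round trip exactly.
lossless = zlib.decompress(zlib.compress(pixels))
assert lossless == pixels

# Crude stand-in for JPEG-style lossy quantization: snap values to the
# nearest lower multiple of 16. Fine detail is gone for good, which is
# where halos and blocking come from.
lossy = bytes((b // 16) * 16 for b in pixels)
assert lossy != pixels
```

Screenshots meant for image-quality comparisons should therefore be captured and posted as PNG; re-saving a JPEG as PNG doesn't bring the lost detail back.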
 
Just wanted to post this for you guys. Google "bioshock rootkit": you may be infected with Sony's SecuROM crap, even from the demo. From what I've gathered, it could depend on where you got it from. I got my demo from Nvidia's site. Other people got it from Steam as well, and some have reported the registry entries (positive matches). So be careful. I'm sure this stuff is in the retail copy, as you only get 5 installs, but crap like this shouldn't be in FREE software. I checked mine and I'm not "infected", although there are plenty (who got just the demo) who are.
 

I got the Steam version of the full game, and it looks like the rootkit wasn't installed.
 
I get display corruption from time to time, but I just reboot and it works fine.

Otherwise I have no issues running at 1680x1050 with max details and 16xAF HQ. It never drops below 30 fps.
 
Hmmm.....i'm thinking of getting this card (or 2):p

How do you think one of these would compare to my x1900xt CrossFire setup?

I run everything (that will run at) 1920x1200; how does this card stack up?

I know the GTX is faster, but apparently that's in DX9 titles, and DX10 titles supposedly close the gap a bunch.

Do the newer drivers close the gap with nvidia?

thanks all :p
 

newer drivers haven't improved anything.

http://www.hardforum.com/showthread.php?t=1215427

Don't bother upgrading right now unless you really need some extra performance. DX10 hasn't really been used in any games in a way that makes a significant difference in IQ or performance.
 

That's what I was wondering. I'm looking to get the equivalent (or close to) of the performance of the x1900xts while cutting the power down, as I'd much rather have one card right now than two, just for power's sake.
I really don't want to go the Nvidia route as my board is already a CrossFire board, and I think I'm OK on the CPU for a while before I jump to a quad sometime around the end of the year.

This card seems to be getting good reviews at the 'egg, even though I take user reviews with a grain of salt.

thanks all :p
 
From what I gather it isn't a "rootkit" at all. :rolleyes: It is just a service. A perfectly valid and very commonplace type of executable that protects the OS kernel by allowing regulated and specific admin/root type access for tasks initiated by the user.

But anti-virus/anti-spyware tends to get overzealous at times. I get tech support calls like that from time to time for some software I wrote/maintain, where for some reason PC-cillin or something has added something new to its "wanted" list and freaked out. The most extreme example in the news recently was an antivirus eating a critical system file from the Chinese version of Windows XP (IIRC) and turning machines into unbootable bricks.
 
SuperKeijo said:
I know Nvidia also optimises its drivers for individual games, and there's nothing wrong with that, but with ATI it seems that it's almost unplayable before getting some kind of fix or tweak in the drivers.

It's called Application Detection. Both ATI and Nvidia do it.
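Conceptually, application detection is just the driver matching the name of the launching executable against a table of per-game profiles and applying tweaks before the game renders a frame. A toy sketch of the idea; the profile entries and settings here are invented for illustration, not real driver internals:

```python
# Hypothetical per-game profiles keyed by executable name. Real drivers
# ship these as opaque binary profiles, not readable dicts like this.
PROFILES = {
    "bioshock.exe": {"aa_mode": "forced_multisample", "shader_replacements": True},
    "crysis.exe":   {"aa_mode": "app_controlled",     "shader_replacements": True},
}

DEFAULTS = {"aa_mode": "app_controlled", "shader_replacements": False}

def profile_for(exe_name: str) -> dict:
    """Return the tweaks the driver would apply for a launching process."""
    return PROFILES.get(exe_name.lower(), DEFAULTS)

print(profile_for("Bioshock.exe"))  # matched: game-specific tweaks
print(profile_for("notepad.exe"))   # no match: plain defaults
```

This is also why renaming a game's executable has historically changed benchmark results: the lookup misses and the game runs on the default path.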
 
I've been running the game at 16x12 with all settings on high on my 1950xtx and getting between 40 and 75 fps about 98 percent of the time. It runs really smooth for me and looks great.
 
I have a question for you 2900 XT owners: Have you been able to get AA while running Bioshock in DX9 mode? I have the hotfix, and it just doesn't seem to actually work (AA that is).

And BTW: The game looks and runs fabulously on the 2900 XT. I won't say how it compares to the 8800s in my testing, but I will say that there are no performance or image quality problems with the 2900 XT that I have been able to detect.

EDIT: Nevermind. Found the fix. :|
 

I've seen the 8800 and the 2900XT side by side... the game looks a bit washed out on ATI for some reason; it literally does look more vivid to me on the 8800. There's no disputing ATI's DX9 performance lead though, even though they get owned in DX10.
 
Is it because of ATI's drivers that there is no AA in DX10 yet? So they will eventually have it?

I have to game at 1680x1050 and I'm having the hardest time deciding on a DX10 card. I have all my parts but the video card, and I hate waiting on just one part to get my new rig going. I almost want to get the cheapest DX10 card out there just to be able to run Vista x64, and I just won't play any games until a good DX10 card comes out. I'm not about to make a $300-500 mistake based on what I think future drivers might do.
 

That's why I've always - always - bought motherboards with onboard video and DVI when making my system builds. I upgrade CPUs and RAM over time, sure, but not NEARLY as often as I upgrade my video card. And having a good onboard solution means you always have a fallback to keep your OS running for work (and such). Heck, I even played pretty big parts of 'Doom 3' on an ATI Xpress 200 onboard. :D (Honestly, it didn't suck as much as you'd think. Sure, I was at 640x480 with FSAA and AF off...but all the in-game options were on 'medium' or 'high', and it still looked good.)

ANYWAY, if you do not have that option, having an el cheapo video card just on hand is never a bad idea. Handy to have 'between upgrades', useful if your main card dies for some reason, etc. And both ATI's Hypermemory and nVidia's Turbocache parts are pretty darn cheap.
 
Is it because of ATI's drivers that there is no AA in DX10 yet? So they will eventually have it?
I don't have the quote handy, but a 2K guy said they pulled AA from the release because they were having trouble getting it to look good on the 8800. So no, it wasn't because of ATI that BioShock doesn't have DX10 AA. It is just the realities of new software for a new standard. *shrug* Keep in mind that NVidia has only relatively recently gotten their Vista (ergo DX10) drivers stable (as in not crashing, or glitching with no screen update for a second or more at a time). Yes, about 9 months after Vista shipped. Little wonder on the dearth of DX10-enabled games, especially ones taking full advantage: ATI hadn't released hardware and NVidia hadn't released stable software. But with both of those now addressed, I'd be surprised if we didn't see a BioShock patch before too long with DX10 AA.
I have to game at 1680x1050 and I'm having the hardest time deciding on a DX10 card. I have all my parts but the video card, and I hate waiting on just one part to get my new rig going. I almost want to get the cheapest DX10 card out there just to be able to run Vista x64, and I just won't play any games until a good DX10 card comes out. I'm not about to make a $300-500 mistake based on what I think future drivers might do.
*shrug* Seems a pretty good bet given how DX9 performance has shaken out over the last few months.
I've seen the 8800 and the 2900XT side by side... the game looks a bit washed out on ATI for some reason; it literally does look more vivid to me on the 8800. There's no disputing ATI's DX9 performance lead though, even though they get owned in DX10.
The same textures come out differently on one card or the other. Sometimes that means it looks washed out on the 2900, and other times it'll look overly dark on the 8800. However, you can mostly compensate for this with different monitor settings. For optimal image quality you should be tweaking your monitor settings from card to card.
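A rough sketch of why monitor tweaking can compensate: the card/monitor chain maps each channel value through a transfer (gamma) curve, and two different curves render the same texture lighter ("washed out") or darker. The numbers here are illustrative only, not measured values from either card:

```python
# Map an 8-bit channel value through a gamma curve; gamma > 1 darkens
# midtones, gamma < 1 lifts them (the "washed out" look).
def apply_gamma(value: int, gamma: float) -> int:
    return round(255 * (value / 255) ** gamma)

mid_gray = 128
print(apply_gamma(mid_gray, 1.0))      # identity curve: value unchanged
print(apply_gamma(mid_gray, 2.2))      # darker midtone
print(apply_gamma(mid_gray, 1 / 2.2))  # lighter, washed-out midtone
```

Adjusting the monitor's own gamma/brightness in the opposite direction cancels most of a per-card difference, which is why the same panel can be made to look close on both.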
 
I just started BioShock last night and stopped at the Arcadia level. My system runs this game smoothly with max detail and forced global lighting at 2560x1600.
 
The bottom system I have listed: a x1900xtx @ 675/801 and a C2D quad @ 3.4 with RAM @ 953. At level 8 now, and BioShock continues to run smooth to this point.
I read it is the same engine as Rainbow Six Vegas, but that ran best at 1920x1200 with max detail and 4x filtering.
 

I find it hard to believe that an x1900xtx does that well @ 2560x1600 when even an 8800 Ultra gets a 44.6 fps average and 25 fps minimum at the same resolution.

What settings are you using exactly and what fps?

And do you use the good lighting, or the bad lighting?
 
Everything set at max with forced global lighting. I don't know the frame rate numbers, as they don't matter to me; the game runs smooth and I see or feel no lag.
Believe it or not.
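For anyone who does want numbers, avg/min fps figures like the Ultra's quoted above are derived from per-frame render times. A quick sketch of how a benchmark computes them (the frame times are made up for the example):

```python
# Per-frame render times in milliseconds (made-up sample data).
frame_times_ms = [16.7, 16.9, 17.1, 40.0, 16.8, 16.6, 22.0, 16.7]

# "Average fps" is total frames over total elapsed time, which is not
# the same as averaging per-frame fps when frame times vary.
avg_fps = 1000.0 * len(frame_times_ms) / sum(frame_times_ms)

# "Min fps" is set by the single slowest frame (the 40 ms spike here).
min_fps = 1000.0 / max(frame_times_ms)

print(f"avg {avg_fps:.1f} fps, min {min_fps:.1f} fps")
```

A run can have a healthy average yet still feel laggy if spikes like that 40 ms frame cluster together, which is why min fps (or a frame-time plot) says more about perceived smoothness than the average does.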
 
Ah well you are using the lower lighting setting so I could see it being more playable.

Global lighting on = Forces the game to not use dynamic lighting.
 
I did a comparison of the two settings and global looked better at install, but that was just a brief run-through. I'm gonna try the other option for a while, but if that causes lag I may resort to the lower option. I didn't notice much performance difference during my install runs.
Thanks for the info.
 
A quick run with the better lighting at the last checkpoint: it runs as smooth, maybe smoother.
Hope it doesn't change.
All games need to run this smooth; it looks good too.
These x1900s seem to be able to run low frame rates smoothly.
 
I just bought a 2900XT 512MB card even after reading all the reviews and seeing the performance charts comparing the 8800s. Some reviews hail the 2900 and some bash it, but I know from being in the industry that some reviews are always going to be biased, and some may well tell the truth. I did notice that the 640MB GTS was in front of the 2900 in some games, but I bought mine for a lot of other reasons. The 2900 might not beat the GTX, but it's also no slouch. Far from it, and at $385 CAD (no tax) it fit the bill for nice price/performance. I don't really care that the 8800s are achieving 80 fps in BioShock while the 2900 may sit at 60, maybe higher. I haven't had a chance to test mine out, but I know it's going to run smooth at high settings. To each his own; I'm not biased to either ATI or nVidia, I make my purchase on whatever fits my needs and remains the most cost effective. In the past ATI has had the edge, maybe nVidia has it now, but point being, the 2900XT and 8800 cards are both strong competitors. No reason to bash either card.
 
I just started BioShock last night and stopped at the Arcadia level. My system runs this game smoothly with max detail and forced global lighting at 2560x1600.

Forced global lighting on is a LOW spec tweak.. for best visual quality you want that OFF.

Not very intuitive by the developers, I know.
 
I just bought a 2900XT 512MB card even after reading all the reviews and seeing the performance charts comparing the 8800s. Some reviews hail the 2900 and some bash it, but I know from being in the industry that some reviews are always going to be biased, and some may well tell the truth. I did notice that the 640MB GTS was in front of the 2900 in some games, but I bought mine for a lot of other reasons. The 2900 might not beat the GTX, but it's also no slouch. Far from it, and at $385 CAD (no tax) it fit the bill for nice price/performance. I don't really care that the 8800s are achieving 80 fps in BioShock while the 2900 may sit at 60, maybe higher. I haven't had a chance to test mine out, but I know it's going to run smooth at high settings. To each his own; I'm not biased to either ATI or nVidia, I make my purchase on whatever fits my needs and remains the most cost effective. In the past ATI has had the edge, maybe nVidia has it now, but point being, the 2900XT and 8800 cards are both strong competitors. No reason to bash either card.

Well, that's not really entirely true. If the 2900XT were a well-rounded card that:

1. Didn't drop to its knees the moment AA was turned on

2. Actually got decent CrossFire support, so the latest games actually scale on ATI

3. Ran DX10 even remotely close to its DX9 performance (currently it's 1/3 to 1/2 in most games)

then I'd understand why people such as yourself say it's all a matter of preference, etc.

Unfortunately, those are problems ATI has to contend with, and the lack of roundedness is not exactly appealing for the end user (assuming they find out about it before they buy the product and aren't conned on the spot at Best Buy by some salesman trying to make his quota by feeding them lies).

Yes, it sucks, but facts are facts. I'm not trying to say it's a garbage card; it's far from it. But let's be fair here and not fall into this hippy love commune-fest of "oh, they both have their strengths and weaknesses, it's just preference". It isn't so simple, and you can't just wish problems like those I've listed away.
 

Yeah, he says he bought it "for a lot of other reasons" and that it "fits his needs." What reasons? What needs? Why buy an expensive gaming card for anything other than gaming performance?

Comments like this sound to me like, "I want ATi more than nVidia, so I'm going to buy it anyway as long as it doesn't totally suck" with a lot of rationalization slathered on top.

P.S. the hippy lovefest bit was classic, as well as on-target.
 
I don't know what frame rates I get, but at 1920x1080 with all the settings on high (AA off, of course) and global lighting disabled, my x1950pro feels pretty fluid, like I was playing Quake 4 or Doom 3 at 1280x1024 :). I was honestly surprised; I guess I'll get a little more out of my x1950pro than I thought ;)
 
I've got this overclocked GTS that beats the 2900XT in everything else, but U3 games are still owned by the 2900XT. Playing BioShock is much smoother on the 2900XT.
 

Owned? All two released UE3 games, one of which is a totally unoptimized console port?

Also, how can you make a blanket statement like that? In DX9 the 2900 is faster, but it gets owned in DX10 by the GTS.
 