BioShock 2 Gameplay Performance and Image Quality @ [H]

FrgMstr

Just Plain Mean
Staff member
Joined
May 18, 1997
Messages
55,664
BioShock 2 Gameplay Performance and Image Quality - BioShock 2 is here for your damp dark pleasure. BioShock, its predecessor, was a feast for the eyes when it was launched back in 2007, but does the aging engine stand up well to today's video cards and today's expectations? We'll check it out with six of today's best video cards. Now, who's your Big Daddy?
 
It's like seeing the game industry go backwards. No native AA in the latest titles? Great. What's next, forcing AF through the control panel? I believe we'll see that eventually.
 
We need to push developers to leave DX9 behind. In Bad Company 2 PC, you can only enable AA in DX10 and DX11, because DICE are decent people who know the benefits of using AA in a DX10/DX11 environment.

Another Swedish studio, Avalanche, developers of Just Cause 2, is also moving on from DX9. If I read their words right, they're leaving DX9 behind completely for the PC version, all for the benefits of DX10 and above.

Since there's no website like [H] when it comes to PC gaming, I recommend you guys try to score an interview or two with these pioneers, these people who dare shun the soon-to-be nine-year-old operating system that people can't seem to let go of.


e: Another thing the [H] crew has to do for the next engine Epic shits out is to ensure they don't fuck up the AA implementation the way they did with this one, again.
 
One of the things that I want to reiterate is just how hit and miss texture quality truly is in BioShock 2. Some of the areas look good, while others have walls or floors with such horribly blurry textures, with no normal maps and only specular highlighting applied, that it's like a punch to the eyes. Even when turning down the resolution to the Xbox 360/PS3's native 1280x720 or 1280x800 (but why?), these offending textures stand out. If it weren't for this glaring fact, along with the FOV issue appearing again, BioShock 2 would still look decent. Instead we get this. I don't think I'm unreasonable in wanting my $50 back or a patch that addresses this, am I?
 
Wow, they were getting an avg of 51 fps with 16xQAA with a single GTX 275? I have SLI GTX 275's and putting just 4xAA gives me terrible frames and stuttering. Weird.
 
I don't think we'll see the back of DX9c for a long time yet. If it's a console port then it's a DX9c game, even if it says DX10 or 11 on the box it's still a DX9c game with some barely noticeable graphics tweaks that make nearly no difference unless you are looking at static screen shots.

Until the next gen consoles arrive DX9c will remain king.
 
Wow, they were getting an avg of 51 fps with 16xQAA with a single GTX 275? I have SLI GTX 275's and putting just 4xAA gives me terrible frames and stuttering. Weird.

Did you try disabling SLI and see if that makes a difference? L4D2 ran like crap on some crossfire setups until the recent driver updates. Just a thought.
 
Lack of updated engine was partially what kept me from jumping on this as a pre-order. I like a good story but I don't buy updated hardware just because I like giving away money to the manufacturers. It's too bad they didn't go with UE3 but I guess they were trying to keep it console-friendly. :( I might buy this when it's in the Steam bargain bin.
 
Oh, 2K isn't just dragging their feet on widescreen, they're side-stepping it. 2K Elizabeth said in no uncertain terms that they will NOT be fixing the FOV issue. They will only be fixing the issues related to aspect ratio; everyone gets to be stuck at the same crappy 75 FOV unless they use workarounds.
 
I just replaced my 4:3 CRT with a 16:10 22" LCD last night. I gotta say the FOV doesn't bother me. I can see what I need to, I suppose.
 
Personally, I loved the game after I changed the FOV to about 100 and mapped it to my 'w' key. As was said in the review, the production values are top notch: great combat, characters and story. It's just a shame that 2K let the FOV issue rear its head again and that they haven't addressed it yet in a patch. I think it's the only blemish on an otherwise fantastic title.
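For anyone wanting to try the same tweak, it looks roughly like this. This is from memory of the community workaround, so the exact file location, section name, and binding syntax may differ on your install; treat it as illustrative, not gospel.

```ini
; User.ini (Unreal-style config; path varies by install)
[Default]
; Piggyback the FOV console command onto a key you press constantly,
; since the game resets the FOV on zoom and on level transitions:
W=MoveForward | FOV 100
```

Binding it to a movement key (rather than a spare function key) is what keeps the FOV from silently snapping back mid-game.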
 
I'm probably way off base here, but...

Isn't the purpose of having a platform specific Graphics Interface (in the case of Windows, DirectX) so that developers do not have to worry about the nitty gritty of things like AA, AS, etc...? As long as they program using the DirectX standard, and the video card and driver supports this DirectX standard, should it all not come out hunky dory?
 
The two words that best describe Bioshock 2 in my mind are "phoned in".

Now, bear with me for a moment of caveats: I love Bioshock. And so far I have /greatly/ enjoyed Bioshock 2. It has so far delivered at least one memorable jaw-dropping moment--for the benefit of those seeking to avoid spoilers, the part where you first go underwater--and that in and of itself is worthy of praise.

But to be honest: it is painfully obvious that this was a title that was phoned in, using existing assets as a crutch as much as possible. The lack of proper widescreen support is, at this point, an indefensible flaw for which I'll happily download a game rather than support it with my money. And it's not the only flaw from the first game that made its way into the second virtually unchanged--remember that annoying message in the Controls screen that prompts you that you'll lose your changes if you go to set your keyboard controls? Yeah, it's back, and it's identical. A minor annoyance, and it's not the only one, but it's so trivial to fix that for me it sort of embodies the lack of give-a-damn that went into this game.

What really tore it for me though was GFWL. I don't like it, I don't want it, and I resent being forced to use it in order to install the game. Even that I could live with, were it not for the way they inextricably bound the save games to Live. You cannot even save your game without logging into GFWL. Whoever was responsible for that decision needs to never work in this industry again. And QuickSave seems to be very buggy as a result, in that it only records the first QuickSave made after each hard save.

The binding of the savegame functionality to Live was the last straw for me. I think BS2 is a pretty good game, and I'm not sorry I played it--but I am extremely sorry I paid money for it. Going forward I will not be paying for any games bound to GFWL--period.
 
Woohoo! Low sys reqs mean I can play it on my lappy. It's all about the gameplay for me, kids. Great graphics are nice for about the first 5 minutes.
 
Woohoo! Low sys reqs mean I can play it on my lappy. It's all about the gameplay for me, kids. Great graphics are nice for about the first 5 minutes.

Pong and Pitfall should give you hours of enjoyment then. Check out a cool game called Asteroids too.

:p
 
Oh, 2K isn't just dragging their feet on widescreen, they're side-stepping it. 2K Elizabeth said in no uncertain terms that they will NOT be fixing the FOV issue. They will only be fixing the issues related to aspect ratio; everyone gets to be stuck at the same crappy 75 FOV unless they use workarounds.

That's a bunch of horsecrap on their part. They claim it will give an unfair advantage in multiplayer (which blows anyway). This seems to be the catch-phrase of the day when it comes to altering the FOV, and it's just plain old-fashioned bullshit. The same could be said for anyone wielding a hi-res mouse: technically they can be more precise. Yeah, whatever. Developers still pressing on with 4:3 aspect ratios is just plain stupid. Widescreen has been the norm for MANY years now, and with widescreen comes different sizes. Translation: we need an FOV adjustment, or at least one that auto-adjusts depending on the resolution you choose. Developers need to take their heads out of their asses once in a while and pay attention to what we need to make a game good instead of what they want to shovel us. I can understand if the game is older, but this is a fairly new title. No excuses. 2K can go take a flying leap if that's the attitude they are going to take with their titles. I simply won't buy them.
 
Rofl, how will the GTX285/275 "give me better gaming experience for the time being" when enabling 2x MSAA drops the average FPS to around 40 and min. FPS to around 20?

I don't know about the reviewers, but I'd take smooth 60 min. FPS gameplay with no AA from the 5870 over 20 min. FPS gameplay with 2XAA from the GTX285 any day.
 
I'm probably way off base here, but...

Isn't the purpose of having a platform specific Graphics Interface (in the case of Windows, DirectX) so that developers do not have to worry about the nitty gritty of things like AA, AS, etc...? As long as they program using the DirectX standard, and the video card and driver supports this DirectX standard, should it all not come out hunky dory?

It's not that simple. Some techniques such as deferred shading are not compatible with the hardware accelerated MSAA specified by DirectX.
 
TBH, I play at 1920x1200 and almost never visually notice whether AA is turned on or off unless I take a screen shot and look closely. Even at 1680x1050 the difference was minimal. AA hasn't been a selling point for me for years, particularly considering the performance hit it brings in high-end games. Given a choice between higher AA and higher resolution, I'll take the resolution any day of the week on any game.

There will come a point in monitor resolutions where the pixels will be so small that the human eye simply won't be able to detect the jaggedness of the edges that AA is supposed to correct. We're not there yet and we're not likely to be soon, but I don't think it's all that far off either.
 
It's not that simple. Some techniques such as deferred shading are not compatible with the hardware accelerated MSAA specified by DirectX.

Correction: it is trickier to do in DX9.0c, but still possible within the DX9.0c spec. That said, it is completely supported and straightforward with DX10+, which is why I can't wait for the UE2.5/3 engine to be abandoned; it is just too old and outdated.
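To spell out why the naive route fails: lighting is a non-linear function of the G-buffer contents, so resolving (averaging) the MSAA samples before lighting gives a different, wrong answer than lighting each sample and averaging the shaded results. A toy Python sketch of an edge pixel, not engine code:

```python
# Toy model of a deferred renderer's edge pixel with 2x MSAA.
# Two G-buffer samples straddle a silhouette edge: one surface faces
# the light, the other faces directly away.

def lambert(normal, light_dir=(0.0, 0.0, 1.0)):
    # Clamped Lambert term: a non-linear function of the normal.
    return max(0.0, sum(a * b for a, b in zip(normal, light_dir)))

samples = [(0.0, 0.0, 1.0), (0.0, 0.0, -1.0)]

# Wrong order: resolve (average) the G-buffer first, then light it.
avg_normal = tuple(sum(c) / len(samples) for c in zip(*samples))
wrong = lambert(avg_normal)        # averaged normal is (0,0,0): black edge

# Right order: light each sample, then average the shaded colors.
right = sum(lambert(n) for n in samples) / len(samples)

print(wrong, right)                # 0.0 vs 0.5 (a properly half-lit edge)
```

This is the capability DX10-class hardware exposes (e.g. per-sample reads of multisampled textures via HLSL's `Texture2DMS.Load`), letting the engine shade first and resolve after, whereas DX9 offers no standard way to get at individual samples.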
 
That's a bunch of horsecrap on their part. They claim it will give an unfair advantage in multiplayer (which blows anyway)...

Jesus, that's weak. In UT2K4 you could adjust online FOV from 80 up to 130.

I played at 97, as it was the perfect mix for me.
 
TBH, I play at 1920x1200 and almost never visually notice whether AA is turned on or off unless I take a screen shot and look closely. Even at 1680x1050 the difference was minimal. AA hasn't been a selling point for me for years, particularly considering the performance hit it brings in high-end games. Given a choice between higher AA and higher resolution, I'll take the resolution any day of the week on any game.

There will come a point in monitor resolutions where the pixels will be so small that the human eye simply won't be able to detect the jaggedness of the edges that AA is supposed to correct. We're not there yet and we're not likely to be soon, but I don't think it's all that far off either.

No we aren't. DPI has only increased very minimally over the years. The pixel size difference between 2560x1600 30" and 1280x1024 19" is pretty small. I'm gaming on a 22" 1680x1050 and jaggies are *very* noticeable. I've gamed on 1920x1200 15.4" and jaggies were still noticeable.

I'd guess that for most people a 24" display would need roughly 3-4x the resolution to make AA have a minimal impact. For that to happen we would need double the bandwidth of DVI-D (give or take for 60hz) and Windows would need to finally respect the monitor's DPI automatically and for *everything*, not just text. I highly, highly doubt we will see such a display within 10 years.
 
No we aren't. DPI has only increased very minimally over the years. The pixel size difference between 2560x1600 30" and 1280x1024 19" is pretty small. I'm gaming on a 22" 1680x1050 and jaggies are *very* noticeable. I've gamed on 1920x1200 15.4" and jaggies were still noticeable.

I'd guess that for most people a 24" display would need roughly 3-4x the resolution to make AA have a minimal impact. For that to happen we would need double the bandwidth of DVI-D (give or take for 60hz) and Windows would need to finally respect the monitor's DPI automatically and for *everything*, not just text. I highly, highly doubt we will see such a display within 10 years.

The degree to which you notice jaggies depends a lot on the game you are playing. I'm playing at 2560x1600 and still notice them, more in some games and less in others. Not a game breaker, but it's a little annoying, true.
 
Would have been nice to see CPU utilization differences with different CPUs - preferably with an AMD CPU thrown in there.

I'm wondering, a little bit, why I'd buy this game instead of just replaying the original.....
 
Don't forget viewing distance! With bigger and bigger monitors, people tend to sit farther away, thus making the pixels look smaller.

Also, some people just don't have that good of eyesight, and even at 2 ft might not be able to make out each pixel clearly. I personally sit at around 2.5 feet...
 
I just replaced my 4:3 CRT with a 16:10 22" LCD last night. I gotta say the FOV doesn't bother me. I can see what I need to, I suppose.

I played BioShock fullscreen on a 1680x1050 screen. I couldn't play the game for more than 5 minutes without feeling motion sickness. It was actually quite painful to play the game until the v1.1 FOV switch was implemented... and even then, I still had waves of nausea from the touchy nature of the mouse.

I had to scour a lot of how-tos for editing the ini files in order to scale back the mouse responsiveness. Even then... yeah. I'd play trying to keep my arm as stiff as possible so I could limit the range of motion with how I handled the mouse. Mouse sensitivity within the game did nothing for this issue, because the problem stems from the in-game mouse acceleration handling. I expect a consistent line, a 1:1 correlation between movement on the desk and movement on the screen. The problem with BioShock was that the more/faster you moved the mouse, the more extreme the movement was on the screen.

Bioshock is a horribly written game when it comes to view and controls. Shame, because it's great in all other aspects.
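The 1:1 expectation described above can be sketched in a few lines. The acceleration curve below is hypothetical (not BioShock's actual input code); the point is that with a linear response, total view travel depends only on total hand travel, while with acceleration the same hand travel lands in different places depending on speed:

```python
SENSITIVITY = 2.0
ACCEL = 0.5   # hypothetical acceleration coefficient

def linear(delta):
    # 1:1 mapping: output proportional to input, regardless of speed.
    return SENSITIVITY * delta

def accelerated(delta):
    # Gain grows with per-frame speed: fast flicks travel farther.
    return SENSITIVITY * delta + ACCEL * delta * abs(delta)

# Same 10 counts of hand travel: slowly (1 count/frame) vs. one fast flick.
print(sum(linear(1) for _ in range(10)), linear(10))            # 20.0 20.0
print(sum(accelerated(1) for _ in range(10)), accelerated(10))  # 25.0 70.0
```

That 25 vs. 70 gap is exactly the "stiff arm" problem: you can't build muscle memory when the mapping changes with how fast you move.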
 
For that to happen we would need double the bandwidth of DVI-D

Doubtful. While I'm not disputing your point about DPI (though the reply that noted the effect of greater optimum view distance for larger monitors is spot-on), HDMI 1.4 has more than enough bandwidth to support 4096x2160, and that would easily do it on anything smaller than a 30" monitor. Of course, for people with sharper eyesight or who sit closer to their monitors, the requirements will be greater.

DVI will go away. Not entirely--after all, there are still plenty of monitors and cards that support the nearly-obsolete D-sub interface--but effectively. HDMI is simply a superior interface for carrying a multimedia signal in nearly every respect, and it's becoming more and more commonplace. As resolutions increase further and the need for more video bandwidth grows, rather than improving DVI the industry will just shift to HDMI, whatever its current version/incarnation. It's already happening.
 
The FOV thing is just ridiculous. Just playing the game normally, you can't help but notice that you just can't see enough. They spent all of this time creating atmosphere, but because of the FOV, the game feels like it's always in a tunnel. When you swap it to 90 or more, it's almost like playing a different game. Rooms feel bigger, and it feels like you're in an open world instead of a tunnel.

I honestly don't mind the look of the game, though. It's inconsistent (some areas do look really dated), but overall I think the game still looks good. I fall into the group that would rather have huge framerates on a pretty good looking game than 30-40 fps with "next-gen" graphics.

Did they phone in BioShock 2? To some degree... yes. There really isn't any innovation involved. It feels almost like the Half-Life Opposing Force/Blue Shift games, where you're just playing through the story from another side. Yes, I like the plot and characters, but it never feels like a sequel as much as a big expansion pack.
 
Jesus, that's weak. In UT2K4 you could adjust online FOV from 80 up to 130.

I played at 97, as it was the perfect mix for me.

That was my point. I had to adjust mine to almost 130 to not feel like I had my face pushed up against the screen, but I'm playing at 5760x1200.
The morons that did COD:MW2 said the exact same thing about why they weren't going to make any adjustments to the FOV.
 
That was my point. I had to adjust mine to almost 130 to not feel like I had my face pushed up against the screen, but I'm playing at 5760x1200.
The morons that did COD:MW2 said the exact same thing about why they weren't going to make any adjustments to the FOV.

Whoever thought it was a good idea to hardcode the FOV to a 65-degree angle should be shot. That's retch city right there.
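For what it's worth, the ~130 figure is about what standard Hor+ math predicts. A sketch assuming the usual formula (hold the vertical FOV implied by 75 degrees at 4:3, then widen horizontally with aspect ratio); the game's own projection code may differ:

```python
import math

def hfov_for_aspect(hfov_4x3_deg, aspect):
    """Horizontal FOV preserving the vertical FOV of a 4:3 baseline."""
    half = math.radians(hfov_4x3_deg) / 2
    vfov_half = math.atan(math.tan(half) * 3 / 4)   # implied vertical half-FOV
    return math.degrees(2 * math.atan(math.tan(vfov_half) * aspect))

print(round(hfov_for_aspect(75, 16 / 10)))   # 16:10 single monitor
print(round(hfov_for_aspect(75, 4.8)))       # 5760x1200 triple-wide
```

Those come out to roughly 85 degrees for a 16:10 monitor and about 140 for 5760x1200, right in the neighborhood of the value the triple-wide poster settled on.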
 
Hard to believe that BioShock 1 came out nearly 3 years ago... our 8800s were running it, but not all that well. Now? Trivial.
 
The two words that best describe Bioshock 2 in my mind are "phoned in".

Now, bear with me for a moment of caveats: I love Bioshock. And so far I have /greatly/ enjoyed Bioshock 2. It has so far delivered at least one memorable jaw-dropping moment--for the benefit of those seeking to avoid spoilers, the part where you first go underwater--and that in and of itself is worthy of praise.

But to be honest: it is painfully obvious that this was a title that was phoned in, using existing assets as a crutch as much as possible. The lack of proper widescreen support is, at this point, an indefensible flaw for which I'll happily download a game rather than support it with my money. And it's not the only flaw from the first game that made its way into the second virtually unchanged--remember that annoying message in the Controls screen that prompts you that you'll lose your changes if you go to set your keyboard controls? Yeah, it's back, and it's identical. A minor annoyance, and it's not the only one, but it's so trivial to fix that for me it sort of embodies the lack of give-a-damn that went into this game.

What really tore it for me though was GFWL. I don't like it, I don't want it, and I resent being forced to use it in order to install the game. Even that I could live with, were it not for the way they inextricably bound the save games to Live. You cannot even save your game without logging into GFWL. Whoever was responsible for that decision needs to never work in this industry again. And QuickSave seems to be very buggy as a result, in that it only records the first QuickSave made after each hard save.

The binding of the savegame functionality to Live was the last straw for me. I think BS2 is a pretty good game, and I'm not sorry I played it--but I am extremely sorry I paid money for it. Going forward I will not be paying for any games bound to GFWL--period.

Hey man, I agree with you completely. But I just wanted to clarify: there is a way to save offline. With a Windows Live offline (local) account, you'll be able to save/load just fine, and it doesn't require an internet connection to create or log into one.

1. Launch the game. Open the Windows Live menu.
2. If you are already logged onto a profile (local or online), sign out (Create profile button will only become visible when you aren’t logged into anything)
3. Click on ‘Create new profile’ button.
4. In the dialog box that pops up DO NOT hit continue. Hitting continue will take you to a registration page for an ONLINE profile – that’s not what we want.
5. Once in the ‘Create Gamer Profile’ window, scroll down so that you can see the bottom paragraph.
6. Look for the hyperlinked ‘created a local profile’ link in last paragraph. Clicking this will allow you to create a new offline profile.
 
I can't get this game to run without crashes if I try to force FSAA in DX10. I can play the game crash free with FSAA in DX9.

As far as Windows Live and saves. I run the game in Windows Live offline mode and can save fine. I didn't even have to create another account. When I installed the game and started it, it logged me into Windows Live offline mode so I just left it that way. I'm going to play every Windows Live game from now on in offline mode. I just have no use whatsoever for any Windows Live features.
 
The Windows Live thing doesn't bug me (I have a 360 account that I use) but the complete lack of innovation does.
The game offers almost nothing new at all. The plasmids are 95% the same, the weapons are ported over, the enemies are the same, etc. Is the Big Sister such a big deal that they made her the only really new thing?
I can live with the game not getting a graphical overhaul, but they added nothing new gameplay-wise either. Sure, you can use plasmids and weapons at the same time, but with the push of a single button you could do that in the first game too. It even offered a sub-menu that made for bigger/better combos, since you could swap ammo and plasmids in a pause menu as well.

I like the game, but it's probably the least innovative sequel of all time.
 
Don't forget viewing distance! With bigger and bigger monitors, people tend to sit farther away, thus making the pixels look smaller.

Also, some people just don't have that good of eyesight, and even at 2 ft might not be able to make out each pixel clearly. I personally sit at around 2.5 feet...

Yes, but they still aren't sitting clear across the room. Even with a large monitor, you still sit relatively close compared to, say, a TV. And a larger, higher-res screen that you sit further from really isn't any different from a higher-res screen that you sit the same distance from. The effective size you see the pixels at is going to be similar between the two scenarios, and unless you sit like 10 feet back from a 30in monitor you will likely still see jaggies no problem.

Of course it is going to vary based upon the game, but something like a chase cam in a racing game makes jaggies on the car stand out like crazy.
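Putting numbers on the distance argument: here is the apparent size of one pixel in arcminutes at a given viewing distance (a rough sketch; 20/20 vision resolves detail down to about 1 arcminute, and high-contrast stair-steps can be noticeable even somewhat below that):

```python
import math

def pixel_arcmin(diagonal_in, width_px, height_px, distance_in):
    """Apparent size of one pixel, in arcminutes, at a viewing distance."""
    pitch = diagonal_in / math.hypot(width_px, height_px)  # inches per pixel
    return math.degrees(2 * math.atan(pitch / 2 / distance_in)) * 60

print(pixel_arcmin(22, 1680, 1050, 30))   # 22" 1680x1050 at ~2.5 ft
print(pixel_arcmin(30, 2560, 1600, 36))   # 30" 2560x1600 at ~3 ft
```

Both land right around the acuity limit (roughly 1.3 and 0.95 arcminutes), which fits the reports in this thread of jaggies being visible on both, just a bit less glaring on the 30-inch panel.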

Doubtful. While I'm not disputing your point about DPI (though the reply that noted the effect of greater optimum view distance for larger monitors is spot-on), HDMI 1.4 has more than enough bandwidth to support 4096x2160, and that would easily do it on anything smaller than a 30" monitor. Of course, for people with sharper eyesight or who sit closer to their monitors, the requirements will be greater.

DVI will go away. Not entirely--after all, there are still plenty of monitors and cards that support the nearly-obsolete D-sub interface--but effectively. HDMI is simply a superior interface for carrying a multimedia signal in nearly every respect, and it's becoming more and more commonplace. As resolutions increase further and the need for more video bandwidth grows, rather than improving DVI the industry will just shift to HDMI, whatever its current version/incarnation. It's already happening.

HDMI 1.4 supports 4096x2160 at 24hz, not 60hz. HDMI 1.4 also didn't increase the amount of bandwidth over 1.3.
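The back-of-the-envelope numbers bear this out. Pixel data only here; real links also carry blanking intervals, so true requirements are higher, and the link capacities are my figures, not from the thread:

```python
def gbps(width, height, hz, bpp=24):
    """Raw pixel data rate in Gbit/s (ignores blanking and overhead)."""
    return width * height * hz * bpp / 1e9

DUAL_LINK_DVI = 7.92   # 2 links x 165 MHz x 24 bit
HDMI_1_3_1_4 = 8.16    # 10.2 Gbit/s TMDS minus 8b/10b encoding overhead

print(gbps(4096, 2160, 24))   # ~5.1: fits within HDMI 1.3/1.4
print(gbps(4096, 2160, 60))   # ~12.7: exceeds both links
```

So 4096x2160 at 24 Hz squeaks through, but at 60 Hz even the raw pixel data alone exceeds what either dual-link DVI or HDMI 1.3/1.4 can carry.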
 
Hard to believe that BioShock 1 came out nearly 3 years ago... our 8800s were running it, but not all that well. Now? Trivial.

My 8800GTX runs Bioshock 1 and 2 wonderfully, in both DX9 mode with AA and DX10 mode at 1920x1200 with maxed settings, 16x AF.

Even a GTX 285 today still has a hard time running it in DX10 mode with forced 4x AA. ATI cards cannot run AA at all in DX10 mode, which sucks for my 4870 1GB.
 
The FOV thing is just ridiculous. Just playing the game normally, you can't help but notice that you just can't see enough. They spent all of this time creating atmosphere, but because of the FOV, the game feels like it's always in a tunnel. When you swap it to 90 or more, it's almost like playing a different game. Rooms feel bigger, and it feels like you're in an open world instead of a tunnel.

Do you really think having a gigantic waterproof helmet on would give you any view other than tunnel vision?

Not saying it's a great choice, but you have to remember that you're a Big Daddy looking out of a circular hole.
 