Oblivion running @ 1920x1200, thoughts?

Xeero

As the thread title states, I'm wondering if anyone runs Oblivion at 1920x1200. If so, what frame rates are you getting? Is it bringing your beast of a system to its knees?

Me: SLI 7800 GT
Resolution: 1920x1080 (not a true computer LCD)
Settings: everything turned up, no HDR, bloom on, full AF in the Nvidia driver settings, no AA
FPS: dear god... let's just say, it hurts to see my SLI 7800 GTs go from V12 status to inline-4.

What is everyone else getting at those resolutions?

Note: even though I see most people running traditional aspect ratios, there are many users who bought the Dell 24in, so I'm sure some people can comment on running at that resolution.
 
You must be kidding...I don't even think SLI 7900GTX/CF X1900XTX would give you acceptable framerates. :eek:
 
Bona Fide said:
You must be kidding...I don't even think SLI 7900GTX/CF X1900XTX would give you acceptable framerates. :eek:

:( If the rumors of Nvidia and ATI not releasing faster cards until Windows Vista are true, does that mean I'm stuck w/ crappy frame rates, even if I do upgrade to the fastest cards possible?
 
Opteron [email protected], 2GB of RAM, and an X1900 XT. Everything maxed, the game is somewhat choppy. Dips into the 20s in the wilderness with lots of grass.

For what it's worth, there's almost no noticeable FPS difference between 1280x1024 and 1920x1200.
 
Oooska said:
Opteron [email protected], 2GB of RAM, and an X1900 XT. Everything maxed, the game is somewhat choppy. Dips into the 20s in the wilderness with lots of grass.

For what it's worth, there's almost no noticeable FPS difference between 1280x1024 and 1920x1200.

1280x1024 compared to 1920x1200? Hm, I don't know, it seems like there would be a huge FPS difference there.
 
Bona Fide said:
You must be kidding...I don't even think SLI 7900GTX/CF X1900XTX would give you acceptable framerates. :eek:

I'm sure quad SLI will give acceptable frame rates; too bad Nvidia has stated they won't allow DIYers to build it.

I've thought about spending money on EVGA's Step-Up program just to play Oblivion, dunno if it's worth it.
 
I get 35-ish fps outside at 1920x1200 with HDR, no AF, no AA. I don't bother looking at fps inside, but it's smooth.
 
Digital Viper-X- said:
I get 35-ish fps outside at 1920x1200 with HDR, no AF, no AA. I don't bother looking at fps inside, but it's smooth.

Would SLI 7800 GTs be comparable to a single X1900?
 
I play at 1920x1200 with 4x AA, HDR, and 8x HQ AF. Even edited the Oblivion.ini for ugridstoload=9, water reflections on, and have some higher-res texture mods running. My game system is an FX-60, 2GB of memory, and two X1800 XTs. Played a stealth character who used his marksman skill quite a bit, and frame rate never affected my aim. Outdoor areas with heavy foliage, though, probably brought the frame rate down to the mid or low 20s.
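For anyone who wants to make the same .ini edits, the relevant lines in Oblivion.ini (the copy in My Documents\My Games\Oblivion, if I remember right) look roughly like this. I'm going from memory on the exact names and sections, so double-check against your own file, and keep uGridsToLoad an odd number:

[General]
uGridsToLoad=9 ; stock value is 5; higher values keep more exterior cells loaded at once

[Water]
bUseWaterReflections=1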
 
John Reynolds said:
I play at 1920x1200 with 4x AA, HDR, and 8x HQ AF. Even edited the Oblivion.ini for ugridstoload=9, water reflections on, and have some higher-res texture mods running. My game system is an FX-60, 2GB of memory, and two X1800 XTs. Played a stealth character who used his marksman skill quite a bit, and frame rate never affected my aim. Outdoor areas with heavy foliage, though, probably brought the frame rate down to the mid or low 20s.

Damn... seems like this game sets the benchmark for video card performance.

Note: also a good question to ask yourself: is the game that demanding, or is it just that horribly coded? Remember, the game was developed for the 360 and PC at the same time.
If Carmack were the lead programmer, could the game possibly run better?
 
Xeero said:
Damn... seems like this game sets the benchmark for video card performance.

CPU performance too!
It's a demanding game. I'd say SLI 7800 GTs are about the same as a single X1900 XTX, a bit better sometimes, a bit worse others? I dunno for sure.
 
Bona Fide said:
You must be kidding...I don't even think SLI 7900GTX/CF X1900XTX would give you acceptable framerates. :eek:

I can assure you, 1920x1200 is doable with CF'd X1900s. I get very good frames (30s), with everything turned up, with AA/AF on as well. You do not need 60+ fps like in a shooter; it's not that type of game.
 
Digital Viper-X- said:
CPU performance too!
It's a demanding game. I'd say SLI 7800 GTs are about the same as a single X1900 XTX, a bit better sometimes, a bit worse others? I dunno for sure.

Well, CPU performance is rarely a defining factor in games. And even though the Bethesda people said it would be optimized for dual-core, I haven't seen anything regarding that. Hopefully a patch will allow for some multithreading.
 
Xeero said:
note: also a good question to ask yourself: is the game that demanding? or is it just that horribly coded? remember the game was developed for the 360 and pc at the same time.
if carmack was the lead programmer, could the game possibly run better?

Well, the Carmack question isn't very relevant since he's never coded an engine with the intention of it rendering large outdoor areas. Sure, some of his engines have been extensively modded for such usage, but that's another issue.

Anyways, no doubt the engine could run better, since Bethesda have never been technical wizards. Look at how broken Morrowind's renderer was in pretty fundamental ways. But in all fairness, how well would Far Cry, a pretty well-coded game, run at 1920x1200 with the bandwidth hit of 4x AA and FP16 HDR, with more advanced physics and AI code consuming clock cycles?
 
John Reynolds said:
Anyways, no doubt the engine could run better, since Bethesda have never been technical wizards. Look at how broken Morrowind's renderer was in pretty fundamental ways. But in all fairness, how well would Far Cry, a pretty well-coded game, run at 1920x1200 with the bandwidth hit of 4x AA and FP16 HDR, with more advanced physics and AI code consuming clock cycles?

Well, I've run Far Cry at 1920x1200 with all the in-game settings turned up, w/ no AA/AF, no HDR. It was impressively smooth. This was on a 6800 GT, btw.

Well... at least I remember it being smooth. It's been a while :confused:
 
Digital Viper-X- said:
CPU performance too!

Well, when you run the game at that kind of resolution, I believe it becomes entirely GPU dependent.

On a side note, I wish they would've built Oblivion on the Source engine. I'm sure it would look almost as good, but run a hell of a lot better.
 
Bona Fide said:
Actually, I think only a dual 7900/X1900 setup could handle it :(

If someone wants to give me an LCD that runs such resolutions I will give it a try and let you know. :D
 
I find 1920x1200 w/ HDR on, 8x AF to be playable (30+ fps) on a single 7900 GTX, so long as grass is turned off. Turning grass on literally halves the performance. I don't understand why they couldn't implement a simpler version of grass, closer to Far Cry's level. The swaying patches look a bit hokey anyway.
 
Aix said:
I find 1920x1200 w/ HDR on, 8x AF to be playable (30+ fps) on a single 7900 GTX, so long as grass is turned off. Turning grass on literally halves the performance. I don't understand why they couldn't implement a simpler version of grass, closer to Far Cry's level. The swaying patches look a bit hokey anyway.

Yeah the grass shaders really pound your system. I was stuck in the teens until I turned that off, and now I'm in the low 30s. Framerates, I mean :p
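If anyone wants to thin the grass out instead of killing it completely, I believe there is a [Grass] section in Oblivion.ini you can poke at. The name and value below are from memory, so treat them as a starting point and verify against your own file:

[Grass]
iMinGrassSize=120 ; stock value is lower (80-ish); bigger = sparser grass, less shader work

The in-game grass distance slider is the other easy lever.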
 
On my box the game is very playable at 1600x1200 with HDR on, and though I have never tried 1920x1080, it should be playable too. I just don't have an LCD that can do that.
 
I run at 1920x1200.

Indoors FPS is around 80-120; outside it will dip as low as 18 (Oblivion portals around lots of grass, enemies).

It is still very much playable though.

There is almost zero difference in FPS at 1680x1050, so that indicates the parts that are slow will always be slow no matter what GPU you use (as in, the game is CPU limited in many parts).
 
I am running at 1920x1200 with the system in my sig, most everything set to max. I get 15-25 fps outside, 40 to 70 inside... in combat it drops too low for my taste. I just ordered an XFX 7900 GT Extreme Edition to replace my 7800 GT; going to volt mod and OC that one as well... should help out a bit.
 
Rig in sig; basics = FX-60 and CF X1900 XTX.

No Chuck patch, so no AFR, will either get that patch working tonight to test with AFR and AA, or will just move to 6.4 to test with AFR and no AA.

Settings right now:

1920x1200, HDR, 16x HQ AF. Most sliders at default from the initial quality detection. Grass maxed, though.

20 to 40 fps outdoors (depending on amount of grass on screen).
35 to 60 in towns.
50 to 60 indoors (50 only when in very large rooms. 60 is limit because of Vertical Sync, and it spends 90% of the time pegged at 60, so I am sure it goes much higher).
 
Xeero said:
As the thread title states, I'm wondering if anyone runs Oblivion at 1920x1200. If so, what frame rates are you getting? Is it bringing your beast of a system to its knees?

Me: SLI 7800 GT
Resolution: 1920x1080 (not a true computer LCD)
Settings: everything turned up, no HDR, bloom on, full AF in the Nvidia driver settings, no AA
FPS: dear god... let's just say, it hurts to see my SLI 7800 GTs go from V12 status to inline-4.

What is everyone else getting at those resolutions?

Note: even though I see most people running traditional aspect ratios, there are many users who bought the Dell 24in, so I'm sure some people can comment on running at that resolution.

X1900 XT @ 711/792 here. I run at 1800x1350, can do HDR + 8x AF fine; FPS drops in outdoor battles though. 1600x1200 is mostly smooth.
 
Reinstalled Windows today, installed the Cat 6.4s, defragged, etc...
I run at 1920x1200, and outside I get lows of 20 fps and highs of 50 fps, mostly in the 40 region though. The 6.3s seemed to be terrible on my rig in a lot of games.
 
I think Crossfire would handle it with the 2x AA I'd like on top of 1920x1200. :D My single XT is a bit sluggish for me at 1680x1050. A CrossFire board doesn't cost much, but I am supposed to be waiting for next gen. :| Anyways, the best cards for this game are the X1900s. AA + HDR and 16x HQ AF is the way it's meant to be played.
 
X1900 XTX
4400+ X2 @ 2.5GHz
2GB RAM
Playing @ 1920x1080 with everything on... I get ~30 FPS outside and 40-60 inside (vsync is enabled).
 
I play comfortably at 1920x1200... a bit choppy at times, but honestly it is at lower res as well. I have settings at medium all around, textures at medium, but I do have all long-distance views on and 2x AA, no HDR/bloom.

RIG: FX-57, 7800 GTX 512, 2 gigs of some sort of cheap RAM, on a Dell 2405.
 
Bona Fide said:
You must be kidding...I don't even think SLI 7900GTX/CF X1900XTX would give you acceptable framerates. :eek:


I don't see why it wouldn't run fine... It can play 1776x1000 on my HDTV with my poor X800XT PE.
 
Devnull said:
I don't see why it wouldn't run fine... It can play 1776x1000 on my HDTV with my poor X800XT PE.

I believe he means w/ all the graphics turned up.
 
Well, I guess you could try running it at 960x600; since that is exactly half of 1920x1200 in each dimension, a native 1920x1200 monitor should scale it almost perfectly (every game pixel maps to a clean 2x2 block).
Worth a shot I guess.
 
arentol said:
Rig in sig; basics = FX-60 and CF X1900 XTX.

No Chuck patch, so no AFR, will either get that patch working tonight to test with AFR and AA, or will just move to 6.4 to test with AFR and no AA.

Settings right now:

1920x1200, HDR, 16x HQ AF. Most sliders at default from the initial quality detection. Grass maxed, though.

20 to 40 fps outdoors (depending on amount of grass on screen).
35 to 60 in towns.
50 to 60 indoors (50 only when in very large rooms. 60 is limit because of Vertical Sync, and it spends 90% of the time pegged at 60, so I am sure it goes much higher).

After many attempts, Chuck's patch doesn't work in any way on my PC, so I installed 6.4.

I now get:

26 to 50 outdoors.
42 to 60 in towns.
54 to 60 indoors.

So the AFR seems to have made a significant difference.
 
Hey, IMO I cannot believe they would release a game so hard to run... this completely threw SLI and CrossFire users' systems in the dirt... I mean really, it's not fair... even the most expensive system chokes on this game. An X1900 XT averages 37 fps @ 1024x768 with 4x AA / 8x AF in FOLIAGE areas... sad. This means that midstream gamers cannot play this game at decent settings! I play this game now at 1152x864, 2x AA, 8x AF, and yeah, the FPS definitely dips. I run no shadows :)
 
burned-ati said:
Hey, IMO I cannot believe they would release a game so hard to run... this completely threw SLI and CrossFire users' systems in the dirt... I mean really, it's not fair... even the most expensive system chokes on this game. An X1900 XT averages 37 fps @ 1024x768 with 4x AA / 8x AF in FOLIAGE areas... sad. This means that midstream gamers cannot play this game at decent settings! I play this game now at 1152x864, 2x AA, 8x AF, and yeah, the FPS definitely dips. I run no shadows :)


Well, one, I don't believe it throws SLI and CrossFire in the dirt, because it runs great at high res on my mid-range system, and two, big new games ALWAYS push existing video cards to their max. I still have an X800 XT PE and I'm able to use shadows.
 
It doesn't make a whole lot of sense; the game must just be buggy with certain configurations. I run 1920x1200, 2x AA, bloom, 16x AF, LOD, all sliders to the right, with my single 7900 GTX, 4400+ X2, Asus A8N (non-SLI board), and 2GB of OCZ. At first I had a lot of choppiness and frame rates in the low teens. Even when I put the res at lower levels, the same thing was happening... fps in the teens. However, after I did the iThreads .ini entry changes and the frames-ahead render change, I run fine: 25-50 outdoors while fighting and spamming nukes.
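In case anyone wants to try the same thing: the threading entries live in Oblivion.ini (under [General], I think), and the values below are just what the dual-core tweak guides floating around suggest. I'm going from memory on the exact names, so verify before pasting:

[General]
iThreads=9 ; the "iThreads" tweak mentioned above
bUseThreadedAI=1 ; run the AI on its own thread
iNumHavokThreads=5 ; more threads for Havok physics

The "frames ahead" part is the max frames to render ahead (pre-render limit) setting in the Nvidia driver, which I lowered from its default.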
 
I haven't played around with the settings very much, but so far what I've found is that HDR wrecks my frame rates, down to 10-15. Just turning off HDR and my fps goes up to 35-45. I'm running at 1920x1200, btw, with all other settings maxed. No AA or aniso; I find I don't really need them at 1920x1200 because there really aren't any jagged edges at that resolution. I do like to play around with them just to see the picture quality difference on occasion; generally, though, the difference is pretty slim, if any, at that res. I'm used to playing FPS games as well, in which case it helps to see every pixel as it is, not a blurred-out version of a pixel, because, as with most FPS games, aiming comes down to a pixel's accuracy; you really want to see every pixel for what it is when one pixel is all you have to represent an opponent a klick or two away that you're trying to snipe. Anyway, that's my two cents. I came here looking for the difference between HDR and AA, because I am curious which produces a better picture, if either.

My system...
AMD X2 4800+, 2GB DDR 400MHz, 7800 GTX, and all the other goodies... btw it's a laptop, so it does have an LCD. Actually it's a WUXGA panel capable of 1920x1200 max, and that resolution produces the most friggin' beautiful picture I've ever laid eyes on from a computer. However, I use a ViewSonic G-series that'll give me 85Hz at 1920x1200, which is great for FPS games. The WUXGA is a great LCD, but I'm still not satisfied with its reaction time and refresh performance; I can see a difference between my CRT and the LCD in fast-paced FPS games. When you have half a second to kill someone standing behind you and you have to spin 180 degrees and aim at their head, the difference between 62Hz and 85Hz is pretty big. Not to mention the ghosting effect LCDs produce. However, 99% of the time the WUXGA performs outrageously well. Anyway, back to playing with Oblivion graphics... I'd love to hear the opinion of other people running at 1920x1200 or better in any games when it comes to AA, anisotropic, and this HDR stuff (which is new to me). Personally, I find it to be pointless when I have to position my eyes 4 inches from the screen and squint to see the pixels anyway.
 
Everyone is so obsessed with high resolutions. I still have a 17" monitor and am just planning on running at 1152x864. Hopefully I can max most stuff. I'm going to have an Opteron 165, 4x512MB of RAM, and an X1900 XT.
 