FPS in consoles?

WhiteGuardian

Limp Gawd
Joined
Feb 15, 2008
Messages
421
I never really thought about this, but I always focus on what FPS I get when I play games on the computer. But say I were to buy an Xbox 360 or PS3 and get CoD4 and play it on a TV — what FPS would I get? I can't really imagine them getting the same performance as a PC, because the PC is so much more powerful, but I never hear anyone complain about lag in console gaming except about internet connections.
 
In my opinion, consoles are TERRIBLE for FPS.

Every time someone plays COD4 on a console, I think of the one kid at my school who said that if it were me vs. him, with him on a controller and me on a mouse/keyboard, he could beat me in COD4.

You have a clear advantage with a mouse/keyboard, and I can't stand playing FPS games with a controller, but some people like it.
 
Did you read the original post?

He's talking about frames per second, not first-person shooters.
 
But would that mean the graphics on the computer are superior? Because if you think about it, the graphics card in a PC is much more powerful than the 360's, and a lot of them can't even lock at 60 FPS in CoD4.
 
Yeah, usually anywhere between 30 and 60 FPS. Most console games I've noticed run slower than 60 FPS, though.
 
Computers are superior. Consoles are usually ahead when first released, but by year's end computers are already more advanced in every way possible.
 
Well, for computer gaming to be superior, you need to update your graphics card every year or so to run the new games. Computer gaming is good, but it can be costly, especially if you want to play Crysis with anti-aliasing.
 
Consoles need to meet minimum FPS standards at 720p to be released on the Xbox 360. Halo 3 broke that rule, but apparently MS can make an exception for important games, and many others slip by with stuttering during high-action scenes. Graphics quality is more or less locked on the console in order to meet these performance requirements: since the hardware is standardized, developers pick a level of graphics quality and spend all their budgeted optimization time on that one hardware setup. It either meets the standard or it doesn't; there is no higher-quality/lower-framerate option.

PC versions often let you access better graphics; you get what you pay for in this respect.
 
Consoles need to meet minimum FPS standards at 720p to be released on the Xbox 360. Halo 3 broke that rule, but apparently MS can make an exception for important games, and many others slip by with stuttering during high-action scenes. Graphics quality is more or less locked on the console in order to meet these performance requirements: since the hardware is standardized, developers pick a level of graphics quality and spend all their budgeted optimization time on that one hardware setup. It either meets the standard or it doesn't; there is no higher-quality/lower-framerate option.

PC versions often let you access better graphics; you get what you pay for in this respect.

Yeah, check out the graphics quality of COD 4 on the 360. Horrible. But a fairly steady 60 FPS. COD 3 stuttered horribly, but its resolution looked a bit sharper.

I play on the PC with my E3110 and 8800 GT at 1920x1200 with medium-to-high graphics settings and get a steady 60 FPS EASILY. Actually, I keep the FPS locked at 125; it does on occasion dip to 100-105 FPS, but the graphics quality is insanely good. Then I'll pop my CoD 4 disc into the 360 and try to play, and it takes me a while to adjust to the poor graphics quality. This is on the same monitor, with the same resolution chosen in the console settings, so it's definitely the game and not my monitor (Dell 2407WFP A04).
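
Side note, since I mentioned the cap: here's roughly what an FPS limiter does under the hood. A sketch in plain Python just to illustrate the idea; the real thing lives inside the engine's render loop (com_maxfps in CoD 4).

[code]
import time

TARGET_FPS = 125                  # the cap, e.g. com_maxfps 125
FRAME_BUDGET = 1.0 / TARGET_FPS   # ~8 ms per frame

def run_capped(render_frame, num_frames=1000):
    """Render frames, sleeping off whatever is left of each frame's budget."""
    for _ in range(num_frames):
        start = time.perf_counter()
        render_frame()              # do the actual work for one frame
        elapsed = time.perf_counter() - start
        if elapsed < FRAME_BUDGET:
            # Finished early: wait out the remainder so the cap holds.
            time.sleep(FRAME_BUDGET - elapsed)
        # If the frame took longer than the budget, there's nothing to
        # sleep off and FPS simply dips below the cap -- exactly the
        # 100-105 FPS dips I described above.
[/code]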
 
Most console games run at 30 FPS. Gears of War is an example of this, though it sometimes drops down to around 20 FPS in some of the more intense situations.

Some, like Burnout Paradise and Call of Duty 4: Modern Warfare, run mostly at 60 FPS. In busy situations in Call of Duty 4, such as several characters and smoke grenades on screen, the FPS will drop to maybe 30 or 40.
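
To put those numbers in perspective: frame rate is just the inverse of frame time, so a drop from 60 to 30 FPS doubles how long every frame sits on screen, which is why it's so noticeable. Quick sanity check in Python:

[code]
def frame_time_ms(fps):
    """Milliseconds each frame stays on screen at a given frame rate."""
    return 1000.0 / fps

for fps in (60, 40, 30, 20):
    print(f"{fps:>2} FPS -> {frame_time_ms(fps):4.1f} ms per frame")

# 60 FPS -> 16.7 ms per frame
# 40 FPS -> 25.0 ms per frame
# 30 FPS -> 33.3 ms per frame
# 20 FPS -> 50.0 ms per frame
[/code]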
 
I was playing COD4 on a friend's PS3 the other day with four-way split screen on one HDTV, and during mortar strikes the FPS would drop so much that it would stutter... I was surprised it dipped into the unplayable region of <30 FPS. Hard to say what it runs at normally unless the developer tells you.
 
I don't believe for a second that all console games are running 30 FPS+.

I played the 360 when a friend brought it to the last LAN party, and the Stuntman 2 demo couldn't have been more than 10 FPS in places. Frame rate is a massive issue on consoles. I dislike consoles for a lot of reasons that are, in my mind, forgivable; things like unplayable frame rates are not forgivable.
 
The FlatOut: Ultimate Carnage demo is 20 FPS at best, and I think more like 15. It's hard to play any racing game at anything less than 60 FPS, IMO; steering is unresponsive below that. I never tried the full version of the game simply because the demo played and looked horrible, even though I was excited about a version of FlatOut 2 coming to the 360. The original FlatOut 2, which is what UC is plus the add-ons, was only on the original Xbox (I think) and ran horribly there too :p
 
Well, for computer gaming to be superior, you need to update your graphics card every year or so to run the new games. Computer gaming is good, but it can be costly, especially if you want to play Crysis with anti-aliasing.

Actually, it's not that costly anymore. Plus, you only need to upgrade once every five years to beat the consoles, since consoles only refresh every ~5 years.
 
Wish I could get a frame rate counter in Mass Effect. The slowdowns irritated me, and I was only playing at 720p; it had to be between 10-20 FPS half the time. And the LOD streaming was soo... slow...

Assassin's Creed for PS3 could've been scaled back a little more too. I'd be upset if it were anything more than a rental. Stutters and sub-20 FPS there.
 
I think most of the time it's all in your head, because you're always in the habit of turning on the gauge to see what the current frame rate is. You physically cannot tell the difference between 100 and 45 FPS. So if you're playing at 100 FPS, look at the numbers, and it says 45 FPS, you freak out and think you can see it slowing down, but you can't.
As for gaming on a console, it's pretty amazing: with the 360 you just throw in the disc and you're going to have a great experience (depends on the game), whereas on a computer you have to install the game and optimize your settings before you can play. Simplicity is what consumers want now. The one major downfall I see with the PC is that more likely than not you have to buy the game to get the online features; with a console you can go to a local video store, spend $6, and have five days of online play and single-player action.
You can flame all you want, but I have been into personal computers since the late '80s, and I am a computer science engineering technology major.
 
I think most of the time it's all in your head, because you're always in the habit of turning on the gauge to see what the current frame rate is. You physically cannot tell the difference between 100 and 45 FPS. So if you're playing at 100 FPS, look at the numbers, and it says 45 FPS, you freak out and think you can see it slowing down, but you can't.

Yes, you can tell the difference; this has been proven multiple times.
http://www.tweakguides.com/Graphics_5.html
 
Yeah... I get annoyed when people tell me I can't tell the difference. Maybe the human eye can't detect a difference in the FPS rate, but computer monitors can, and the human eye can detect screen tearing. If vsync were on and the refresh rate were 45 Hz, you would see a smooth and steady 45 FPS and you'd be fairly happy with it. With vsync on at 60 Hz, you would see 60 FPS and it would be smooth as hot butter spread on silk. At 75 Hz with vsync on, it would be 75 FPS, and you wouldn't notice any difference between 60 Hz vsync and 75 Hz vsync (that's what "they" say, anyway).

However, with a monitor's refresh rate at 60 Hz (many LCDs), a fluctuating frame rate gives you different degrees of screen tearing. While frame rates rendered by your video card are indistinguishable above 60 FPS, the efficiency with which your monitor displays them is not. In my experience, 130 FPS or so is where the tearing becomes unnoticeable to me. Around 70 to 90 FPS, it's the worst. Anything below 45-50 is just horrid, because now you have low FPS AND tearing.

So if you want smooth-as-silk play, you should either play with vsync enabled when possible (and deal with any mouse lag you might get in the game; some games handle it better than others) or have a super beefy system capable of 130+ FPS.
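
If anyone wants a feel for why the tearing moves around like that, here's a back-of-the-envelope model (plain Python, idealized, not how any driver actually behaves): assume the panel scans out top to bottom over one 60 Hz refresh, and that with vsync off the card swaps buffers the instant a frame finishes, so the tear lands wherever the scanout happens to be at swap time.

[code]
REFRESH_HZ = 60
SCANOUT_S  = 1.0 / REFRESH_HZ   # time to scan one refresh, top to bottom
LINES      = 1080               # vertical resolution of the panel

def tear_lines(fps, num_frames=6):
    """Approximate scanline where each buffer swap lands (vsync off)."""
    frame_time = 1.0 / fps
    # Phase of each swap within the scanout period -> scanline it hits.
    return [int(LINES * ((i * frame_time) % SCANOUT_S) / SCANOUT_S)
            for i in range(1, num_frames + 1)]

print(tear_lines(90))    # tear ping-pongs around mid-screen: [720, 360, 0, ...]
print(tear_lines(130))   # swaps scatter widely, so no single tear line lingers
[/code]

With vsync on, the swap simply waits for the next refresh instead of tearing, which is also where the extra mouse lag comes from.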
 
Actually, it's not that costly anymore. Plus, you only need to upgrade once every five years to beat the consoles, since consoles only refresh every ~5 years.

Actually, it is. They make console games to work with the hardware the console has; you can't upgrade a console's graphics card, so they have to work within it. With PCs, though, they make the games and expect you (the user) to buy hardware good enough to run them. So it can get costly if you want to play the newest games at the best settings on a PC.

Correct me if I'm wrong.
 
I was surprised it dipped into the unplayable region of <30 FPS. Hard to say what it runs at normally unless the developer tells you.

You are surprised that the FPS dipped below 30 while the console was busy rendering an airstrike on four separate screens during split-screen play? You have high standards, but I would feel the same way :cool:
 
Actually, it is. They make console games to work with the hardware the console has; you can't upgrade a console's graphics card, so they have to work within it. With PCs, though, they make the games and expect you (the user) to buy hardware good enough to run them. So it can get costly if you want to play the newest games at the best settings on a PC.

Correct me if I'm wrong.

You are wrong. You can put together a very capable gaming computer for $500-600, which is roughly what a PS3 originally cost, except that it gives better graphics and you can upgrade it. Shocking.

Obviously, if you demand the maximum possible graphics it does get expensive, but if maximum graphics is your concern, then a console isn't even a choice you will consider.
 
You are wrong. You can put together a very capable gaming computer for $500-600, which is roughly what a PS3 originally cost, except that it gives better graphics and you can upgrade it. Shocking.

Obviously, if you demand the maximum possible graphics it does get expensive, but if maximum graphics is your concern, then a console isn't even a choice you will consider.

w3rd... I've done some tests. COD 4 on the Xbox 360 is about comparable to playing on the PC at 800x600 with all graphics settings turned off or low. How much to build a system that can play at 800x600? Well... cheap.
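
Just to put numbers on that comparison (my own arithmetic, not a benchmark), here are the raw pixel counts of the resolutions being thrown around in this thread:

[code]
resolutions = {
    "800x600   (claimed 360 equivalent)": 800 * 600,
    "1280x720  (typical 360 render res)": 1280 * 720,
    "1920x1200 (the PC setup above)":     1920 * 1200,
}
for name, px in resolutions.items():
    print(f"{name}: {px:,} pixels")

# 800x600   ->   480,000 pixels
# 1280x720  ->   921,600 pixels (~1.9x more than 800x600)
# 1920x1200 -> 2,304,000 pixels (~2.5x more than 720p)
[/code]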
 
You are wrong. You can put together a very capable gaming computer for $500-600, which is roughly what a PS3 originally cost, except that it gives better graphics and you can upgrade it. Shocking.

Obviously, if you demand the maximum possible graphics it does get expensive, but if maximum graphics is your concern, then a console isn't even a choice you will consider.

I doubt that the frame rate of that PC would be locked at 60 FPS, however.
 
w3rd... I've done some tests. COD 4 on the Xbox 360 is about comparable to playing on the PC at 800x600 with all graphics settings turned off or low. How much to build a system that can play at 800x600? Well... cheap.

Not actually true, though; I've got both versions. CoD 4 on the Xbox 360 runs at 1280x720, upscaled to 1920x1080, with what would be considered medium/high settings in the PC version, and it runs around 60 FPS, dipping into the 40s under air strikes (hard to judge, since you can't see shit through the screen shaking).

It's safe to say that PCs can run games at higher settings; there's no need to exaggerate things. Let the truth speak for itself.
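
On the upscaling point, the numbers are easy to check: 720p to 1080p is a 1.5x stretch in each direction, so the console only draws about 44% of the pixels it actually sends to the TV.

[code]
render = (1280, 720)     # what the 360 draws CoD 4 at
output = (1920, 1080)    # what the scaler sends to the TV

scale_x = output[0] / render[0]   # 1.5
scale_y = output[1] / render[1]   # 1.5
drawn_fraction = (render[0] * render[1]) / (output[0] * output[1])

print(f"linear scale: {scale_x}x horizontal, {scale_y}x vertical")
print(f"pixels actually rendered: {drawn_fraction:.0%} of the output")  # 44%
[/code]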
 
You're right, it would probably be higher.


I really doubt that; however, feel free to come to your own conclusions. I think most people who have seen CoD4 on both a console and a PC would feel the consoles are holding up their side very well, and it's always smooth on a console.
 
You are surprised that the FPS dipped below 30 while the console was busy rendering an airstrike on four separate screens during split-screen play? You have high standards, but I would feel the same way :cool:

Well, yeah, I guess... I am used to PC gaming, where frame rates under 30 are fairly common until you reduce one aspect of graphics quality. When I think of consoles, maybe I am used to the older ones (PS1, Dreamcast, etc.), where it was never enough of an issue to complain about: you just turned it on, it worked, and the last thing you thought about was "omg, what frame rate is this running at?"

But point taken, things are changing in consoles, and maybe my standards are a little too high ;) I personally think a console game that dips into the unplayable region could be considered unpolished.
 