Halo 3 actually runs at 640p no matter what.

I don't think it's a matter of what resolution it runs at; with consoles it seems to be a shell game of "which resolution are you really getting?" Every PC gamer knows that if you want a better frame rate, you lower the resolution. In this case the scaler just helps make up for a lack of power.

Didn't MS state that the minimum resolution for game certification was 1280x720? :D

/still want the "free aa" MS promised
/I know sony plays the resolution game too
 
I don't think it's a matter of what resolution it runs at; with consoles it seems to be a shell game of "which resolution are you really getting?" Every PC gamer knows that if you want a better frame rate, you lower the resolution. In this case the scaler just helps make up for a lack of power.

Didn't MS state that the minimum resolution for game certification was 1280x720? :D

/still want the "free AA" MS promised
/I know Sony plays the resolution game too

I don't remember Microsoft making that statement about resolution, nor could I find any reference to such a statement. If you have a link to share, please do so.
 
Microsoft had specified that games must run at 1280x720 with 4x AA and no slowdown.

Sooooo, because Halo 3 has a big name, they let it be 640p with no AA. :confused:

I think it should say that on the box. This is slimy.
 
What are you talking about? Your entire post makes no sense. You tell Phide not to talk about things he doesn't know, but in reality it's what you're saying that makes absolutely no sense. I don't like flame wars either, and this isn't intended to start one, but I completely agree with what Phide said.

Developers don't have to work harder to make "extra pixels." I think your understanding of what a resolution is is fundamentally lacking. Developers might spend extra time making better textures if they think higher resolutions demand it for better picture quality, but even then, texture filtering can mostly compensate unless the textures are an insanely low-resolution mess. And models don't have to be upgraded at all. A model with a relatively low triangle count is actually going to look much better at higher resolutions, because aliasing becomes less and less noticeable. Halo 2 upscaled to 1280x720 by the Xbox 360's emulator didn't involve any modifications to the actual game, nor did it take insanely long to do.

And what about PC games? Crysis will be able to run at resolutions a good deal greater than 1080p. They have been spending a long time on it, but by your logic they would need to spend years and years on the game for it to specifically display at 2560x1600, which is something you can get Tribes to do - a game released in 1998 that took less than a year to develop.

On another note, Bungie's response was pretty hilarious, but I will agree that the difference between 720p and 1152x640 is almost nil.

I'm sorry if I cannot explain it exactly so you get what I am talking about (not taking a pot shot at you, I am serious). I am not talking about upscaling, and I am not talking about non-native resolutions. I am talking about the actual native resolution that the game is running at for textures and such, NOT the resolution that it is being displayed at.

I wish I could send you the link but I cannot, because it is from an electronics show on TV where they were interviewing some of the devs from a few of the larger game studios about the differences between PS3 and 360 game creation. They were discussing why games are not made natively at 1080p and why most are created at 720p maximum and then upscaled. It is because the detail difference is just not there to justify spending all the time creating more detailed textures that take up more disc space, etc.

I know you can play a game from 1998 at huge resolutions... I also know it is upscaled, which is NOT what I am talking about and is what you are talking about right now. Games are upscaled, and have been for a long time...

I probably didn't explain it exactly the way they did, and it probably did not come out perfectly. Sorry, and I do wish I could show you the video I am talking about so it could be explained easily, properly, and with authority. I am throwing down my tin hat and leaving :p
 
http://www.hardocp.com/article.html?art=Nzcx

The same stuff is repeated on [H]; notice that they say "HiDef minimum" and then link to an article that says 720p is the minimum spec for HiDef.

http://www.atd.net/HDTV_faq.html

And scaling doesn't count, IMO (sure, I can scale a DVD and it looks good, but native resolution looks better).

In the end, if none of this detracts from the quality or the fun factor of the games, who cares? It's just underhanded, sort of like the fuel economy stickers on cars: your mileage may vary :D

Just too much bickering over ePeen
 
I know you can play a game from 1998 at huge resolutions... I also know it is upscaled, which is NOT what I am talking about and is what you are talking about right now. Games are upscaled, and have been for a long time...

Sorry, but this is just not true. A game from 1998 that is run on a computer at 1920x1080 is not rendering at 640x480 in the framebuffer and using an image scaler. It is rendering at 1920x1080.

The "extra pixels" have to do with processing power, not extra workload. Yes, when your target is 1080p you are compelled to create higher quality textures. Does it REQUIRE that you create higher quality textures? No.

Just because your canvas is twice as large, doesn't mean you have to use twice as many colors. Just because your rendered resolution is large, doesn't mean you need to do anything to make the game look better at those higher resolutions.
 
I'd just like to chime in that in the comments section on the page where I first read this story (linked from Joystiq), it also listed a number of games on the PS3, as well as additional titles on the 360 (Call of Duty 3, PDZ, PGR3), that are at lower native resolutions than advertised. The Darkness and Ninja Gaiden on PS3 were two such examples (because someone asked on the first page); there was one more, but the name escapes me.

The bottom line on this is that if nobody told you, and you weren't a PC gamer screaming about jaggies, you'd still be too busy playing your game of choice to notice.

I personally couldn't care less; as a non-bleeding-edge gamer, I've been reducing the resolution in games for years to eke out more performance. That a developer is having to do the same thing on a closed-platform console to keep up the eye candy on a 3rd-gen game comes as no surprise to me.

It's been done before, who's surprised about it being done again... and I'm sure this won't be the last time.
 
Sorry, but this is just not true. A game from 1998 that is run on a computer at 1920x1080 is not rendering at 640x480 in the framebuffer and using an image scaler. It is rendering at 1920x1080.

The "extra pixels" have to do with processing power, not extra workload. Yes, when your target is 1080p you are compelled to create higher quality textures. Does it REQUIRE that you create higher quality textures? No.

Just because your canvas is twice as large, doesn't mean you have to use twice as many colors. Just because your rendered resolution is large, doesn't mean you need to do anything to make the game look better at those higher resolutions.


I think that's what they were talking about: the fact that it would look the same (i.e., not any better) unless they put more work into it to, as you said, "use the full canvas".

I think we are saying the same thing, sorta; I just didn't explain it clearly/totally right.

Again sorry, :eek:
 
The higher the res, the more pixels in said resolution. Thus the creators have to put more time into making more pixels in their work. This is everything from textures to player models and weapon models.
The resolution is just an output resolution. The GPU transforms, colors, shades and rasterizes pixels to conform to the output resolution, and that's essentially all a GPU does.

The resolution barrier, in terms of content resolution, is dependent on RAM, VRAM (to use the term loosely in this context) and on various qualities of the GPU (fill rate, shader flop rate -- when surfaces are shaded, increasing the resolution of the surfaces equates to an increase in the number of pixels to be shaded -- and so on). Output resolution is neither a barrier to nor currently a factor in content generation.

The hardware dictates the possible complexity of art assets and the complexity of shader programs, not the resolution. Developers don't look at a target output resolution and determine what the texture resolution needs to be for a given surface; they look at a target frame rate for a given resolution (1280x720 in the case of conforming X360 developers) and determine how complex each asset can be, whether that's a texture, a normal map, a shader, a model or whatever. That's the case for a console developer, while PC developers generally have a hardware target range and a resolution target range.

It does not take the same amount of time to make them native to the resolutions.
Now I'm not sure if we're on the same page here. You're saying it takes longer to generate content for a game that outputs at 720p than it does to generate content for a game that outputs at "640p"? On a fixed platform, the only difference is that asset creators can be less vigilant about optimization at a lower output resolution. They could go with slightly higher resolution assets and still achieve the same target frame rate, or they could go with lower resolution materials, as it doesn't "matter" as much.

Have you ever made any material assets for a game/mod? Typically, assets are created at fairly high resolutions - four times the target resolution is typical practice, and I've seen interviews with game artists who routinely work at eight times the target resolution. Ergo, the typical texture/surface is going to begin its life at a high resolution, and it can be downsampled as desired.
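To put that in concrete terms, here's a rough sketch of that workflow in Python using Pillow (the filename and sizes are made-up examples, not anything from a real pipeline):

# Hypothetical asset-pipeline step: author high, downsample to fit the budget.
# Requires Pillow; the source file is assumed to exist.
from PIL import Image

source = Image.open("rock_wall_source.png")   # authored at, say, 2048x2048
for size in (1024, 512, 256):                 # candidate shipping sizes
    source.resize((size, size), Image.LANCZOS).save("rock_wall_%d.png" % size)
# The same source art serves any budget. The game's output resolution never
# enters into the decision; only memory and fill rate do.

The expensive part (authoring at 4x or 8x) happens once, whether the game ends up rendering at 640p or 720p.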

On another note, Bungie's response was pretty hilarious, but I will agree that the difference between 720p and 1152x640 is almost nil.
Yeah, but if it says "720p" on the box, then there's a problem. While it probably doesn't look much different at all, you're telling people that it's 1280x720. Therein lies the issue.

I brought this point up with someone else: could a 480p DVD be labeled as "1080p" if every DVD player automatically upsampled to 1920x1080? There's no way any studio could even hope to get away with that, and I don't understand why developers are getting away with it either.

I guess we also have "600p" games in our midst too, and on the PS3 as well! And nobody's being told that their $300/$400/$500/$600 720/1080 consoles are rendering games at arbitrary resolutions and doing upscaling that practically any TV on the market can already do (though typically with less stellar results).

I know you can play a game from 1998 at huge resolutions... I also know it is upscaled, which is NOT what I am talking about and is what you are talking about right now. Games are upscaled, and have been for a long time...
No... not correct at all. The GPU in my machine does not upscale unless it detects that something is running at a non-native resolution, and then only because I've decided that its scaling engine is better than the one in my Cinema Display. A game does not render internally at any resolution. Precision, yes; resolution, no. There's a massive difference between rasterization and scaling. A GPU renders and rasterizes. In other words, it internally transforms, colors, lights and shades pixels, and it rasterizes this rendering to one or more frame buffers.

When you play Far Cry at 1920x1200, your GPU is arduously undertaking the task of determining the final result of 2,304,000 pixels with a final bit depth of 32 bits per pixel. It's not grabbing a 640x480 image from the frame buffer and just blowing it up.

Games don't upscale. Renderers don't upscale. Upscalers upscale.
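For anyone who wants that spelled out, here's a deliberately toy-level sketch in Python (shade() is a stand-in for whatever per-pixel work a real GPU does; nothing here resembles actual hardware):

# Rendering: evaluate every output pixel, so cost scales with resolution.
def render(width, height, shade):
    return [[shade(x / width, y / height) for x in range(width)]
            for y in range(height)]

# Upscaling: no shading at all, just re-reading pixels from a finished image.
def upscale_nearest(image, width, height):
    src_h, src_w = len(image), len(image[0])
    return [[image[y * src_h // height][x * src_w // width]
             for x in range(width)]
            for y in range(height)]

shade = lambda u, v: int(255 * u * v)        # stand-in "pixel shader"
native = render(1920, 1200, shade)           # 2,304,000 shade() calls
stretched = upscale_nearest(render(640, 480, shade), 1920, 1200)  # 307,200 calls

Playing Far Cry at 1920x1200 is the first case; a scaler blowing up a 640x480 frame is the second.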
 
I'm just glad Tekara posted it so I didn't have to look like a jerk :D
 
So does this clarify to the 360 fanbois that the 360 GPU is nowhere near as powerful or next-gen as the X1900 XT!? I'm pretty confident an X1900 XT would piss through this game at true 720p!
 
So does this clarify to the 360 fanbois that the 360 GPU is nowhere near as powerful or next-gen as the X1900 XT!? I'm pretty confident an X1900 XT would piss through this game at true 720p!

some "fanbois" look at specs on said GPU's , not state whether or not they are "pretty confident" it can do X game @ X Resolution.
 
So does this clarify to the 360 fanbois that the 360 GPU is nowhere near as powerful or next-gen as the X1900 XT!? I'm pretty confident an X1900 XT would piss through this game at true 720p!

Do PC fanboys think that's what 360 fanboys think? :rolleyes: I can say I don't... I'm too busy enjoying playing games, and not worrying what I'm playing them on.
 
Do PC fanboys think that's what 360 fanboys think? :rolleyes: I can say I don't... I'm too busy enjoying playing games, and not worrying what I'm playing them on.

A lot of 360 fanbois hailed the 360 GPU as better than anything else out there when it first came out. I can't be arsed to dig it up, but the proof is on this forum if you can be bothered to backtrack far enough. 360 and PS3 fanbois also slated the Wii for not being able to output true HD in games. Now you 360 fanbois sound like the Wii fanbois, quoting the famous Nintendo line "it's all about the gameplay". 360 fanbois weren't like this until they discovered Halo 3 was not true HD. Just getting my point across from old threads/arguments/flame wars on this forum! :p

Another point: if you're not bothered about the graphics, MS might as well have just revised the Xbox 1 with an upscaler chip and a bigger hard drive, released Halo 3 on it, and saved a crapload by cutting out 360 development; it would probably run it at 480p natively lol. It's a bit like Zelda on the Wii: not much different graphics-wise from the GameCube version, but Wii players' excuse is that the gameplay is better with the Wiimote and true widescreen, rightly so (360 owners can't use those excuses, though). I love the gameplay in Halo 3; I was on it for three hours solid today in a game store lol.

If I'm paying £40-£50 for a 360/PS3 game that touts ultimate true HD gaming, then it sure as hell better perform at 720p or above, no less. It's like all these rip-off Blu-ray/HD DVD re-releases of old films that were never shot natively in HD: upscaled on the disc, then sold for £20-£30 when the film is 20 years old and would get upscaled by the DVD/Blu-ray player anyway.
Put it this way: if you have a PS2 and a 360, and a game comes out on both formats (I know this isn't the case with Halo 3, but it is with lots of other games), and you want this game, where the textures are practically the same and both versions have online play, but the PS2 version is 480p and the 360 version is 720p, which format are you going to buy it on?
 
1080 is a screen resolution -- nothing more. The amount of time needed to generate content for a title that renders at 1920x1080 is identical to the amount of time needed to generate content for a title that renders at 640x480, 1280x720, 320x240, or any other resolution imaginable.

I have to respectfully disagree on that. Surely a game running @ 720p/1080p on the 360, or 1200 on a PC, would need much more detailed and therefore much larger textures than a game running at 480.
These textures would need more time to produce and more space to store. Take a look at BioShock and how detailed the textures are. I run it on the PC @ 1200 and they are sharp as a tack.

Addressing another poster's comment: filtering can only do so much, and usually will only enhance the initial quality of said texture. In other words, garbage in / garbage out.

Smaller textures (in file size) would just blur out at higher res settings.

Best

JMD
 
Put it this way: if you have a PS2 and a 360, and a game comes out on both formats (I know this isn't the case with Halo 3, but it is with lots of other games), and you want this game, where the textures are practically the same and both versions have online play, but the PS2 version is 480p and the 360 version is 720p, which format are you going to buy it on?

Well, 360 obviously, because it is 720p vs. 480p... :confused:

However, your scenario is so totally unrealistic I almost cannot believe it. The jump in graphical quality is a lot more than just a resolution increase. The PS2 has 4 megabytes of video RAM. The Xbox 360, on the other hand, has 512 megabytes of unified memory, of which, let's say, about 400 megabytes could be devoted to textures. So a game where the "textures are practically the same" on PS2 and Xbox 360 is also a game whose developers should either be fired immediately or forced to make cell-phone games for the rest of their lives. Even the crappiest port imaginable can't be that bad.
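For a sense of the scale difference, here's the standard texture-memory arithmetic (the sizes are illustrative, not from any real game; DXT1 is 4 bits per pixel, uncompressed 32-bit color is 32):

# Rough texture-memory math with illustrative sizes.
def texture_mb(width, height, bits_per_pixel):
    return width * height * bits_per_pixel / 8 / (1024 * 1024)

print(texture_mb(1024, 1024, 32))   # 4.0 MB uncompressed: the PS2's entire VRAM
print(texture_mb(1024, 1024, 4))    # 0.5 MB as DXT1
print(texture_mb(256, 256, 32))     # 0.25 MB: closer to PS2-era texture sizes

A single uncompressed 1024x1024 texture already equals the PS2's whole video memory, which is why "practically the same textures" across that gap makes no sense.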
 
Well, 360 obviously, because it is 720p vs. 480p... :confused:

However, your scenario is so totally unrealistic I almost cannot believe it. The jump in graphical quality is a lot more than just a resolution increase. The PS2 has 4 megabytes of video RAM. The Xbox 360, on the other hand, has 512 megabytes of unified memory, of which, let's say, about 400 megabytes could be devoted to textures. So a game where the "textures are practically the same" on PS2 and Xbox 360 is also a game whose developers should either be fired immediately or forced to make cell-phone games for the rest of their lives. Even the crappiest port imaginable can't be that bad.

You mean like Ubisoft/EA Sports doing crappy ports to the PC, and the majority of console-to-PC ports in general? You'd be surprised how much memory anti-aliasing and an increase in resolution take up. A lot of people should be fired, then. I don't think there will be many 360 FPS games where 400 MB is dedicated to textures at any one time.
 
The PS2 has 4 megabytes of video RAM.

The PS2 can utilize the 32 MB of system memory for video use, much like the PS3. I can't quote you specifics, but I'm sure there are some PS2 games that use more than 4 MB for video. Maybe GoW2?
 
A bunch of nerds on a message board arguing about a topic they have no experience in. This is even dumber than people arguing about the structure of the PS3's SPUs.

As long as the game is fun, does it really even matter?
 
So does this clarify to the 360 fanbois that the 360 GPU is nowhere near as powerful or next-gen as the X1900 XT!? I'm pretty confident an X1900 XT would piss through this game at true 720p!

The 360 GPU is plenty powerful; the problem is that it's crippled by a 128-bit memory bus.

Take an 8800 GTX and cut 256 bits off the bus to bring it down to 128 bits, and tell me it's going to perform well at higher resolutions.
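Back-of-the-envelope numbers (the memory clocks below are the commonly quoted ones, so treat them as assumptions, and this ignores the 360's 10 MB of eDRAM):

# Peak bandwidth (GB/s) = bus width in bytes * effective transfer rate.
def bandwidth_gb_s(bus_bits, mem_clock_mhz, data_rate=2):   # DDR: 2 per clock
    return bus_bits / 8 * mem_clock_mhz * data_rate / 1000

print(bandwidth_gb_s(128, 700))   # Xbox 360 GDDR3: ~22.4 GB/s
print(bandwidth_gb_s(384, 900))   # 8800 GTX GDDR3: ~86.4 GB/s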
 
A bunch of nerds on a message board arguing about a topic they have no experience in. This is even dumber than people arguing about the structure of the PS3's SPUs.

As long as the game is fun, does it really even matter?

It absolutely matters. I took a break from an awesome four-person co-op game with my good friends and read this on the internet. I immediately began crying uncontrollably at the missing 80 p's! They FOOLED US. Those 80 p's retroactively made all the fun I've been having with Halo so very, very bitter that I wanted to spit the memories from my brain.
 
It absolutely matters. I took a break from an awesome four-person co-op game with my good friends and read this on the internet. I immediately began crying uncontrollably at the missing 80 p's! They FOOLED US. Those 80 p's retroactively made all the fun I've been having with Halo so very, very bitter that I wanted to spit the memories from my brain.


So I take it you actually didn't notice the difference until you read this thread, then...
 
So I take it you actually didn't notice the difference until you read this thread, then...

Well, I didn't. Still don't. I've played it for hours and hours, and I'm still amazed at some of the crap they put into the levels: architecture and texturing, objects in the distance, backdrops, etc.

Things like this only matter to people who need/want something to bitch about, or maybe to people who think entertainment software companies are something more meaningful than that... I don't know. I don't understand how it gets to this point.

The rest of us, who found out days after we'd already been playing the game and having fun, couldn't care less. I'm not a resolution whore, nor in possession of a cutting-edge gaming PC. I take what's given to me as long as it's fun and runs decent.

Getting caught up in the serious side of a discussion like this takes too much of my free time, and that free time is better used playing the games, not discussing their shortfalls.

I guarantee that 100% of the people not aware of this fact aren't enjoying the game any less because of it. :D
 
Well, I didn't. Still don't. I've played it for hours and hours, and I'm still amazed at some of the crap they put into the levels: architecture and texturing, objects in the distance, backdrops, etc.

Things like this only matter to people who need/want something to bitch about, or maybe to people who think entertainment software companies are something more meaningful than that... I don't know. I don't understand how it gets to this point.

The rest of us, who found out days after we'd already been playing the game and having fun, couldn't care less. I'm not a resolution whore, nor in possession of a cutting-edge gaming PC. I take what's given to me as long as it's fun and runs decent.

Getting caught up in the serious side of a discussion like this takes too much of my free time, and that free time is better used playing the games, not discussing their shortfalls.

I guarantee that 100% of the people not aware of this fact aren't enjoying the game any less because of it. :D

Seems like if this title had been released on the PS3 and pulled this kind of stunt, people would be outraged and would start spewing about why the X360 is better. Yet when a title is released for the X360, it's all about gameplay and all about not discussing its shortfalls. ;)
 
I microwaved my Elite 360 with Halo 3 still in it, used my Premium 360 to see if an Xbox could stop a car, sent Bungie a nasty email, then I smashed my TV for taking part in the deception.

Goddamn I feel satisfied. Fuck it, Halo just wasn't as much fun as it was before I found out about the 80 missing p's.
 
I microwaved my Elite 360 with Halo 3 still in it, used my Premium 360 to see if an Xbox could stop a car, sent Bungie a nasty email, then I smashed my TV for taking part in the deception.

Goddamn I feel satisfied. Fuck it, Halo just wasn't as much fun as it was before I found out about the 80 missing p's.

Good for you, and I would be pissed about the 80 missing p's too. That's like...

pppppppppppppppppppppppppppppppppppppppppppppppppppppppppppppppppppppppppppppppp X (times) 2304 MISSING!!! I would be PPPPPPIIIISSSSEEEDDDDD!! hahahha
 
Good for you, and I would be pissed about the 80 missing p's too. That's like...

pppppppppppppppppppppppppppppppppppppppppppppppppppppppppppppppppppppppppppppppp MISSING!!! I would be PPPPPPIIIISSSSEEEDDDDD!! hahahha
FFS! For the record, it's not 80 pixels missing, it's 184,320! Now put that many p's in your next post lmao.

1280x720 (720p) = 921,600 pixels
1152x640 (640p) = 737,280 pixels

So in effect Halo 3 is missing 1/5 of the pixels a true 720p game has. In my book that ain't HD gaming. If that were a Sony game it would get slated to shit.
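The arithmetic, for anyone checking it:

# Pixel counts: true 720p vs. Halo 3's reported native resolution.
full = 1280 * 720        # 921,600 pixels
halo = 1152 * 640        # 737,280 pixels
print(full - halo)       # 184,320 missing
print(1 - halo / full)   # 0.2, i.e. exactly one fifth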
 
Quick question here: if you are running a monitor at 1920x1200 and the game is displayed at 720p on the 360, does it stretch the picture to fill your full screen, making it look distorted, or does it force your display to 720p resolution? Now, I don't notice any distortion, but maybe I don't look hard enough, and maybe I haven't made comparisons.

The reason I ask this is that I have an HD tuner in my PC, and if I'm watching something at 720p, I have to decrease my monitor's resolution to watch TV at full screen. By doing this, the picture quality is actually increased. Again, if watching something in, say, 1080i, I change the resolution to match as closely as I can. If I leave the resolution at 1920x1200, the HDTV picture doesn't look as good, 'cuz it's stretched too much. So, is the 360 the same way? Should I change to a lower resolution to get a better quality picture, or does the 360 do this for me automatically?
 
If the 360 can't do the native resolution of your monitor, in most cases the picture quality is going to be slightly distorted; there aren't many monitors that handle non-native resolutions as well as the native one, and there's a slight decrease in image quality. My Samsung 24" handled 1440x900 better than any other non-native resolution I tried. If your 360 is plugged directly into your monitor, it should be full screen.
 
Seriously, who cares? This has gotten ridiculous. I am seriously envisioning all the outraged people as the guy in that South Park episode who cannot be killed in WoW......
 
It is plugged in via VGA to my 1920x1200 monitor, and I can choose that resolution in the Xbox display settings. That's not the issue. I'm asking if the 360 automatically adjusts the resolution to what the game handles, which is 720p, instead of keeping the resolution at 1920x1200 and stretching the picture to fit. I understand it wouldn't be the native resolution of the monitor itself.
It's always fullscreen, stretched or not.
 