Japanese devs talk HD on consoles (and its issues, including not really rendering HD)

steviep

Supreme [H]ardness
Joined
Jan 18, 2005
Messages
4,985
Source: http://www.watch.impress.co.jp/game/docs/20060426/3dhd.htm
Translated: http://www.maxconsole.net/?mode=news&newsid=6879

Zenji Nishikawa has published his latest article about 3D gaming technologies. The focus of the article is sub-HD rendering on the next-generation consoles. The article also contains anonymous developer quotes, which have been translated by Beyond3D.


quote:

--------------------------------------------------------------------------------


- The RAM bandwidth of the Xbox 360 GPU is almost equal to that of a RADEON X1600 XT, and is shared with the CPU via UMA.

- Without the eDRAM pixel processor doing 4xMSAA, the fillrate of the GPU core itself is 4 billion texels/sec, almost equal to a GeForce 7600 GT.
While the Xbox 360 has 3.5 times the memory bandwidth of the original Xbox, 720p requires roughly 3 times the bandwidth. That leaves only 0.5x of headroom, which is insufficient for the multiple texture lookups of complex shaders.

- eDRAM is implemented to mitigate the impact of the low memory bandwidth, but FP10 + 2xMSAA requires Predicated Tiling.
Tile rendering has many performance drawbacks.

- In games with many characters, like N3, the cost of overlapped geometry grows large unless LOD is implemented.

- Lens effects, refraction, HDR effects such as bloom and glare, and other frame-buffer filtering cause overlapped drawing near tile boundaries.
Objects that cross tile boundaries can't use the cache efficiently.

- CPU L2 cache locking is practically unusable.

- Since textures are stored in the shared 512MB RAM, texture lookups consume shared memory bandwidth regardless of the eDRAM size or the use of tile rendering. Normal mapping and shadow mapping require many texture lookups.

- So the last resort is to use the Display Controller to upscale the image without tile rendering, for example rendering at FP10-32bit / 960*540 / 2xMSAA / 32bit Z (8MB).


--------------------------------------------------------------------------------
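The buffer sizes quoted above can be sanity-checked with simple arithmetic: bytes ≈ width × height × MSAA samples × (color + Z bytes per sample). Here's a rough sketch, assuming (as the article implies) FP10 color packed into 4 bytes and a 4-byte Z buffer, both stored per sample:

```python
def fb_bytes(width, height, msaa=1, color_bytes=4, z_bytes=4):
    """Approximate render-target footprint in eDRAM.

    Assumes color and depth are stored per MSAA sample, with FP10
    color packed into 32 bits (4 bytes) and a 32-bit Z buffer, as
    the article describes."""
    return width * height * msaa * (color_bytes + z_bytes)

EDRAM = 10 * 1024 * 1024  # the Xbox 360's 10MB of eDRAM

for label, w, h, msaa in [
    ("960x540, 2xMSAA (article's example)", 960, 540, 2),
    ("640x480, 4xMSAA", 640, 480, 4),
    ("1280x720, no AA", 1280, 720, 1),
    ("1280x720, 4xMSAA", 1280, 720, 4),
]:
    size = fb_bytes(w, h, msaa)
    verdict = "fits" if size <= EDRAM else "needs tiling"
    print(f"{label}: {size / 2**20:.1f}MB -> {verdict}")
```

At about 7.9MB, the 960x540 / 2xMSAA case matches the article's 8MB figure, while full 720p with 4xMSAA comes to roughly 28MB, nearly three times the eDRAM. That gap is what forces predicated tiling (or a lower resolution).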



Here are the anonymous developer quotes which have been translated...


quote:

--------------------------------------------------------------------------------


Developer A: Even 2xMSAA is not required by Microsoft anymore.

Developer B: FP10-32bit / 880x720 / 32bit Z / 2xMSAA (9.9MB) rendered to look right when upscaled to 16:9 is also possible.

Developer C: You can render at a certain low resolution, then create a 720p frame for display with your own shader. While converting the original low-res frame into a 720p frame in the shader you can apply color dithering, which may result in smoother color expression or alleviate the precision deficiency of FP10.

Developer D: At any rate I want to reduce jaggies. Since the eDRAM pixel processor is penalty-free up to 4xMSAA, it will be interesting if it's fully exploited. Though it means 640x480 with 4xMSAA and FP10-32bit if it's not tile-rendered, aliasing-free images will be totally different from what we have seen in older games.

Developer E: If you take HDR rendering as a premise, PS3 is worse off than Xbox 360.

Since PS3 doesn't support an FP10-32bit buffer, FP16-64bit HDR requires twice the bandwidth of Xbox 360, and PS3 has no eDRAM to mitigate the impact. It's likely that the pseudo-HDR used on Xbox and in DX8-era games, with a conventional 32bit buffer (8-bit integer per ARGB channel), will often be used on PS3. The display controller may also be used to upscale sub-HD images to an HD resolution.

Developer F: As for resolution, I think modest is OK. Since RSX in the PS3 is a shader monster, adding more information to each pixel with ultra-advanced shaders and then antialiasing it completely should make it look more real. I'd rather give priority to the realism packed into each pixel than to HD resolution.


--------------------------------------------------------------------------------
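Developer C's idea above (render low, build the 720p frame with your own shader, and dither along the way to hide FP10 banding) would really be a pixel shader on the console, but the gist can be sketched on the CPU. The 2x2 Bayer matrix, 8-bit output depth, and nearest-neighbour sampling here are illustrative choices, not anything from the article:

```python
import numpy as np

# 2x2 Bayer ordered-dither matrix, normalized to [0, 1)
BAYER2 = np.array([[0, 2], [3, 1]]) / 4.0

def upscale_and_dither(frame, out_w, out_h, out_bits=8):
    """Nearest-neighbour upscale of a float HxWx3 frame in [0, 1],
    with ordered dithering applied before quantizing to `out_bits`
    per channel. Dithering trades high-frequency noise for the
    banding a low-precision source format like FP10 would show."""
    in_h, in_w = frame.shape[:2]
    ys = np.arange(out_h) * in_h // out_h
    xs = np.arange(out_w) * in_w // out_w
    up = frame[ys][:, xs]  # nearest-neighbour sample of the low-res frame

    # Tile the dither matrix over the output and add it as sub-LSB noise
    tiled = np.tile(BAYER2, (out_h // 2 + 1, out_w // 2 + 1))[:out_h, :out_w]
    levels = (1 << out_bits) - 1
    dithered = np.floor(up * levels + tiled[..., None]).clip(0, levels)
    return dithered.astype(np.uint8)
```

The point is just that the scaling pass is a natural place to inject sub-LSB noise, so the limited precision of the source buffer doesn't band visibly once blown up to 720p.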

Interesting... upscaling rather than rendering in HD due to power and memory issues (specifically AA and the complete lack of foresight with that 10MB of onboard RAM). So much for the "HD era" if those restrictive guidelines are gone already.
 
Memory is so cheap these days, it's a shame the 360 doesn't have 1GB of it.
 
True HD or not, Nintendo will make millions off of the Wii, while Sony and Microsoft lose millions, if not billions, on the sale of their respective consoles.
 
If this article is true, then perhaps the Xbox 360 was never meant to render in HD (its 10MB framebuffer isn't enough for full AA) and HD is an afterthought, which may haunt the console for the rest of its life. Not to mention the inefficient design of the PS3 for various effects. The "HD era" seems bunk.
 
It may not be quite as glorious as MS predicted, but I've switched between SD and HD on my TV with the 360 and there's quite a noticeable difference between the two. I'll take whatever they can give me.

Those comments are very interesting though. I don't really understand things like "tile rendering" and all that, but the overall point, that the machine is more limited when it comes to HD than MS probably hoped, makes one go "hmmm." Still, GRAW looks amazing (just bought it today) and I can't wait for more games to come out that push the graphics on this thing.
 
Oh, don't get me wrong. It looks great for the most part but they should really tell the truth about it (developers, I mean) if this is the case. If the game is rendered in a different (lower) resolution, like PGR3 is for instance, it should say something like "upscaled" on the box.
 
This is kinda related i think:

On the back of the box of the game, it says "480p, 720p, 1080i" Does that mean 1080i is the game's "native" resolution? Are there/will there be games that say, for example, "480p, 720p?"
 
beanman101283 said:
This is kinda related i think:

On the back of the box of the game, it says "480p, 720p, 1080i" Does that mean 1080i is the game's "native" resolution? Are there/will there be games that say, for example, "480p, 720p?"

If this game was made for a system that could render it natively, that would make sense. In the case of the Xbox 360, all games are rendered at 720p (or less) and then upscaled. :eek:
 
Lmao, thank you steviep once again for bringing interesting stuff to the forum. It does look like Nintendo, as always, is the most honest of the big 3, gaining more and more respect while Sony and Microsoft quibble and make huge cock-ups! Although I will own all 3 next-gen consoles. I can justify the price of my X1900XT even more now.

PS: do you sleep? I picture you with a 10-screen setup, constantly surfing! Nice one anyway! Cheers
 
LOL of course I sleep. But I do get to visit this site a couple times a day. Regardless, I thought this was too interesting to pass up posting, as the original article was quite frank (even if the translation has a bit of Engrish to it). It's interesting that the usual suspects aren't in here defending their purchases, however. I don't want to make this into a PC vs console thread, of course, but you do get what you pay for, and sometimes "HD" isn't it :p
 
steviep said:
If this article is true, then perhaps the Xbox 360 was never meant to render in HD (its 10MB framebuffer isn't enough for full AA) and HD is an afterthought, which may haunt the console for the rest of its life. Not to mention the inefficient design of the PS3 for various effects. The "HD era" seems bunk.

You got all that from poorly translated, anonymous quotes in a Japanese game mag?

You seem to be making a lot of assumptions, saying "HD is an afterthought" / "the HD Era seems bunk" / "complete lack of foresight". Is this fact or your opinion? And you came to those conclusions based on anonymous sources? Or was this your opinion BEFORE you read anything, and do you scour the intarweb looking for items that reinforce it?

From what you are saying, we can safely assume that if the PS3 and Xbox 360 are not doing full HD... they are "poorly designed crap". Correct? God only knows what that means for consoles that have dropped out of the graphics race altogether :rolleyes:


To be honest, the "HD Era" will always seem bunk if you aren't part of it.
 
steviep said:
No problem. Maybe Nintendo has a point, skipping HD this generation and letting the other 2 consoles deal with all this poorly designed crap. lol

Please keep the PM I sent you the other day about baiting people in mind, this post is flamebait pure and simple.

If someone said the Wii was "poorly designed crap" you'd be hitting the REPORT POST BUTTON so fast there'd be smoke coming off my inbox. Please refrain from these types of posts.
 
It's not that I won't be a part of the "HD era" because I already am. In fact, I STILL am disappointed that I'll be stuck with 480p on my DVDs and my Wii (lol I still can't get over that name), because I have the equipment to take advantage of more. It's the fact that developers are not being as forthcoming as they should with software and hardware. That angers me. Why do these developers have to be anonymous? Why can't they just outright admit who they are? Are they afraid that Sony and Microsoft will cut them off or something?

I don't mean to incite "flamebait" as you put it. I didn't "scour the web" looking for this, it was on one of my favourite console modding sites, front page (linked above). These consoles are SUPPOSED to be where I get my HD gaming from to differentiate from my Wii (aside from my PC) and it turns out that they may have made some poor design choices. Either that, or a poor choice of words in their PR strategy, and a lot of wild exaggeration on their power, though Sony usually does that anyway. For that, I am disappointed. There is a civil discussion going on at the Beyond3D forums where some of the translation took place, it would be nice to keep this one just as civil.
 
steviep said:
There is a civil discussion going on at the Beyond3D forums where some of the translation took place, it would be nice to keep this one just as civil.




I did notice that the thread at B3D was missing something our uncivilized one has...all the "poorly designed crap" comments. If we can stay away from all the "poorly designed crap", "HD Era is bunk", "complete lack of foresight" this conversation will stay on track too.
 
I've argued with people many times about the X360's hi-def capabilities (or lack of), and this only confirms what I've been saying.
 
steviep said:
If this article is true, then perhaps the Xbox 360 was never meant to render in HD (its 10MB framebuffer isn't enough for full AA) and HD is an afterthought, which may haunt the console for the rest of its life. Not to mention the inefficient design of the PS3 for various effects. The "HD era" seems bunk.

Careful. The 10MB buffer is on-die, and the main framebuffer lives in the shared 512MB memory. PC gfx cards work in a similar way, except the memory on a PC gfx card is not shared with the rest of the system. If your comment were correct then PC gfx cards would have NO frame buffer. Think of the 10MB on-die memory as the gfx core's equivalent of a CPU's level 2 cache. That's my take on the specs anyway. Could be wrong. Don't take my comment as gospel.
 
What the author of the article is trying to say (never mind the anonymous devs, the article itself has some interesting revelations too) is that the 10mb on-die isn't enough to do AA at HD resolutions.
 
steviep said:
It's not that I won't be a part of the "HD era" because I already am. In fact, I STILL am disappointed that I'll be stuck with 480p on my DVDs and my Wii (lol I still can't get over that name), because I have the equipment to take advantage of more. It's the fact that developers are not being as forthcoming as they should with software and hardware. That angers me. Why do these developers have to be anonymous? Why can't they just outright admit who they are? Are they afraid that Sony and Microsoft will cut them off or something?

Yeah, I think you pretty much hit the nail on the head there. The problem with info like this is that because it comes from an unnamed source, it's hard to take seriously. But if the source openly admitted who they were, they could face a shitstorm from their employer because of the financial hit they could potentially take from Sony/MS cutting off funding or withdrawing licences. I could bitch all day about where I work, but I won't, because a) I would get fired or even sued, and b) no one around here would really care, given the line of work I'm in.

We just have to take this information with a pinch of salt.

I for one think this information is pretty accurate, but a little premature. For one, the hardware of the 360 is perfectly capable of outputting HD resolutions. This early in its lifetime it COULD just be upscaling lower resolutions, but that could be attributed to developers not having got to grips with the hardware yet. If my old Athlon XP 2800+ with a 9800 Pro can handle 1280x780 at around 40+ fps, then the 360 sure as hell can handle it. Even WITH AA+AF.
 
Skirrow said:
steviep said:
It's not that I won't be a part of the "HD era" because I already am. In fact, I STILL am disappointed that I'll be stuck with 480p on my DVDs and my Wii (lol I still can't get over that name), because I have the equipment to take advantage of more. It's the fact that developers are not being as forthcoming as they should with software and hardware. That angers me. Why do these developers have to be anonymous? Why can't they just outright admit who they are? Are they afraid that Sony and Microsoft will cut them off or something?

Yeah, I think you pretty much hit the nail on the head there. The problem with info like this is that because it comes from an unnamed source, it's hard to take seriously. But if the source openly admitted who they were, they could face a shitstorm from their employer because of the financial hit they could potentially take from Sony/MS cutting off funding or withdrawing licences. I could bitch all day about where I work, but I won't, because a) I would get fired or even sued, and b) no one around here would really care, given the line of work I'm in.

We just have to take this information with a pinch of salt.

I for one think this information is pretty accurate, but a little premature. For one, the hardware of the 360 is perfectly capable of outputting HD resolutions. This early in its lifetime it COULD just be upscaling lower resolutions, but that could be attributed to developers not having got to grips with the hardware yet. If my old Athlon XP 2800+ with a 9800 Pro can handle 1280x780 at around 40+ fps, then the 360 sure as hell can handle it. Even WITH AA+AF.

If you're going to go into technicalities, even the Wii can handle HD resolutions... but it takes power to render "next gen" graphics at HD resolutions, power that your old PC and the Wii may not have. While there is still a lot of room for devs to learn the dev tools, there is a set amount of memory and bandwidth that won't change, including the aforementioned 10MB of on-die eDRAM. Like the PS2, applying AA at an HD resolution on the Xbox 360 will not be easy now, or 4 years from now.
 
Ya know, if the console manufacturers were lying to us about high definition gaming, then why do the graphics look better in 720p on the X360 than they did on my PC running at 1388x768 or 1280x1024? If they were lying about high definition gaming, surely the X360 would be far worse, but it's been far better. COD2 looks better on my X360 than it did on my PC (the PC version) before I sold it to a Hardforums subscriber. Quake 4 looks just as good too, although the frame rates are really bad. What about other games like Gun? It looks far sharper at 720p on the X360 than it did when I had the PC version, which I sold to another Hardforums subscriber. I don't buy this developer talk about how this isn't really HD content, because the proof is seen on my screen.

So you are saying that something as gorgeous as PGR3 isn't running in real high def?

What about Condemned? The game looks really good in 720p

MLB 2K6 has bugs, but there is nothing like looking at Griffey Jr. and Randy Johnson in 720p.

What about Madden 06? That game is one of the most gorgeous of the launch games for the X360 in 720p. Nothing like watching Chad and Rudi Johnson in high definition.
 
meatfestival said:
Did you compare the X360 and PC on the same screen?


For Quake 4, COD2, and Gun, yes.

I was especially impressed with COD2 as it looked a LOT better than the PC version. The smoke looked so much smoother and more realistic, as did the character detail. The frame rates were also smoother and more consistent. I bought the X360 versions of these games because the PC versions were running a bit weird with my dual-core CPU. Luckily, I was able to sell all 3 games really quickly in the sell/buy section of this forum.
 
junehhan said:
Ya know, if the console manufacturers were lying to us about high definition gaming, then why do the graphics look better in 720p on the X360 than they did on my PC running at 1388x768 or 1280x1024? If they were lying about high definition gaming, surely the X360 would be far worse, but it's been far better. COD2 looks better on my X360 than it did on my PC (the PC version) before I sold it to a Hardforums subscriber. Quake 4 looks just as good too, although the frame rates are really bad. What about other games like Gun? It looks far sharper at 720p on the X360 than it did when I had the PC version, which I sold to another Hardforums subscriber. I don't buy this developer talk about how this isn't really HD content, because the proof is seen on my screen.

So you are saying that something as gorgeous as PGR3 isn't running in real high def?

What about Condemned? The game looks really good in 720p

MLB 2K6 has bugs, but there is nothing like looking at Griffey Jr. and Randy Johnson in 720p.

What about Madden 06? That game is one of the most gorgeous of the launch games for the X360 in 720p. Nothing like watching Chad and Rudi Johnson in high definition.

I think PGR3 is running at something like 1024x600, or along those lines. You'd have to do a framebuffer capture to get the exact res, but either way it's upscaled.
 
Jason711 said:
they make the 360's gpu sound pretty weak.

They could be exaggerating with the 7600gt references. It DOES have shader features that are beyond DX9, for instance. That said, it's not as powerful as MS and ATI want you to believe, and the article writer makes that quite clear. The PS3 didn't get off unscathed, either.
 
They surely made that quite clear... I'm wondering if it will end up a bottleneck. They haven't even come close to taking advantage of the processor.
 
Jason711 said:
They surely made that quite clear... I'm wondering if it will end up a bottleneck. They haven't even come close to taking advantage of the processor.

I don't know if you'd call it a bottleneck, per se. More like a limit to how much bandwidth you can push through the system. And devs have certainly come close to "taking advantage" of the processor, in that they are maxing it out in some of the most recent games. That said, they still have room to learn to use it more efficiently.
 
steviep said:
I think PGR3 is running at something like 1024x600, or along those lines. You'd have to do a framebuffer capture to get the exact res, but either way it's upscaled.


I don't think anything rendered at a resolution that low would look that good. The screenshots in PGR3 look just as good as the game itself does. You really should rent an X360 from your local video store along with a few games. You might actually have some fun with it. Since the X360 games aren't your style, this would give you something to have fun with for a weekend or so. My only complaint about PGR3 is that the AA and AF could be a little better, although we got exactly what the advertised product was. There is not a single other racing game on the market that looks as good, runs as smooth, and is as detailed, with the exception of RR6, which blows every game away.
 
I play PGR3 at my buddy's house all the time. The game looks fantastic, one of the best looking titles out there, even without much AA and AF. Aside from Oblivion (which I play on my PC) there aren't too many titles that interest me on the console. When there are, I will be purchasing it. That said, read the Beyond3D discussion in the link for more on that particular issue. I'm sure if you dig, you can find an actual framebuffer capture of PGR3 that is running at lower than 720p and upscaled. I believe, so far, it is the only game that does so. But according to this Japanese article, we may see more of those if these hardware limitations come into play.
 
Why does it matter if the games are rendered in true HD res or upscaled? Just go by the end result, which looks fantastic. Who really cares how it gets there?
 
Axoman said:
Why does it matter if the games are rendered in true HD res or upscaled? Just go by the end result, which looks fantastic. Who really cares how it gets there?


That's exactly how I feel at this time. However, there is a big difference in image quality between something being rendered at a higher resolution and something just upscaled. For instance, you can play Halo 2 at 720p on an X360, and it looks really horrible, even though it's been upscaled. Of course, Halo 2 is one of the best looking games to ever grace the original Xbox, so that's not a fair comparison. However, look at games like RR6, COD2, Condemned, MLB 2K6, Madden 06 next gen, and many other X360 games. They really look like they were designed to run natively rendered in high definition. However, there are a few games out there like Tomb Raider, PD0, Far Cry Instincts Predator, and a few others that just don't quite look like they were rendered natively in 720p. They almost look upscaled. To be fair, though, Tomb Raider really seems like it's just a port of the other versions, as does Far Cry. The running water streams are probably the most impressive part of its graphics.
 
junehhan said:
That's exactly how I feel at this time. However, there is a big difference in image quality between something being rendered at a higher resolution and something just upscaled. For instance, you can play Halo 2 at 720p on an X360, and it looks really horrible, even though it's been upscaled. Of course, Halo 2 is one of the best looking games to ever grace the original Xbox, so that's not a fair comparison. However, look at games like RR6, COD2, Condemned, MLB 2K6, Madden 06 next gen, and many other X360 games. They really look like they were designed to run natively rendered in high definition. However, there are a few games out there like Tomb Raider, PD0, Far Cry Instincts Predator, and a few others that just don't quite look like they were rendered natively in 720p. They almost look upscaled. To be fair, though, Tomb Raider really seems like it's just a port of the other versions, as does Far Cry. The running water streams are probably the most impressive part of its graphics.
Halo 2 doesn't get upscaled, dude, it's literally rendered in 720p. There are side-by-side comparisons, and the difference in terms of clarity and jaggies is pretty big (it's obviously the AA and AF).
 
Axoman said:
Why does it matter if the games are rendered in true HD res or upscaled? Just go by the end result, which looks fantastic. Who really cares how it gets there?

That's pretty much the same argument people use for non-HD game consoles, but everyone else (360 fans mostly) screams "It has to be full-blown true HD or it will look like shit!!!1" :eek:


Just my little observation..
 
[T5K]thrasher said:
That's pretty much the same argument people use for non-HD game consoles, but everyone else (360 fans mostly) screams "It has to be full-blown true HD or it will look like shit!!!1" :eek:


Just my little observation..

A good observation, but one that I don't buy. 480p looks pretty damn good on XBox/Gamecubes hooked up to a good TV, as do DVDs. Many games also run a LOT smoother at 480p on the 360, simply because there are some bottlenecks at 720p+ and the devs haven't quite learned how to get around some of them yet.
 
paranoia4422 said:
Halo 2 doesn't get upscaled, dude, it's literally rendered in 720p. There are side-by-side comparisons, and the difference in terms of clarity and jaggies is pretty big (it's obviously the AA and AF).


I don't know about that. It's probably true since I have no proof otherwise, but I honestly can't tell the difference between playing Halo 2 on my old Xbox and playing it on my X360. The only difference I can tell is the AA. One thing I did notice is that I got frame rate dips on the X360 in certain high-action parts of the game. It doesn't bother me though, as I'm keeping the old Xbox around for my older games and to use as a cheap DVD player.
 
steviep said:
What the author of the article is trying to say (never mind the anonymous devs, the article itself has some interesting revelations too) is that the 10mb on-die isn't enough to do AA at HD resolutions.

Then I'm confused how I can have both HDR and AA enabled when I'm playing Oblivion on my 360 in 720p.
 
Obviously tiled AA of some kind. It's not full-screen AA unless the resolution of the game is low enough that the frame fits in the 10MB of on-die RAM, per the original article.
 
junehhan said:
For Quake 4, COD2, and Gun, yes.

I was especially impressed with COD2 as it looked a LOT better than the PC version. The smoke looked so much smoother and more realistic, as did the character detail. The frame rates were also smoother and more consistent. I bought the X360 versions of these games because the PC versions were running a bit weird with my dual-core CPU. Luckily, I was able to sell all 3 games really quickly in the sell/buy section of this forum.
I agree the 360 version was silky smooth, but nowhere near better in terms of graphics. I had the same setup as you, and the 7800 GTX in your rig would blow the 360 version away:
1280x720 with 4xAA and 16xAF looked a lot better and played silky smooth with those settings on the GTX. There's a thread on AV Forums which discusses this anyway.

An amazing release for a launch game though
 
Skirrow said:
I for one think this information is pretty accurate, but a little premature. For one, the hardware of the 360 is perfectly capable of outputting HD resolutions. This early in its lifetime it COULD just be upscaling lower resolutions, but that could be attributed to developers not having got to grips with the hardware yet. If my old Athlon XP 2800+ with a 9800 Pro can handle 1280x780 at around 40+ fps, then the 360 sure as hell can handle it. Even WITH AA+AF.

This is what bothers me about the Xbox360. We've been heralding the end of jaggies and blurry textures since the days of the 9800pro, and it seemed to be confirmed when Microsoft mandated the "1280x720p, 4xAA minimum" spec. Now I get the console in my home, and I'm back where I was 3 years ago: playing a jaggy, blurry mess. It's frustrating when you know the potential, and look forward to something so much, but it just doesn't deliver.

Here's to hoping developers can come up with something.
 