New Tim Sweeney Interview - UE3 Fillrate Dependent

Brent_Justice

Moderator
I'm putting this in the video card forum for one important reason:

http://forums.epicgames.com/showthread.php?t=571019

PCGH: How exactly are you utilizing the functions of DirectX 10?

Epic: Unreal Tournament 3 will ship with full DX10 support, with multi-sampling being the biggest visible benefit of the new graphics interface. Additionally, with DX10 under Vista we have the possibility to use video memory more efficiently, so we can display textures at a higher level of detail than would be possible with the DX9 path under Vista. Most effects in UT3 are bound more by fillrate than by basic features like geometry processing. That's why DX10 has a great impact on performance, while we mostly forgo the integration of new features.

Doesn't sound like good news for the R600 which has concentrated on geometry performance over texture/fillrate performance. The G80 has a fillrate advantage currently.

Hrmm....

Lots of other good info in the interview as well. No AA in DX9 due to using deferred shading :( But AA in DX10
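
To put rough numbers on the fillrate point (my own illustrative assumptions here, not anything from the interview): the pixel workload scales with resolution x overdraw x framerate, while the geometry workload is mostly independent of resolution, which is why this kind of renderer favours raw texture/fill rate.

Code:
# Back-of-the-envelope fill-rate demand. The overdraw and fps figures are
# assumed purely for illustration, not Epic's numbers.
def pixels_per_second(width, height, overdraw, fps):
    """Shaded pixels the GPU must produce each second."""
    return width * height * overdraw * fps

for width, height in [(1280, 1024), (1600, 1200), (1920, 1200)]:
    demand = pixels_per_second(width, height, overdraw=4, fps=60)
    print(f"{width}x{height}: {demand / 1e9:.2f} Gpixels/s of shaded output")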
 
It seems strange, and somewhat naive and shortsighted for a company like AMD/ATI, to build a product that is stronger on geometry processing than pixel processing, when the general trend in games (for a while now) has been more toward pixel processing with less dependence on geometry processing.
 
I'm putting this in the video card forum for one important reason:

http://forums.epicgames.com/showthread.php?t=571019



Doesn't sound like good news for the R600 which has concentrated on geometry performance over texture/fillrate performance. The G80 has a fillrate advantage currently.

Hrmm....

Lots of other good info in the interview as well. No AA in DX9 due to using deferred shading :( But AA in DX10

It's comments like these that make me really not understand ATI's architectural decisions for R600.
Thinking about the future and skipping the present is a big mistake. They knew that shader usage would increase in the future, but that multi-textured fill rate was still very important, since even shaders store and access textures. So why didn't they balance R600 with a good amount of shader power coupled with a strong peak multi-textured fill rate?
And here we are, talking about a future game that is bound by fill rate. I guess we'll have to wait and see, but this definitely doesn't look good for R600, as you pointed out.
 
Thanks for the read, sounds like dual/quad cores will help out a bit... Wonder how quad will do with physics vs a physics card.
 
The Unreal series owns my life; I'm impatiently waiting for UT3 to breathe some life back into the UT community.
 
UT3 is supposed to be released sometime around September I believe, won't the 8800 refreshes be out around then too? Also isn't it possible the first dx10 games will pretty much stomp current dx10 cards? Wasn't it the same with dx9? I don't imagine until AMD/ATI and Nvidia come out with their next cards will we see acceptable performance in dx10. I guess only time will tell.
 
UT3 is supposed to be released sometime around September I believe, won't the 8800 refreshes be out around then too? Also isn't it possible the first dx10 games will pretty much stomp current dx10 cards? Wasn't it the same with dx9? I don't imagine until AMD/ATI and Nvidia come out with their next cards will we see acceptable performance in dx10. I guess only time will tell.


Did you read the interview? Sweeney said UT3 will run smooth on X19xx/7900-class hardware, and VERY smooth on current DX10 hardware.
 
UT3 is supposed to be released sometime around September I believe, won't the 8800 refreshes be out around then too? Also isn't it possible the first dx10 games will pretty much stomp current dx10 cards? Wasn't it the same with dx9? I don't imagine until AMD/ATI and Nvidia come out with their next cards will we see acceptable performance in dx10. I guess only time will tell.

Well, if Crysis is any indication, we've seen quite a few DX10 videos where the game ran smoothly on an 8800 card. Also, the Lost Planet DX10 numbers seem to show that the GTX is very much playable with all settings maxed at high resolutions.
Going back to Crysis, the game seems highly optimized, and Crytek claims it runs very well on older DX9 cards with most, if not all, of the eye-candy options on or maxed, and the game is about as realistic as a game can get. Unless PC gamers get a crappy console port (like we did with Rainbow Six Vegas), I'm guessing UT3 will also be highly optimized, and even though it will stress current cards at max settings and high resolutions, it won't be to the point where they are unplayable.
 
UT3 is supposed to be released sometime around September I believe, won't the 8800 refreshes be out around then too? Also isn't it possible the first dx10 games will pretty much stomp current dx10 cards? Wasn't it the same with dx9? I don't imagine until AMD/ATI and Nvidia come out with their next cards will we see acceptable performance in dx10. I guess only time will tell.
The 8800 Ultra is the 8800 refresh. The next Nvidia card will be their next-gen part, the G9x series.
 
The 8800 Ultra is the 8800 refresh. The next Nvidia card will be their next-gen part, the G9x series.

 
UT3 is supposed to be released sometime around September I believe, won't the 8800 refreshes be out around then too? Also isn't it possible the first dx10 games will pretty much stomp current dx10 cards? Wasn't it the same with dx9? I don't imagine until AMD/ATI and Nvidia come out with their next cards will we see acceptable performance in dx10. I guess only time will tell.

No, my Radeon 9700PRO handled many directx 9 games very, very well.
 
Also isn't it possible the first dx10 games will pretty much stomp current dx10 cards? Wasn't it the same with dx9?

Nah, that only happened back then if you bought an nVidia FX card instead of an ATI card. Now it looks to be the other way around.

No, my Radeon 9700PRO handled many directx 9 games very, very well.

Same here, my 9700pro played the hell out of HL2 and BF2.
 
Nah, that only happened back then if you bought an nVidia FX card instead of an ATI card. Now it looks to be the other way around.



Same here, my 9700pro played the hell out of HL2 and BF2.

Man, I dunno wtf is up with that... but the internet cafe I have been going to when on the road in San Diego has P4 2.4GHz systems with 1GB of memory and Radeon 9700 pros, and they totally suck it up at both 800x600 and 1024x768 with ALL settings on low in counterstrike source.
 
Unfortunately I am unable to read the link from here... sorry for the misinformation...

oh okay, here's a quote for you then:
PCGH: What do the general hardware requirements look like?
Epic: Since optimization work is still ongoing, these details may change every day. Generally speaking, the game runs quite smoothly with DX9 hardware released by NVidia and Ati since 2006. On high-end cards, including the DX10 models, UT3 already runs incredibly smoothly. Additionally, we also support Shader Model 2.0 graphics hardware, with only a few technical limitations.

=)
 
UT3 is a good reason to upgrade all the core technologies in your PC, because it's going to make good use of:

64-bit CPUs with a 64-bit operating system
64-bit addressing, allowing more than the ~2.5GB to 3.5GB of RAM usable under 32-bit
Dual or Quad Core processors
DX10

You know what that means: Vista 64-bit, 4GB of RAM, an 8800GTX and a dual/quad core CPU

OH WAIT A SECOND :D
 
UT3 is a good reason to upgrade all the core technologies in your PC, because it's going to make good use of:

64-bit CPUs with a 64-bit operating system
64-bit addressing, allowing more than the ~2.5GB to 3.5GB of RAM usable under 32-bit
Dual or Quad Core processors
DX10

You know what that means: Vista 64-bit, 4GB of RAM, an 8800GTX and a dual/quad core CPU

OH WAIT A SECOND :D

Except they even mentioned that they don't use Vista 64-bit and that they don't use more than 2GB of RAM.

PCGH: Will UT3 players be able to benefit from a 64 Bit environment and is there a 64 Bit version anyway?
Epic: To assure compatibility, we tested UT3 with Vista x64 as well. Nonetheless, we're planning to wait and see until the OS and its applications have matured before we take further steps in the 64-bit direction. With UT2004 we were one of the first developers to port a title to Windows XP x64. We would've liked to do this with UT3 and Vista x64 as well, and to shift all the PCs we're currently developing on to the 64-bit version of Vista. Unfortunately, full software and driver compatibility isn't there. The basic OS runs stable and it's fun to work with in isolation, but as soon as you want to print something or run Maya or 3DSMax together with some third-party plugins you get massive problems. I am sure those can be fixed via service packs and software updates, so PCs with 4 to 8 GB of RAM can establish themselves during the next 12 months.
 
Man, I dunno wtf is up with that... but the internet cafe I have been going to when on the road in San Diego has P4 2.4GHz systems with 1GB of memory and Radeon 9700 pros, and they totally suck it up at both 800x600 and 1024x768 with ALL settings on low in counterstrike source.



The problem is the P4 2.4, TBH.
 
Except they even mentioned that they don't use Vista 64bit and that they don't use more than 2Gb of ram.

I didn't read the whole thing :eek:

Either way, the fact that they don't use more than 2GB doesn't mean it's bad to have 4GB. Vista is using about 1.3GB for me at the moment, and that's with minimal apps open :D
 
I didn't read the whole thing :eek:

Either way, the fact that they don't use more than 2GB doesn't mean it's bad to have 4GB. Vista is using about 1.3GB for me at the moment, and that's with minimal apps open :D


Why does this have to be explained over and over again? Vista uses memory differently than XP! It caches data in otherwise-idle RAM, and when you run an app/game, Vista hands that memory back to the running program.

http://www.codinghorror.com/blog/archives/000688.html
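
If you want to see the raw numbers yourself instead of trusting Task Manager's bars, here's a minimal Windows-only Python sketch (my own example, not from that article) using the Win32 GlobalMemoryStatusEx call. The point is that the "available" figure already includes pages Vista is merely holding as cache, which it hands back the moment a game asks for them.

Code:
import ctypes
from ctypes import wintypes

# MEMORYSTATUSEX layout as documented for GlobalMemoryStatusEx.
class MEMORYSTATUSEX(ctypes.Structure):
    _fields_ = [
        ("dwLength", wintypes.DWORD),
        ("dwMemoryLoad", wintypes.DWORD),
        ("ullTotalPhys", ctypes.c_uint64),
        ("ullAvailPhys", ctypes.c_uint64),
        ("ullTotalPageFile", ctypes.c_uint64),
        ("ullAvailPageFile", ctypes.c_uint64),
        ("ullTotalVirtual", ctypes.c_uint64),
        ("ullAvailVirtual", ctypes.c_uint64),
        ("ullAvailExtendedVirtual", ctypes.c_uint64),
    ]

status = MEMORYSTATUSEX()
status.dwLength = ctypes.sizeof(MEMORYSTATUSEX)
ctypes.windll.kernel32.GlobalMemoryStatusEx(ctypes.byref(status))

gb = 1024 ** 3
print(f"Total physical RAM:     {status.ullTotalPhys / gb:.2f} GB")
print(f"Available physical RAM: {status.ullAvailPhys / gb:.2f} GB")  # includes standby/cache pages
print(f"Memory load:            {status.dwMemoryLoad} %")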
 
It seems strange, and somewhat naive and shortsighted for a company like AMD/ATI, to build a product that is stronger on geometry processing than pixel processing, when the general trend in games (for a while now) has been more toward pixel processing with less dependence on geometry processing.

I agree. I guess somehow they thought this would yield them an advantage?... maybe in the hope that the card would perform exceptionally better when given the right environment + 3D app? Seems like too much of a risk though, considering how late they are to the party... maybe they're just after keeping the Folding@Home performance title? ;)
 
UT3 is supposed to be released sometime around September I believe, won't the 8800 refreshes be out around then too? Also isn't it possible the first dx10 games will pretty much stomp current dx10 cards? Wasn't it the same with dx9? I don't imagine until AMD/ATI and Nvidia come out with their next cards will we see acceptable performance in dx10. I guess only time will tell.

http://utforums.epicgames.com/showthread.php?t=570995

Mark Rein said:
We don't know when the game will be released but it won't be ready in time for September 3rd. In addition the idea that we would release the game in the US and UK several weeks apart is just plain silly.
 
BioShock uses the UE3 engine, so that looks like another point for Nvidia for my next upgrade.
 
Man, I dunno wtf is up with that... but the internet cafe I have been going to when on the road in San Diego has P4 2.4GHz systems with 1GB of memory and Radeon 9700 pros, and they totally suck it up at both 800x600 and 1024x768 with ALL settings on low in counterstrike source.


The P4 2.4. Source needs CPU.
 
Strange, because the latest benchmarks with R600 show it doing very well against the GTX in Vegas, which is a UE3 game.
 
I was gonna say. I played EQ at 1600x1200, Doom3 at 1024x768, HL2 at 1024x768, and RO at 1024x768 all at max (or near max) quality settings at very comfortable framerates on my Radeon 9700 Pro. That card was the mack-daddy of all video card investments.
 
Am I understanding this right? No AA at all in DX9 mode? If so, I hope the edges are smooth enough at my 1440x900 resolution. Ever since I got my X1900XTX (the first card I owned above a low-end card) and was able to enable AA, I can never look at jaggies again.
 
Am I understanding this right? No AA at all in DX9 mode? If so, I hope the edges are smooth enough at my 1440x900 resolution. Ever since I got my X1900XTX (the first card I owned above a low-end card) and was able to enable AA, I can never look at jaggies again.

No kidding. I felt pretty murderous when I downloaded Roboblitz via Steam only to find out it was jaggy city. A fun game, but damn, those edges are ugly!
 
It seems strange, and somewhat naive and shortsighted for a company like AMD/ATI, to build a product that is stronger on geometry processing than pixel processing, when the general trend in games (for a while now) has been more toward pixel processing with less dependence on geometry processing.



What's far worse is that the UE is the most licensed 3D engine for PC and console gaming out there!

Which means this will have even more far-reaching effects on ATI users, as many games in the future (read: 2 to 4 years+) will be using this engine (heck, many already have and will continue to do so). Just off the top of my head, BioShock uses it. Anyone have a list of recent UE3 licensees? :eek: Great article, lots of interesting info.

Some of the engineers at AMD/ATI should be canned; maybe heads have already started to roll...
 
Am I understanding this right? No AA at all in DX9 mode? If so, I hope the edges are smooth enough at my 1440x900 resolution. Ever since I got my X1900XTX (the first card I owned above a low-end card) and was able to enable AA, I can never look at jaggies again.

Yep, it is a growing trend now for games to use deferred shading: GRAW, R6 Vegas, STALKER, and now UT3. It is a cool technique and all, but under DX9 it has the limitation that video cards can't apply multisampling to it. Fortunately, multisampling DOES work in DX10 with deferred shading.
 
Brent, can you explain deferred shading in a nutshell?

There is no easy way to explain it. It's a potentially easier way to implement multiple per-pixel lights, using a texture called a G-buffer.

Normal methods of applying multiple lights:

Single-pass (multiple lights applied to a single pixel in a single shader program).

Multi-pass (multiple shader programs, each applying a single light), where you can potentially reuse shader programs, but there's more geometry and anisotropy overhead.

Each method has downsides, and deferred shading / lighting attempts to improve on that.

We've had the capability to do this since the Radeon 9700 Pro and the GeForce FX series, but only now do we have hardware with the power to do this well.
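
It's hard to show without real shader code, but here's a toy CPU-side Python sketch of the idea (all names and numbers are made up for illustration, nothing to do with UE3's actual implementation): pass one rasterizes the scene once and writes per-pixel surface attributes into a G-buffer; pass two loops over the lights per pixel reading only that buffer, so lighting cost scales with pixels x lights rather than with scene geometry.

Code:
import math

W, H = 4, 4  # toy "framebuffer"

# --- Pass 1: geometry pass ---
# Rasterize once and store surface attributes per pixel instead of lighting immediately.
gbuffer = [[{"albedo":   (0.8, 0.6, 0.4),            # surface colour
             "normal":   (0.0, 0.0, 1.0),            # facing the camera
             "position": (x - W / 2, y - H / 2, 0)}   # world-space position
            for x in range(W)] for y in range(H)]

# --- Pass 2: lighting pass ---
# Per pixel, accumulate each light using only the G-buffer contents.
lights = [
    {"pos": (0.0, 0.0, 3.0), "colour": (1.0, 1.0, 1.0)},
    {"pos": (2.0, 2.0, 1.0), "colour": (0.2, 0.4, 1.0)},
]

def shade(pixel, light):
    """Simple Lambertian (N.L) term for one light."""
    to_light = [light["pos"][i] - pixel["position"][i] for i in range(3)]
    dist = math.sqrt(sum(c * c for c in to_light)) or 1.0
    n_dot_l = max(0.0, sum(n * c / dist for n, c in zip(pixel["normal"], to_light)))
    return tuple(a * lc * n_dot_l for a, lc in zip(pixel["albedo"], light["colour"]))

image = [[(0.0, 0.0, 0.0)] * W for _ in range(H)]
for y in range(H):
    for x in range(W):
        for light in lights:
            contrib = shade(gbuffer[y][x], light)
            image[y][x] = tuple(c + d for c, d in zip(image[y][x], contrib))

print(image[2][2])  # final lit colour of one pixel

The DX9 catch mentioned earlier is that this G-buffer can't be multisampled, which is why MSAA only comes back with the DX10 path.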
 