At what point will console graphics be outdated?

To add to that, PC gaming has gotten simpler with installs. With most newer games you don't have to fiddle around; you can just install the game and go. The messing around with settings and files is for those who want to squeeze out as much as they can. Really, it's optional.

Platforms like Steam especially have made installing a game a one-click affair. You don't even have to buy a physical disc, as you have to with all the big console games.
 
I've never seen a console come out that had better graphics than PC. Compared to PC games, consoles have almost always been outdated even before their release.
 
Back in the days of the GC and PS2 you may have had a point, but today, at least with PS3 and Xbox 360 games, a significant number of patches have to be installed to make things work correctly, or even to be allowed online. And it's not just games: the console's firmware keeps getting new updates all the time, which are far too often a requirement if you want to play online.

Even the Wii doesn't fully escape this, but at least its updates are mostly optional.

Optical media also sucks big time. I'd much rather have the game on an HDD or SSD or even a RAM drive instead of having to wait a full minute for a level to get shoveled in from the shiny disc, with the DVD drive sounding like a small animal getting choked to death (thanks, Microsoft, for finding the noisiest DVD drives in existence!).

Yeah, console gaming is great. In 1998 :p


I respectfully disagree. For the wide majority of users, these console updates are painless and quick. Put the disc in, it tells you it needs to update, you acknowledge, and away it goes! PC updating can be a nightmare and we all know it! Patching will always be better on consoles for the simple fact of a unified architecture. I will concede that Steam is bringing PC updating close to console levels of ease, but some people still don't buy into digital distribution for full games. And yes, optical media does suck, which is why I can install most Xbox 360/PS3 games and see a performance boost most of the time. I'm not raging on our hobby, but to not recognize that PC graphics aren't light years ahead of consoles, and that consoles offer a great deal of tech that's simple to interface with, seems a little silly to me.
 
I've never seen a console come out that had better graphics than PC. Compared to PC games, consoles have almost always been outdated even before their release.

I'd like to have seen your IBM PC outperform a SNES. Without investing significantly more than the price of a SNES for a decent video- and soundcard and everything ;)

What the N64 did in 1996 was pretty impressive for its time too, with a reprogrammable GPU and everything. That one definitely blew PC graphics away.
 
I respectfully disagree. For the wide majority of users, these console updates are painless and quick. Put the disc in, it tells you it needs to update, you acknowledge, and away it goes! PC updating can be a nightmare and we all know it!

I don't, but please tell me about it. Last time I updated a PC game was in 1998-ish with Half-Life, and I thought that was pretty painless already.
 
I'd like to have seen your IBM PC outperform a SNES. Without investing significantly more than the price of a SNES for a decent video- and soundcard and everything ;)

What the N64 did in 1996 was pretty impressive for its time too, with a reprogrammable GPU and everything. That one definitely blew PC graphics away.

Quake came out in June of 1996, and GLQuake followed in January 1997. Blew PC graphics away? I don't remember Super Mario World having much in the way of textures.

The SNES is a better console to argue beat PC graphics, because it did. One of the main things the SNES had was Mode 7, which allowed it to do a lot of special effects PC games at the time just couldn't do. On the other hand, King's Quest V came out in 1990 (the year the SNES launched in Japan), and it didn't look *substantially* worse than Super Mario World; it also had voice acting ;)
 
Quake came out in June of 1996, and GLQuake followed in January 1997. Blew PC graphics away? I don't remember Super Mario World having much in the way of textures.

The SNES is a better console to argue beat PC graphics, because it did. One of the main things the SNES had was Mode 7, which allowed it to do a lot of special effects PC games at the time just couldn't do. On the other hand, King's Quest V came out in 1990 (the year the SNES launched in Japan), and it didn't look *substantially* worse than Super Mario World; it also had voice acting ;)

Then again, on the other side of the fence there was the Amiga, which had really impressive graphics and sound back when the NES was still quite the console to have.

Is Amiga a PC?

*runs and hides*

:D
 
I'd like to have seen your IBM PC outperform a SNES. Without investing significantly more than the price of a SNES for a decent video- and soundcard and everything ;)

What the N64 did in 1996 was pretty impressive for its time too, with a reprogrammable GPU and everything. That one definitely blew PC graphics away.

It's hard to put N64 and good technology in the same thought.

I think the bottom line is that as a business and from your average consumer's standpoint, consoles are pretty good. Enthusiasts obviously aren't going to be very excited about what the masses are doing. Hence, PC gaming.

I own a PC, Xbox360, and have a PS3 in the apartment (not mine) and I like all of them for their own merits. Obviously if a game is out for both consoles and PC, I get the PC one - cheaper, more options, looks better, etc.
 
This thread isn't (or really shouldn't be) about any of the "other" merits of console gaming, though. Console graphics are almost always outdated, depending on your standards. It probably won't be until the next next-gen console comes out that the average person will start to think that, though. (Edit: And then, only about consoles preceding that gen.)
 
Depends on what you mean when you say "outdated".

When the consoles were released they were already outdated; you could build PCs with significantly better CPU and GPU power on launch day, especially if you were willing to invest in CrossFire or SLI.

There are plenty of other ways you could define "outdated", but by most of them the consoles were outdated on their release day. They're fairly flimsy, cheap pieces of hardware for a market which demands cheap prices for its entertainment. You get what you pay for; expecting any console not to be outdated is a bit silly, really.
 
I remember Oblivion being essentially the same on console and PC when it first launched. Only the 7-series and X1000-series graphics cards were around. As soon as the 8800s launched, consoles couldn't compete anymore. Time is relative, so calling something dated is a relative comparison. Console graphics, compared to other consoles, won't be dated until the next higher-level console is available. And not all console games are equal when it comes to graphics. Consoles, for the most part, dictate the quality of the graphics engines we receive as PC gaming enthusiasts. All we gain is an increase based on what is available to us.
 
I remember Oblivion being essentially the same on console and PC when it first launched. Only the 7-series and X1000-series graphics cards were around. As soon as the 8800s launched, consoles couldn't compete anymore. Time is relative, so calling something dated is a relative comparison. Console graphics, compared to other consoles, won't be dated until the next higher-level console is available. And not all console games are equal when it comes to graphics. Consoles, for the most part, dictate the quality of the graphics engines we receive as PC gaming enthusiasts. All we gain is an increase based on what is available to us.

Another point is that whenever a console is released, the true power of that console isn't realized until a good couple of years into its life. I mean, Shadow of the Colossus was the last game I bought for the PlayStation 2, and that was released like 5 or so years after the console, and its graphics were better than anything I'd seen at that point (on consoles). So when you create a console like the PS2, game developers never have to worry about a change in hardware, whereas with PC games, new hardware comes out all the time to raise the boundaries of what's possible on a PC as far as graphics.

On PCs there never seems to be much room for optimizing for any one particular system, because the technology gets pushed out so quickly and you have to make the game work on a large variety of configurations, whereas with consoles you can spend the entire development cycle focusing on how to make it look good on that one system, because there are no variables. Not everyone has a $2000+ gaming system to play a game at the highest graphics settings, but a console is constant across the board and you don't have to worry about different configurations. All developers have to do is figure out how to squeeze the most juice out of the console.

So you could say it's also subjective, insofar as it depends on what's done on that console at that particular time, because I was completely blown away when I saw S.O.T.C., and that was years after the initial release, and I'm sure there were already better-looking PC games.
 
Another point is that whenever a console is released, the true power of that console isn't realized until a good couple of years into its life. I mean, Shadow of the Colossus was the last game I bought for the PlayStation 2, and that was released like 5 or so years after the console, and its graphics were better than anything I'd seen at that point (on consoles). So when you create a console like the PS2, game developers never have to worry about a change in hardware, whereas with PC games, new hardware comes out all the time to raise the boundaries of what's possible on a PC as far as graphics.

The problem with the PS2 was that for the first two years of its life Sony didn't bother to provide devs with a proper SDK. This meant little to no documentation, poor tools where available and a lot of guessing. This explains why the first generation of PS2 games looked little better than the PS1 games released around the same time.

MSFT on the other hand, for example, has always provided its devs with proper SDKs for both the original XBox and the 360. This means that one sees far less of a jump in quality over the life of the console.
 
The problem with the PS2 was that for the first two years of its life Sony didn't bother to provide devs with a proper SDK. This meant little to no documentation, poor tools where available and a lot of guessing. This explains why the first generation of PS2 games looked little better than the PS1 games released around the same time.

MSFT on the other hand, for example, has always provided its devs with proper SDKs for both the original XBox and the 360. This means that one sees far less of a jump in quality over the life of the console.

MS is a software company and Sony is a hardware company; still no excuse for SDK issues.
 
The problem with the PS2 was that for the first two years of its life Sony didn't bother to provide devs with a proper SDK. This meant little to no documentation, poor tools where available and a lot of guessing. This explains why the first generation of PS2 games looked little better than the PS1 games released around the same time.

MSFT on the other hand, for example, has always provided its devs with proper SDKs for both the original XBox and the 360. This means that one sees far less of a jump in quality over the life of the console.

This can also be said about the PS1 and PS3. It is how Sony does things.
 
It's all kind of irrelevant, because the average console gamer doesn't care about keeping up with the latest and greatest. They want the ease of buying a new console (for the price of a video card, mind you) and getting a significant upgrade. It's pretty much the same as a PC gamer upgrading their video card and expecting similar results. The difference is that console gamers simply don't care and are perfectly content with optimized software that continues to get better over the console's life, something that actually works in reverse for PC gamers.

When you build a high-end gaming PC, from that day forward newer games gradually run worse and worse, whereas console software is better optimized thanks to a fixed standard, and games run and look better over time. While there is something majestic about buying a new high-end video card and playing the latest game at maxed settings, there is something very practical and smart about investing your money in a console.

It's the sole reason some of my best friends, who've always been PC enthusiasts, have long since converted to gaming on consoles.
 
there is something very practical and smart about investing your money in a console.

Not smarter if you level the playing field. If I wanted to use this PC (sig) to game on the same display as I would a console (a reasonably sized 1080p display?), then you have to consider that this initially very expensive machine was able to play games at that resolution with maxed image-quality settings. At some point, consoles will render native 1080p with an arbitrary level of detail. At that point, this PC will still be able to display the same resolution with equal image quality, presumably throughout the lifespan of that console.

Visually, graph two lines. One line is a very shallow, negatively sloped line representing the ability of a static hardware configuration (a PC) to play games as they're released, over time. The other line represents consoles' collective ability to do the same thing at a given resolution with equal graphical options. This line begins at a lower point but continually increases with new consoles and better optimization of existing hardware.

At some point the lines will cross, representing a period of relative equality. That is the point at which the PC owner will have to spend more money. The initial investment may be massive, but if the PC's line is shallow enough (games that are pathetically easy to run, good coding), that sum may be less than the continuous expenditure necessitated by the bottom line (consoles), especially if the development cycle for the industry as a whole is sufficiently slow.
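
To make that picture concrete, here's a minimal sketch of the two lines in Python. The starting values and slopes are numbers I've invented purely for illustration; only the shape of the argument (a shallow decline versus a slow climb, with a crossover year) is the point.

    # Hypothetical capability-over-time model; all numbers are invented for illustration.
    # "Capability" is an abstract score for running new releases at a fixed resolution
    # and detail level.
    pc_start, pc_slope = 100.0, -4.0            # expensive PC: starts high, declines slowly
    console_start, console_slope = 60.0, 3.0    # console: starts lower, climbs via optimization

    def crossover_year(a0, a_slope, b0, b_slope):
        """Year at which the lines a0 + a_slope*t and b0 + b_slope*t meet."""
        if a_slope == b_slope:
            return None  # parallel lines never cross
        return (b0 - a0) / (a_slope - b_slope)

    t = crossover_year(pc_start, pc_slope, console_start, console_slope)
    print(f"Relative equality after roughly {t:.1f} years")  # ~5.7 years with these numbers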
 
I remember Oblivion being essentially the same on console and PC when it first launched. Only the 7-series and X1000-series graphics cards were around. As soon as the 8800s launched, consoles couldn't compete anymore. Time is relative, so calling something dated is a relative comparison. Console graphics, compared to other consoles, won't be dated until the next higher-level console is available. And not all console games are equal when it comes to graphics. Consoles, for the most part, dictate the quality of the graphics engines we receive as PC gaming enthusiasts. All we gain is an increase based on what is available to us.


No way, I had an X1900XTX and it was way better on the PC. The grass fade on consoles was about 2 feet away.
 
Good visualization, and a great point. I was really just making very high-level comments about the differences, but you make a good point. In the end, we are on an enthusiast forum discussing enthusiast-type things... like when console graphics become outdated.

edited out, don't want to start a war.. lol
 
IMO, until I can play a console game with AA, they will forever be "outdated." I hate the jaggies!
 
Never; your everyday gamer doesn't want to spend triple the money for a little improvement in graphics. I mean really, graphics are far enough along these days that it's not a big deal.
In addition, consoles will continue to improve just like PCs do.
 

:D.
 
Consoles = closed-box PCs today. Sure, their hardware is a little different/more proprietary than what you could put in a PC case, but that is just arguing semantics. As far as what they do (and the direction the manufacturers are pushing their consoles' capabilities), consoles are just nice (closed-box) HTPCs.

But on topic: since day 1 for this generation. I remember seeing Oblivion on 360 when it came out and I think I almost puked cause the texture resolution was so pathetically low (thx consoles for this being an issue on the PC version as well, but at least on PC we were able to mod it to look nice).

Fast forward to today: watching DA:O and a lil Fallout 3 on PS3 and I felt like puking cause the texture resolution was so low. Everything was a blurry mess (yes this was on an HDTV).

I made a comment about how low the texture res was and the people playing didn't even seem to notice, but none of them have ever been PC gamers so I think it was more them being blissfully ignorant of something better.

BTW, the argument that PC gaming costs $2,000 is stupid now; I think I heard 1995 calling, and they wanted their lame excuses back. Everyone already has a PC; all you need is a graphics card, and <$200 will get you a card now that can play any game at higher fidelity than a console.
 
Consoles = closed-box PCs today. Sure, their hardware is a little different/more proprietary than what you could put in a PC case, but that is just arguing semantics. As far as what they do (and the direction the manufacturers are pushing their consoles' capabilities), consoles are just nice (closed-box) HTPCs.

But on topic: since day 1 for this generation. I remember seeing Oblivion on 360 when it came out and I think I almost puked cause the texture resolution was so pathetically low (thx consoles for this being an issue on the PC version as well, but at least on PC we were able to mod it to look nice).

Fast forward to today: watching DA:O and a lil Fallout 3 on PS3 and I felt like puking cause the texture resolution was so low. Everything was a blurry mess (yes this was on an HDTV).

I made a comment about how low the texture res was and the people playing didn't even seem to notice, but none of them have ever been PC gamers so I think it was more them being blissfully ignorant of something better.

BTW, the argument that PC gaming costs $2,000 is stupid now; I think I heard 1995 calling, and they wanted their lame excuses back. Everyone already has a PC; all you need is a graphics card, and <$200 will get you a card now that can play any game at higher fidelity than a console.

I paid 75 dollars last summer for a used 9800 GTX... so yeah, I'd say less than 200 dollars.

I think I paid about 500 dollars for my rig total: Gigabyte P45 chipset, E8500 at 4.0 GHz, 9800 GTX, 4 GB of RAM, 400 GB SATA drive, Antec 900 case, Corsair TX750 PSU. It plays BC2 butter smooth and looks much better than what an Xbox 360 can push out.

Then again, the only components I bought new in my computer were the case, the HDD, and the RAM (which was only 45 dollars anyway). I tend to buy all my parts used off of our FS/T forum. You can get some mega-deals just browsing around and looking through random threads.
 
Console gamers, I know you love your platform, but it's really past time to let it goooo... The 360 is over 4 years old, followed by the PS3. They can in no way, shape, or form compete with today's gaming PCs in the graphics arena.

If a cross-platform title doesn't look significantly better on the PC, blame the developer, not the hardware.

It's time you joined PC gamers as the PC rises to power as a gaming platform yet again. It happens every couple of years: a next-gen console comes out, console trolls flame the PC, and a couple of years later the PC outpaces the consoles. It's the circle of life of gaming hardware. The neat thing is, the PC is the steady train that keeps on advancing, yet I can continue to play very old games on it.

As I said, the PC is on the rise again. Start building your new PC! Here are a couple of articles to help you.
http://www.maximumpc.com/article/features/how_build_awesome_pc_647

http://www.bit-tech.net/hardware/buyers-guide/2010/03/09/pc-hardware-buyers-guide-march-2010/1

http://adrianwerner.wordpress.com/games-of-2010/
 
Console gamers, I know you love your platform, but it's really past time to let it goooo... The 360 is over 4 years old, followed by the PS3. They can in no way, shape, or form compete with today's gaming PCs in the graphics arena.

If a cross-platform title doesn't look significantly better on the PC, blame the developer, not the hardware.

It's time you joined PC gamers as the PC rises to power as a gaming platform yet again. It happens every couple of years: a next-gen console comes out, console trolls flame the PC, and a couple of years later the PC outpaces the consoles. It's the circle of life of gaming hardware. The neat thing is, the PC is the steady train that keeps on advancing, yet I can continue to play very old games on it.

As I said, the PC is on the rise again. Start building your new PC! Here are a couple of articles to help you.
http://www.maximumpc.com/article/features/how_build_awesome_pc_647

http://www.bit-tech.net/hardware/buyers-guide/2010/03/09/pc-hardware-buyers-guide-march-2010/1

http://adrianwerner.wordpress.com/games-of-2010/

If I could afford a gaming PC I would build one, but I'm a teenager with no job.

Consoles are on par with PCs for the first few months that they are out. I hope the next consoles at least have half a gig of RAM and VRAM and use DX11 architecture; they should have hardware as powerful as, or close to, whatever is top of the line in PCs when they come out.
 
Consoles seem to be the better platform IMO, since the developers can optimize them to their full potential, and multiplayer is balanced so you don't have to worry about those guys who have the best hardware and are getting faster frame rates than you, and so on.

If developers didn't have to make PC games playable on a ton of different hardware, the games would look ten times better than on consoles.
 
This kind of question is silly and pointless. The only reason consoles get "outdated" is that you have to pump $500 or so into a PC every six months, on top of an initial $2000 investment, to get the best graphics.
 
Back in the days of the GC and PS2 you may have had a point, but today, at least with PS3 and Xbox 360 games, a significant number of patches have to be installed to make things work correctly, or even to be allowed online. And it's not just games: the console's firmware keeps getting new updates all the time, which are far too often a requirement if you want to play online.

Even the Wii doesn't fully escape this, but at least its updates are mostly optional.

Optical media also sucks big time. I'd much rather have the game on an HDD or SSD or even a RAM drive instead of having to wait a full minute for a level to get shoveled in from the shiny disc, with the DVD drive sounding like a small animal getting choked to death (thanks, Microsoft, for finding the noisiest DVD drives in existence!).

Yeah, console gaming is great. In 1998 :p

Exaggerate much? Christ that was terrible.
 
PC graphics may be better now, but console graphics are definitely adequate. The Xbox graphics looked super outdated really early on because it rendered at such a low resolution. Now that most games render at 720p and the systems have enough memory to support nice textures, modern console games still look very good for the most part. Sure, you get more environmental effects with nice PC cards, but these are minor details that most people don't even notice.
 
Console graphics technology has been outdated for years. The Xbox 360 runs on a GeForce 7 256 MB GPU, I believe, and the PlayStation 3 runs on the equivalent. As far as visuals, there are always ways for a developer to improve visual quality on aging hardware, but there is a limit to what the hardware can accomplish.
 
PC graphics may be better now, but console graphics are definitely adequate. The Xbox graphics looked super outdated really early on because it rendered at such a low resolution. Now that most games render at 720p and the systems have enough memory to support nice textures, modern console games still look very good for the most part. Sure, you get more environmental effects with nice PC cards, but these are minor details that most people don't even notice.

The reason we don't see much improvement in the PC counterparts is that that's as much as the developers care to put out for the PC platform.

If a developer were to come up with an engine that really utilizes today's PC performance, it would be more than just a few minor details :p The difference would be night and day. If only we had games that would take advantage of today's hardware.


This kind of question is silly and pointless. The only reason consoles get "outdated" is that you have to pump $500 or so into a PC every six months, on top of an initial $2000 investment, to get the best graphics.
Whoa, this argument is still being used against PC gaming :eek:

With or without the PC platform, console graphics will still stagnate, limited by the hardware until a new generation of consoles arrives. You can argue about the cost differences all you want, but it still doesn't change the fact that console graphics will get outdated after a while.
 
I'm sorry; as I do all too often, I'm posting first and reading the rest after I've blasted my thoughts at the rest of you guys.

I'd say we just hit that point now, but we're not actually outside the traditional production envelope yet. As of 2009, from a hardware standpoint, the real-time 3D landscape really hadn't changed that much. APIs got better, geometry got more complex, and textures increased in resolution density. Pixel shader effects were added to give the cool eye candy that's so iconic of the modern gaming industry, but all of these effects existed in one primitive form or another back in 2001. But as of 2010, with rendering pipelines now being viewed as general-purpose computational devices, the scope of a game from a technical perspective has taken a leap forward, a leap which the modified R500 and G70 found in the 360 and PS3 respectively cannot take.

Backing up a bit, if you take a look at any game since the integration of object-oriented programming into gaming, the event loop has followed a similar structure:
while the game is not over:
    get user input
    script a response
    execute the parts of the script needing execution
    fetch and render video resources

The middle two lines of my pseudocode here are entirely dependent on traditional sequential logic. From COD4 to Audiosurf, scripting the response, whether it be ordering AI units to move and shoot or simply queuing up the next series of notes, has revolved entirely around x86 logic.

Nvidia and AMD's ATI want you to look at it differently. Rather than be limited by compiled, serial IA-32 assembly code, they're suggesting that if you write instructions designed for parallel execution on their graphics hardware, you can get astounding results.

The easiest example for me to come up with is AI path-finding in first-person shooters. The logic is incredibly involved, and for the best-looking end product (read: enemies that don't run for cover facing the wrong direction) you need a huge number of comparisons even after heavy optimization. In Modern Warfare 2, if you fire at three unsuspecting enemies, presumably the MW2 engine executes three instances of "find where I can go such that he can't shoot me" logic. This logic has to take any number of variables into account, and the more it does, the more fun the experience: moving cover, cover that deteriorates, proximity to nearby allies (so as to not have one grenade kill three bots), return-fire opportunities, etc. If you have a single thread and thousandths of a second to resolve a path for these AI to follow, naturally there are going to be some variables that have to be ignored for efficiency's sake. The logic that generates these paths has to be called sequentially, over and over, which takes time.

It's entirely possible that, despite Nvidia's rantings about how wonderful parallel logic is, it might not help here, but it's also entirely possible that it will. You can imagine that rather than having this path-finding logic called sequentially for a number of different scenarios, if we could call it tens, hundreds, or thousands of times at the same instant, spread across multiple "execution cores", we might be able to increase the performance and/or the quality of the path-finding algorithm and its solution.
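
To make the idea concrete, here's a minimal Python sketch of the "score many candidate cover points at once" idea. The scoring function, its weights, and the use of a process pool are all stand-ins I've invented for illustration; a real engine would express this as a GPU kernel, but the structure of the problem (lots of completely independent evaluations) is the same.

    from concurrent.futures import ProcessPoolExecutor
    from math import dist

    # Hypothetical cover-scoring logic; the weights are invented for illustration.
    # Every candidate is scored independently of every other candidate, which is
    # exactly the property that lets the work be spread across many cores.
    def score_cover(candidate, shooter_pos, ally_positions):
        exposure = 1.0 / (1.0 + dist(candidate, shooter_pos))                  # closer to the shooter = worse
        crowding = sum(1 for a in ally_positions if dist(candidate, a) < 2.0)  # avoid one grenade killing three bots
        return exposure + 0.5 * crowding                                       # lower score = better cover

    def pick_cover(candidates, shooter_pos, ally_positions):
        # Sequential version would be:
        #   min(candidates, key=lambda c: score_cover(c, shooter_pos, ally_positions))
        # Parallel version: evaluate every candidate at the same time, then take the best.
        with ProcessPoolExecutor() as pool:
            scores = list(pool.map(score_cover, candidates,
                                   [shooter_pos] * len(candidates),
                                   [ally_positions] * len(candidates)))
        return min(zip(scores, candidates))[1]

    if __name__ == "__main__":
        print(pick_cover([(1, 2), (4, 5), (9, 1)], shooter_pos=(0, 0), ally_positions=[(4, 4)]))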

But that's hypothetical, and there are plenty of developers investing huge amounts of man-hours into finding out what you can and cannot get out of this whole parallel computing thing.

One example that's not hypothetical and has been demonstrated to be hugely more efficient on a GPU is Quick Sort. Quick Sort and Merge Sort are two of the most widely used sorting algorithms in software. When you click the name heading in iTunes to order your songs by name, or the new-posts button on hardforum, there's a good chance you're making a call to one implementation or another of the Quick Sort algorithm. Citing Chalmers University, who tested using nothing but huge quantities of pseudorandom integers, sorting on a GPU is 10x more efficient than sorting on a CPU.
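
For reference, here's the textbook sequential Quick Sort being talked about, as a short Python sketch (this is not the Chalmers GPU implementation). The comment marks the property a GPU version exploits: after each partition the two halves are completely independent, so they can be handed to separate workers, or to thousands of GPU threads.

    def quicksort(xs):
        """Plain sequential Quick Sort (not in place, for clarity)."""
        if len(xs) <= 1:
            return xs
        pivot = xs[len(xs) // 2]
        left  = [x for x in xs if x < pivot]    # everything smaller than the pivot
        mid   = [x for x in xs if x == pivot]
        right = [x for x in xs if x > pivot]    # everything larger than the pivot
        # 'left' and 'right' share no data, so the two recursive calls below are
        # independent; a parallel implementation can run them at the same time.
        return quicksort(left) + mid + quicksort(right)

    print(quicksort([9, 3, 7, 1, 8, 2]))  # [1, 2, 3, 7, 8, 9]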

So what, path-finding and sorting are faster on a GPU; that doesn't exactly give you a foundation for Crysis 3, huh. Well, if I knew what could be done with parallel programming I wouldn't be sitting here talking about it. I'm not a very creative person, but this, as Nvidia puts it, paradigm shift (or as Intel puts it, useless silicon) could certainly lead to some cool new games. Maybe it could make a huge dent in the already existing gaming market; maybe it could create whole new games which simply couldn't be coded until this concept came along. Maybe the advent of parallel programming on a GPU is as big as the advent of indirection on PCs, or maybe it's as influential as the Segway, I dunno. What I do know is that it isn't possible on modern consoles, meaning the financial incentives that would otherwise go into investigating this concept simply aren't there.

That said, the typical console release period is 5 years, and the Xbox 360 was released in 2005. I would be very surprised if Microsoft's Xbox division wasn't in communication with the DirectCompute guys over what the Xbox 720 (or maybe just Xbox 7, since the number seven seems to be working so well for Microsoft) will and will not be able to do.

hah, hope this essay was a worthwhile read :p
 
Console graphics technology has been outdated for years. The Xbox 360 runs on a GeForce 7 256 MB GPU, I believe, and the PlayStation 3 runs on the equivalent. As far as visuals, there are always ways for a developer to improve visual quality on aging hardware, but there is a limit to what the hardware can accomplish.


lol facts do you have them?

There is NO Nvidia hardware in the Xbox 360... they use an ATI solution that is superior to the PS3's GPU in just about every way. It was also one of ATI's first unified shader designs.
 
When the PC gets a hit-selling game that can't be done properly on consoles.

That would mean it got outdated the day Crysis came out... maybe 3 years ago?

I'm actually glad the PC market is in sync with the console world in terms of graphics, because games like Crysis paint an unrealistic picture of the PC market, so everyone thinks you need a $2000 PC to game on a PC. I'd rather play L4D2 or COD4 at 60 fps than play Crysis barely at 30 fps. A playable game > pretty gfx. I mean, you wouldn't watch a movie at 10 fps, would you?

Not the $2,000.00 rig argument again! (Well, the previous fallacy was $3,000, so I guess it's an improvement.) My Crysis rig only cost $700.00 at the time the game came out!

My current one (sig), including the 22" monitor, only cost under $1000, and it's my most expensive build yet. I'm not even sure where another $1,000.00 is supposed to go.

And please don't bring up water cooling and steel cases; you don't really need those for a gaming rig.

BTW, I sold the previous rig for about $500.00, so the actual cost is cut in half.
 
Consoles seem to be the better platform IMO, since the developers can optimize them to their full potential, and multiplayer is balanced so you don't have to worry about those guys who have the best hardware and are getting faster frame rates than you, and so on.

If developers didn't have to make PC games playable on a ton of different hardware, the games would look ten times better than on consoles.



Console graphics technology has been outdated for years. The Xbox 360 runs on a GeForce 7 256 MB GPU, I believe, and the PlayStation 3 runs on the equivalent. As far as visuals, there are always ways for a developer to improve visual quality on aging hardware, but there is a limit to what the hardware can accomplish.

You got it backwards, son.

The PlayStation 3 uses a G70/71-derived GPU, dubbed the RSX, with half of the back end disabled and a 128-bit memory bus with 256 MB of GDDR3. The memory is embedded on-chip, but offers no advantages other than space savings. (Check out some de-lidded RSX shots on a Google image search.) It has something like 16 GB/s of bandwidth, which is about even with a GeForce 7600. In other words, it's sorely lacking. Take FFXIII, for example: they have to resort to stipple alpha blending on hair and other transparencies rather than actual alpha-to-coverage to maintain performance. Basically, it's in between the 7600 GT and 7800 GT in actual performance.

The Xbox 360 uses a custom ATI chip called Xenos: 48 unified pixel and vertex shaders, with a 10 MB eDRAM die embedded with Z-buffering, 8 ROPs, and 8 texture units on deck. That eDRAM die basically gives you double the bandwidth available on a GTX 280, giving Xenos the power to do things it might not be able to do otherwise.

So basically, because of consoles, we're bound on our high-end PC systems to five-year-old graphics technology, which is pathetic. Never before have I seen such stagnation in graphics advancement in my 15 years of PC gaming. Thank Microsoft for selling out, and Sony, for this. Be sure to tell them you appreciate it! :)
 
They've been outdated for a while, horribly outdated. I've recently been replaying Crackdown co-op with a friend on the same 32" TV I use for a computer monitor. The frame rate, graphics, and especially the lack of AA are just horrid in that game. Fable 2 doesn't look all that much better.
 