Bulldozer an option for next gen consoles?

Yea, but then you're managing two discrete GPUs and possibly separate memory pools over a bus with who knows what latency. That really isn't desirable; you need some sort of big advantage to make it worthwhile.
 
Correct, dual-GPU setups are not ideal for any purpose-built machine designed to be manufactured on a single PCB. The fewer interconnections, the better: it keeps costs down and efficiency up.
 
They could go either way depending on how the GPU development is set up. Process-wise there is nothing to restrict them in choosing a GPU architecture per se. It'll really come down to power budget and the economics of the die size. Generally, though, they're going to want the most advanced GPU they can get affordably, since that will have to remain static throughout the lifespan of their next-gen consoles.

Given that the console makers aren't shy about losing money on the hardware for at least the first year or even years of a console's lifespan, you can't rule out them slapping a modified 8xxx GCN-class GPU in there.


They've probably already chosen which GPU they'll use, which according to the rumors will be from AMD, and have probably already started work on it. Most likely they have at least a couple of development teams for each company: one working on the desktop PC GPU and another modifying it for console use at the same time.

The choice they (Sony/MS/Nintendo) make depends on what AMD will license to them. It may be that AMD won't license GCN to them until its successor is out (that way any bugs can be worked out). The only time it was different was with the GameCube:
ArtX developed Flipper for Nintendo and was subsequently bought out by ATi. (Get this: the GeForce FX might have been dominant if ATi had never purchased ArtX. Weird, huh?)

Even Nvidia learned not to use your best tech on a console. The NV30 core was supposed to power the GeForce 4 series, but it was pushed back in favor of NV25 because of delays with the XGPU (the Xbox's NV2A).
 
What are you basing the idea that they (AMD) won't license a GCN-derivative GPU to Sony/MS on? I haven't seen even the flakiest of rumor-mill sites suggest that, and there's no reason to believe they won't, since the console and PC markets aren't really direct competitors per se.
 
I have no doubt whatsoever that MS is going with another IBM CPU, for backwards compatibility, plus IBM can design it to Microsoft's specification. There's no chance at all the next Xbox CPU is anything but IBM.
 
Who cares.... I am hoping, and it only makes sense, that the new consoles (PS4 and Xbox 720, if you want to call them that) should be small computers with swappable parts.

That would finally make PC gaming and console gaming the SAME THING... this would be a win/win in many ways!
 
Who cares.... I am hoping, and it only makes sense, that the new consoles (PS4 and Xbox 720, if you want to call them that) should be small computers with swappable parts.

Erm, then it wouldn't be a console. You seem to have forgotten the whole point of a console.

Consoles with swappable/upgradeable parts would be a car crash of mega proportions. It would not be win/win.

A big selling point of consoles is that parents and granny only need to know what type of console it is to buy games for it, not whether they need the version for this or that hardware revision or upgrade.

Plus the concentrated optimisation goes out the window, and development costs and time increase.

What you really want...is a PC. Other people (the other 90%) just want a console.

There is a good reason why consoles are so popular and the PC section in the games stores is now less than one rack in the corner near the staff toilet.
 
The Trinity APU is still going to be too slow. It's supposed to be 50% stronger than the Llano APU, which sounds like a lot until you realize the Llano APU is only about half as powerful as an Nvidia 9600 GT (or an HD 3870, if you're an AMD fan). With a 50% increase it's still slightly slower than an Nvidia GT 240 or an AMD HD 5570. To put that in perspective, look at what a 5570 does in Crysis. Do you really think next-gen console graphics would be this weak? (Nintendo aside.)

[attached benchmark chart: HD 5570 performance in Crysis]

I'm willing to bet that with much faster memory you could get some decent performance out of it.
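The comparison a couple of posts up boils down to simple arithmetic. A quick sketch, where every ratio is the poster's rough claim (and the GT 240's placement on this index is my own hypothetical number, not a benchmark):

```python
# Relative-performance index based on the post's rough claims (9600 GT = 1.0).
# None of these ratios are measured data; they restate the post's assertions.
GT_9600 = 1.0
llano = 0.5 * GT_9600    # claim: Llano iGPU is ~half a 9600 GT / HD 3870
trinity = 1.5 * llano    # claim: Trinity is ~50% faster than Llano
gt_240 = 0.8 * GT_9600   # hypothetical placement of a GT 240 on this index

print(trinity)           # 0.75: still short of the 9600 GT itself
print(trinity < gt_240)  # on these assumed numbers, still below a GT 240
```

The point survives any reasonable choice of index: a 50% bump on half the baseline still lands below the baseline.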
 
I have no doubt whatsoever that MS is going with another IBM CPU, for backwards compatibility, plus IBM can design it to Microsoft's specification. There's no chance at all the next Xbox CPU is anything but IBM.

A modern x86 core can pretty easily emulate the Xbox 360's PowerPC cores these days.
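To illustrate what CPU emulation means at its simplest, here is a toy sketch of an interpreter: made-up PowerPC-style register ops decoded and executed on the host. Everything here is hypothetical and illustrative; real 360 emulators use dynamic recompilation, not a loop like this:

```python
# Toy interpretive emulator: each guest "instruction" is a tuple that the
# host decodes and executes against a 32-entry register file.
def step(regs, instr):
    """Execute one (op, dst, a, b) tuple; results wrap to 32 bits."""
    op, dst, a, b = instr
    if op == "addi":    # add immediate: r[dst] = r[a] + b
        regs[dst] = (regs[a] + b) & 0xFFFFFFFF
    elif op == "add":   # register add: r[dst] = r[a] + r[b]
        regs[dst] = (regs[a] + regs[b]) & 0xFFFFFFFF
    else:
        raise ValueError(f"unimplemented op: {op}")
    return regs

regs = {i: 0 for i in range(32)}   # r0..r31, all zeroed
step(regs, ("addi", 1, 0, 5))      # r1 = r0 + 5
step(regs, ("addi", 2, 0, 7))      # r2 = r0 + 7
step(regs, ("add", 3, 1, 2))       # r3 = r1 + r2
print(regs[3])                     # 12
```

The real cost is not the arithmetic but faithfully reproducing the in-order cores' timing, memory model, and VMX units, which is why a fast x86 chip with plenty of headroom makes it feasible.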
 
Well, another thing we have to think about is scheduling. First they have to pick the parts, then create the SDKs for them, then get those kits to developers so that the consoles have games at launch. When you consider how long this cycle takes, you'll realize that the tech in the consoles will not be related to anything modern on the PC around its release.
Example: a console scheduled for release in 2013 will more than likely have had its tech chosen back in 2010-2011. During 2011-2012 the hardware and the SDKs are developed. When the SDKs are done, they get sent out to the devs so games can be made within the next year or so.

Pretty much, if you think about this process, you can be fairly sure that the main tech will be based on 2010-2011 designs, maybe even 2009 if you consider AMD's 5xxx series.
 
Well, another thing we have to think about is scheduling. First they have to pick the parts, then create the SDKs for them, then get those kits to developers so that the consoles have games at launch. When you consider how long this cycle takes, you'll realize that the tech in the consoles will not be related to anything modern on the PC around its release.
Example: a console scheduled for release in 2013 will more than likely have had its tech chosen back in 2010-2011. During 2011-2012 the hardware and the SDKs are developed. When the SDKs are done, they get sent out to the devs so games can be made within the next year or so.

Pretty much, if you think about this process, you can be fairly sure that the main tech will be based on 2010-2011 designs, maybe even 2009 if you consider AMD's 5xxx series.

Yet every other console generation we have seen shows that's not true.
 
Well, the consoles back then didn't have as long a development cycle. The PS2 was announced in 2000 and released the same year, but only because all its parts were made in-house. The Xbox and GameCube, though, were released almost 1.5 years after their announcements, for the very reasons I mentioned.

The X360 didn't take too long to come out, but it was supposedly in development since early 2004 and released in November 2005, a timetable of maybe 12-18 months. Its SDK was just a modified version of the DirectX API, but its hardware probably took eight months or so to develop. The PS3 was also announced at the 2004 E3, and since all of its parts were made from scratch, Cell was the most time-consuming thing to develop, with early models melting from heat issues.

Pretty much, you get the picture. What I'm trying to say is that they can't wait until the last minute to pick out the hardware, get the console built, develop its SDK, and get the games made. It takes time and a lot of planning ahead. The reason I put such long timetables in my first post is that as time goes by the tech takes slightly longer to develop, so console development time also increases. The PS4, X720, and Wii U may have been announced this year, but how long before that announcement were they in development? Do you really believe them when they say they have no plans for a next-gen console, when the only reason to say that would be to keep the competition from jumping on the next-gen wagon as well?
 
Never worked in project management, have you?

It's my job, actually, but that's beside the point.

The Xbox had the NV2A, a programmable-shader GPU, when programmable shaders had only just arrived on desktops. The Xbox 360 had Xenos, the first unified-shader GPU, before desktops got one. Now, neither was as fully featured or powerful as its desktop counterparts ended up being, but they were still highly advanced and powerful. I don't understand why people expect Microsoft to suddenly use old tech; if anything, I think we will see Graphics Core Next in the 720.
 
Well, here is another major difference between the older generations and now: the old generations pretty much used parts created from scratch. As stated, the PS2 was all developed in-house by Sony. But I suppose if you want to be technical about it, since we are now using hardware based on current PC tech, you could probably shorten development time by a great margin. All you would need to do is pick the generation of tech you want and simply modify it for console purposes, an obviously much faster method than making it from scratch.

As for the CPU: with today's top-of-the-line CPUs, can you really go wrong with any of them, aside from the ones that are terrible at energy efficiency? Thank god we didn't get a Pentium put in the X360 or PS3. So yeah, as an amendment to my previous post, it really depends on which development route they take. They can recycle current PC tech or create it from scratch; one will obviously take longer than the other.
 
Well, if you think about the PS3 and 360, both consoles used a GPU that was relatively new at that time.

The 360 with its X1900/X1950-based GPU and the PS3 with its GeForce 7800-based GPU. This, I would assume, added to the steep cost of both systems.

Let's take a look at the 360 first. Work on the 360 began in 2003, a good two years before its launch on November 22, 2005. The first R520 (X1000-series) GPU was released in October 2005, going by Wikipedia's charts.

Software planning for what eventually became the Xbox Live software and Dashboard started in February 2003. By August 2003, ATI had signed on to develop the GPU for the new console. If we assume GPUs usually begin design stages about one to two years before launch, I would guess work on the R520 series started in late 2002 or early 2003.

Now, fast forward to 2011 and rumors of a new console start to surface. Wii U has already shown its cards: a Radeon 4000-series-based GPU. That GPU was first released in 2008 and most likely first designed around 2007. However, unlike the 360/PS3, which used off-the-shelf GPUs that had only just been released, Nintendo seems to have taken a wait-and-see approach, probably hoping the price of the GPU would drop.

The 4600-series can be had for less than $90 to $100 depending on where you shop, and the 4700-series for around the same. Also, heat and power issues come into play. You have to admit the first 360 and PS3 consoles were rather big and hot. They weren't optimized further until a few years after launch, the 360 most especially because of its overheating issues, with the PS3 not suffering as much.

So, the question to ask is: would you rather see the 360/PS3 successors use the same off-the-shelf GPUs of the current generation, specifically the Radeon 6000-series or Nvidia 500-series? Or would you want something less prone to overheating and more power efficient?

I believe Nintendo took that approach and waited for the most affordable and power-efficient GPU to use in the Wii U. My next belief is that both Microsoft and Sony took a wait-and-see approach to what Nintendo was planning to release before deciding on their next consoles. Also, remember that Microsoft stated about a year ago that the Xbox 360 is more than sufficient for what it wants to do with the console, and that it can still expand the 360's features through new media streaming services, DLC, and the Kinect. As for Sony, they too believed the PS3 was more than enough and didn't see a need to work on a new console.

I'm sure the Wii U changed their minds, seeing that its GPU is going to be more powerful than either console if going by the fillrate and shader counts of the R700.
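A rough sketch of that kind of fillrate/shader comparison, using commonly cited but unofficial figures for Xenos and a mid-range R700 part (HD 4650); treat every number here as an approximation:

```python
# Back-of-the-envelope peak-throughput comparison. All specs are commonly
# cited approximations, not official figures.
def gflops(alus, flops_per_alu_per_clock, ghz):
    """Peak programmable-shader throughput in GFLOPS."""
    return alus * flops_per_alu_per_clock * ghz

def fillrate_gpixels(rops, ghz):
    """Peak pixel fillrate in Gpixels/s."""
    return rops * ghz

# Xenos (360): 48 vec4+scalar units, dual-issue MADD -> ~10 FLOPs/clock each
xenos = gflops(48, 10, 0.5)
# HD 4650 (R700 family): 320 stream processors, MADD -> 2 FLOPs/clock each
hd4650 = gflops(320, 2, 0.6)

print(round(xenos), round(hd4650))                    # ~240 vs ~384 GFLOPS
print(fillrate_gpixels(8, 0.5), fillrate_gpixels(8, 0.6))  # both use 8 ROPs
```

On paper even a budget R700 clears Xenos on shader throughput, which is the basis for the "more powerful than either console" claim; real-game results depend heavily on memory bandwidth and the rest of the system.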

Using the 360's schedule as an example, I would bet that work on the next consoles from both Sony and MS probably started around 2011, before or after Nintendo's announcement of the Wii U, or around late 2010 on a wait-and-see approach.

If the 360 began its planning stages about two years before launch, that would put the successor at a 2013 launch. An announcement will probably not come until mid-to-late 2012 at the earliest. The same should apply to Sony's next console.

Given all of that, both Sony and Microsoft will use one of the following if they do indeed go with an AMD GPU:
  • Radeon 5000-series.

    GPU prices of the "Evergreen" series should be relatively affordable for use in either of the new consoles. Power and heat issues will have been worked out, and it will hopefully give the console a cheaper cost of entry than the 360 or PS3 had. It also gives both Microsoft and Sony enough time to begin software development as well as optimizing and modifying the GPU for console use.

    Best bet: at least the Radeon 5700-series. That GPU is roughly equivalent to a 4870 in performance. If the Wii U is indeed using a 4600- or 4700-series, that would make the 360's and PS3's successors more powerful.

    ____
  • Radeon 6000-series.

    Again, prices should still be relatively affordable, and power and heat output will be better than the 5000-series. Now, if Microsoft wants to continue backwards compatibility with the 360 and not repeat the original Xbox's emulation issues, I'm sure they will use an IBM PowerPC CPU again. It's too risky to alienate current 360 users with a half-assed, poorly performing system for 360 games and software.

    Therefore, if Microsoft follows the same steps as with the 360 S, they would use a combined CPU+GPU on the same die, like the XCGPU in the 360 S. That chip is manufactured by GlobalFoundries, which is partly owned by AMD. This again lends weight to the rumor that they may use an APU from AMD. However, it wouldn't surprise me if they integrated a PPC core and a Radeon 6600-, 6700-, or even 6800-series GPU on the same die. This would reduce the cost of the console and make it more power efficient.

    Another fact that lends credence to the rumors is UVD 3 and HDMI 1.4a. UVD 3 allows two simultaneous 1080p streams, and HDMI 1.4a allows 3D video over HDMI. The 6770 is still a 5700-series part but with UVD 3 and HDMI 1.4a compatibility added. Both the 6600- and 6800-series make even more sense, since they're based on a reworked version of the 5000-series architecture: faster and more power efficient.

    Sony may also take an integrated approach. The last rumor I read had the next CPU at 10 or 12 cores. As for the GPU, I haven't read much beyond it possibly coming from AMD and no longer from Nvidia.

    Best bet: a Radeon 6700- or 6800-series on-die, similar to AMD's APUs with their 6600-series GPU on the same die as the CPU. If they went with the 6770, it'd be nothing more than a reworked 5700-series with UVD 3 and HDMI 1.4a. If both Sony and Microsoft went with the 6800-series, that would crush the Wii U by a wide margin in graphics horsepower.

    _____
  • Radeon 7000-series.

    The problem is that this series hasn't been released yet. Being a brand-new GPU, it would also put the next consoles from Sony and Microsoft in the same position they were in back in 2005: expensive and high-priced. Also, Graphics Core Next is only available in the 7900-series; VLIW4, if going by rumors around the internet, is only on the 7800-series and lower. So the chances of this being in the next consoles are highly unlikely, especially since it could significantly increase the cost of the console at launch.

    Also, if they plan on showing any hardware at the next E3, GDC, or another convention, the 7000-series would not be it. It's too new, but it would give both Sony and Microsoft a significant lead over Nintendo, and would probably (or most definitely) eliminate Nintendo as competition on graphics horsepower alone. (Of course, the same would apply to the 5000- or 6000-series as well.)

    Best bet: the unreleased Radeon 7500/7600 ("Lombok") series using VLIW4 instead of the VLIW5 found in the 4000, 5000, and 6700 series. Built on a smaller 28 nm die, it would be more power efficient than the 6000-series. That would make both Sony's and Microsoft's next systems more powerful than the Wii U and again direct competitors to each other in high-definition gaming and 3D video.
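The power-efficiency argument for 28 nm comes down to area scaling: to first order, die area shrinks with the square of the feature size. A quick sketch of the ideal case (real chips never scale this cleanly, since pads, analog blocks, and routing don't shrink linearly):

```python
# Ideal (first-order) die-area scaling between process nodes: area scales
# with the square of the linear feature size. This is an upper bound on
# the shrink; real designs achieve less.
def ideal_area_ratio(old_nm, new_nm):
    return (new_nm / old_nm) ** 2

# Moving a design from TSMC's 40 nm node to 28 nm:
print(round(ideal_area_ratio(40, 28), 2))  # ~0.49, i.e. roughly half the area
```

Roughly half the area for the same logic means the same performance at a lower cost and power budget, or more shaders in the same budget, which is exactly the trade-off a console vendor cares about.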

I honestly don't see "Bulldozer" being used, given that the CPU is x86-based while both the 360 and PS3 CPUs are PPC-based; emulating them would be detrimental to performance if there is any intention of providing backwards compatibility. AMD's APU architecture will most likely be used as the basis for the next CPU-GPU in either of the new consoles. This should help with software development costs (i.e., no software emulation to develop), backwards compatibility, and power efficiency. That way the transition from one console to the next will not be steep; it'd be like upgrading to a newer, more powerful computer while keeping the same familiar interface and software.

Lastly, keep in mind this gives Microsoft enough time to modify the DirectX 11 API for the next console, as they did with DirectX 9 for the 360. Sony will possibly use all of the features of OpenGL 4.1, or create an API based on it, which to my knowledge has about the same feature set as DX11. The Radeon 5000, 6000, and 7000 series are all DX11 and OpenGL 4.1 compatible. You can imagine what might be possible in the next console games about two years from now (at the earliest), compared to what the ATI X1900 and GeForce 7800 have put out over the last five years in their respective consoles.

Of course, all of this is assumption and theory.
 

The 360 does NOT use an X1900-derived GPU. Xenos borrows much of the X1900 design but also uses unified shaders like those in the R600, making it a good bit more advanced than a standard R520-derived core (in fact, the unified shaders make it much more like an R600).
 
The 360 does NOT use an X1900-derived GPU. Xenos borrows much of the X1900 design but also uses unified shaders like those in the R600, making it a good bit more advanced than a standard R520-derived core (in fact, the unified shaders make it much more like an R600).

That's true. A lot of what went into the 360's GPU design was later reused in the R600.
 
Seems my estimated timetable for the launch of the successors to the 360 and PS3 was correct: 2013.

"Xbox Next"
http://www.develop-online.net/news/38917/Sources-point-to-2013-launch-for-Xbox-Next

Playstation 4
http://www.develop-online.net/news/38916/Sony-studios-begin-PlayStation-4-projects

Is this a hint that the 360's successor is using Graphics Core Next? o_O If true, that'll completely change the assumptions I made above. It also gives enough time for that GPU to drop in price and get optimized for console use.
 
well here is another major difference from older generations to now and that being the old generations pretty much used parts that were created from scratch as stated the PS2 was all Sony in house developed.
The Xbox used the NV2A and was from the PS2's generation; that's quite a while ago now. The N64's GPU was licensed from and designed by the SGI guys.
 