October Nexus Prime rumors

This is how I understood it as well: the SGX543MP2 in the 4S is two SGX543 cores in a dual-core configuration.

Also, the SGX543MP2 in the 4S is only clocked at 200MHz.

This brings up the age-old question: would you rather have a single core at 400MHz, or two cores at 200MHz? While it's actually 384 instead of 400, I believe the same logic applies.

I would choose the 400MHz, assuming the architecture scales with clock speed.

Well, what we'd truly need to know is the amount of power it takes to run an SGX543MP2 at 200 MHz vs an SGX540 at 384 MHz.

What I don't know is how the SGX543MP2 is able to perform about twice as well as the SGX540 at half the clock speed and subject to the same bandwidth constraints. It may be that the SGX543MP2 is able to process a tile (remember that the SGX5xx are TBDRs, not IMRs like Qualcomm and NVIDIA's GPUs) on each GPU core simultaneously while accessing separate memory channels to alleviate bandwidth issues.

But this is just some guesswork on my part.
 
What I don't know is how the SGX543MP2 is able to perform about twice as well as the SGX540 at half the clock speed and subject to the same bandwidth constraints.
My guess? The drivers. Apple has one GPU that they can invest in with all their developers and optimize the crap out of the driver. I would think that the software framebuffer in Android might be a hindrance as well. You'll just have to wait and see what ICS can do for these.

*edit* According to that article, I guess that handling 4 times the number of instructions at the same clock can do wonders, too. :)
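
Here's a rough back-of-envelope of what that implies, as a quick Python sketch. Treating peak shader throughput as cores x per-clock work x clock is an idealization (it ignores bandwidth and drivers entirely), and the 4x-per-clock figure is the one from that article:

[CODE]
# Rough back-of-envelope: peak shader throughput ~ cores x per-clock work x clock.
# The "4x the instructions per clock" figure for SGX543MP2 vs a single SGX540 comes
# from the article above; everything else here is an idealized, illustrative assumption.

def relative_throughput(cores, per_clock, clock_mhz):
    """Idealized peak throughput in arbitrary units (ignores bandwidth, drivers, etc.)."""
    return cores * per_clock * clock_mhz

sgx540_galaxy_nexus = relative_throughput(cores=1, per_clock=1.0, clock_mhz=384)
sgx543mp2_iphone_4s = relative_throughput(cores=2, per_clock=2.0, clock_mhz=200)

print(sgx540_galaxy_nexus)                        # 384
print(sgx543mp2_iphone_4s)                        # 800
print(sgx543mp2_iphone_4s / sgx540_galaxy_nexus)  # ~2.1x, roughly the ~2x gap we see
[/CODE]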
 
I think the main thing everyone is upset about is the fact that a so-called flagship phone is about to be released with an 18-month-old GPU. I don't care if it's been upclocked or not. It's an old architecture.

Personally I've decided to pass on this one. ICS via CM9 will hold me over on my Evo until the next gen SoCs come to market next year.
 
I think the main thing everyone is upset about is the fact that a so-called flagship phone is about to be released with an 18-month-old GPU. I don't care if it's been upclocked or not. It's an old architecture.

Personally I've decided to pass on this one. ICS via CM9 will hold me over on my Evo until the next gen SoCs come to market next year.

You're still not getting it. It's a GPU that was ahead of its time, more powerful than the memory bandwidth of those phones could handle, and only now is it being used to its potential. And it uses a TBDR instead of an IMR like the other mobile GPUs, a rendering approach that is actually better suited to mobile devices.

Give this a read if you think that the SGX540 is an "old architecture": http://www.anandtech.com/show/4686/samsung-galaxy-s-2-international-review-the-best-redefined/15

The fact is that Imagination Technologies has been in this game for years; Qualcomm and NVIDIA are playing catch-up. NVIDIA needs time to refine its ULP GeForce, and Qualcomm's Adreno is largely unchanged from the Imageon it was back in 2008 when Qualcomm bought ATI's mobile division of the same name. And you're worried about 18 months! Fortunately, we'll see some of the fruits of their labors in the Krait and Kal-El cores coming within the next 6 months or so.

Personally, I'm probably going to do the same thing though... if I see CM9 ported to my Epic 4G by this holiday season, I'll hold out for the next Google phone with a Cortex-A15. I'm not unhappy with my hardware at this point, I just wish I wasn't beholden to Samsung and Sprint for updates.
 
People talking about how outdated and slow the SGX540 GPU is don't really know what they're talking about.

The SGX540 is a very powerful GPU that has been bottlenecked by SoCs with crappy slow LPDDR1 single-channel memory for the 18 months it has been in production. Recently it's been added to Cortex-A9 SoCs that use dual-channel LPDDR2 with dual memory controllers.

Smartphone GPUs are limited primarily by the phone's memory bandwidth, and generally run at a fraction of the clock speed they are capable of because of those limitations. Remove those limitations and you can run the GPU at the clock speeds it is truly capable of.

SGX540 in Hummingbird (Galaxy S): clocked at 200 MHz and tied to single-channel LPDDR1 memory
SGX540 in OMAP 4430 (Droid 3, Bionic): clocked at 304 MHz and tied to dual-channel LPDDR2 memory
SGX540 in OMAP 4460 (Galaxy Nexus): clocked at 384 MHz and tied to dual-channel LPDDR2 memory

Result: Faster than Qualcomm's Adreno 220 and NVIDIA's Tegra 2 ULP GeForce. Slower than SGX543MP2 in iPhone 4S and Mali-400 MP4 in Exynos 4210 in Galaxy S II.
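
To put rough numbers on what those clock bumps alone are worth, here's a quick Python sketch that just scales the SGX540's rated peak triangle throughput linearly with clock (the 28 Mtps @ 200 MHz rating is the figure quoted later in this thread; linear scaling is an idealized ceiling and ignores the memory bandwidth limits I'm describing):

[CODE]
# Idealized sketch: scale the SGX540's rated peak triangle throughput linearly with
# GPU clock. 28 Mtps @ 200 MHz is the rated figure quoted later in this thread;
# real-world numbers are far lower because of the memory bandwidth limits above.

RATED_MTPS = 28.0    # rated peak, million triangles per second
RATED_CLOCK_MHZ = 200.0

configs = {
    "Hummingbird (Galaxy S)":      200,
    "OMAP 4430 (Droid 3, Bionic)": 304,
    "OMAP 4460 (Galaxy Nexus)":    384,
}

for soc, clock_mhz in configs.items():
    peak = RATED_MTPS * clock_mhz / RATED_CLOCK_MHZ
    print(f"{soc}: {clock_mhz} MHz -> ~{peak:.0f} Mtps theoretical peak")
# Hummingbird ~28, OMAP 4430 ~43, OMAP 4460 ~54 Mtps -- ceilings, not benchmark results.
[/CODE]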

Is this based only on the GPU architecture / clock rates? I'm more interested in a real-life benchmark to see how it actually performs. I've seen Apple products perform better with mediocre hardware compared to Android phones with newer/better specs.

With the weak GPU paired with the 720p HD screen, I'd rather have a lower-resolution SGS2. So what do you think, guys? SGS2, or wait for the Galaxy Nexus (on AT&T)?
 
Is this based only on the GPU architecture / clock rates? I'm more interested in a real-life benchmark to see how it actually performs. I've seen Apple products perform better with mediocre hardware compared to Android phones with newer/better specs.

With the weak GPU paired with the 720p HD screen, I'd rather have a lower-resolution SGS2. So what do you think, guys? SGS2, or wait for the Galaxy Nexus (on AT&T)?

I give up with you guys. Just wait until the freakin' benchmarks come out. You're not going to see a Qualcomm or NVIDIA GPU that'll perform better than the Galaxy Nexus until Kal-El later this holiday season or Krait some time next year.

If GPU benchmark scores are that important to you, go get an SGS2. Real-world, you'll see virtually no difference, unless we see some games with very high poly counts hit Android; then you'll actually see the SGX540 in the Galaxy Nexus start beating the Mali-400 MP4 in the SGS2.
 
I give up with you guys. Just wait until the freakin' benchmarks come out. You're not going to see a Qualcomm or NVIDIA GPU that'll perform better than the Galaxy Nexus until Kal-El later this holiday season or Krait some time next year.

If GPU benchmark scores are that important to you, go get an SGS2. Real-world, you'll see virtually no difference, unless we see some games with very high poly counts hit Android; then you'll actually see the SGX540 in the Galaxy Nexus start beating the Mali-400 MP4 in the SGS2.

In theory, a high poly count would benefit the Adreno architecture even more. Its older Adreno 205 is already kicking the SGX540 in any vertex-related benchmark, while losing the pixel-shading benches by a little less than 2:1. Now the Adreno 220 is about twice as fast... and the Adrenos were already designed and benched on a single-channel memory architecture :p (the SoC is supposed to support dual-channel, but the second memory IC would be off-package, so no one, supposedly not even Qualcomm's MDP, actually implemented it. The Kraits support PoP dual-channel, however, so everything moving forward should be dual-channel).

Of course, that was against the old SGX540 which runs at 80% of the "new" SGX540's clocks, so we'll see if the performance really matches up sooner or later.
 
In theory, a high poly count would benefit the Adreno architecture even more. Its older Adreno 205 is already kicking the SGX540 in any vertex-related benchmark, while losing the pixel-shading benches by a little less than 2:1. Now the Adreno 220 is about twice as fast... and the Adrenos were already designed and benched on a single-channel memory architecture :p (the SoC is supposed to support dual-channel, but the second memory IC would be off-package, so no one, supposedly not even Qualcomm's MDP, actually implemented it. The Kraits support PoP dual-channel, however, so everything moving forward should be dual-channel).

Of course, that was against the old SGX540 which runs at 80% of the "new" SGX540's clocks, so we'll see if the performance really matches up sooner or later.

Well, from what I've seen in AnandTech's Galaxy S II review, the SGX540 at 300 MHz is pretty handily beating the Adreno 220 in triangle throughput, even when the Adreno 220 is running on a higher-res qHD display, while the SGX540 at 200 MHz does pretty miserably.

So it's obvious that the vertex performance in the SGX540 scales very well with clock speed and memory bandwidth improvements. That said, Adreno 220 also only runs at 266 MHz, and the Adreno 225 is supposed to be on par with the SGX543MP2 in performance at 400 MHz, which is a bold claim.

First-gen Krait SoCs (some time early next year) will have Adreno 225 on board, and if it holds true to its claims, it'll be a powerful GPU, even if it isn't really architecturally different than the current Adreno 220. Subsequent Kraits will be Adreno 3xx though, the first major architectural change to the GPU since Qualcomm bought the Adreno design in 2008. That's a GPU I'll be very interested in seeing!
 
^^ Ah, I see, thank you! :)

I never realized the SGX540 was clocked as low as 200 MHz before; I had assumed they all ran at 300/384 MHz. :eek:
 
^^ Ah, I see, thank you! :)

I never realized the SGX540 was clocked as low as 200 MHz before; I had assumed they all ran at 300/384 MHz. :eek:

Yep, my Epic 4G (Galaxy S) SGX540 is clocked at 200 MHz. I'm trying to figure out where the 90 Mtps that Samsung promised me are hiding... the Nexus S (which also runs an SGX540 at 200 MHz) in that test I linked doesn't even manage 9.

I wrote a review about a year and a half ago where I tried to work out how it would be possible to get that kind of throughput over single-channel LPDDR1 at 200 MHz. I couldn't figure it out, and came up with all kinds of theories as to how it was being done. I suppose if I'd known more about TBDRs then, I would have understood. (Tile-based deferred renderers like the SGX540 use a lot less memory bandwidth than traditional IMRs, or even IMRs that implement scene tiling or early-Z like the other mobile GPUs.)

That said, I'm still not seeing anywhere near Samsung's claimed 90 Mtps. The SGX540 is rated at 28 Mtps at 200 MHz, which we're not seeing real-world either. :confused:
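
For what it's worth, here's the kind of napkin math I was trying to do back then, as a quick Python sketch. The bytes-per-triangle and the LPDDR1 bandwidth figure are illustrative assumptions on my part, not datasheet values:

[CODE]
# Sanity check on the 90 Mtps marketing figure: would that much vertex traffic even fit
# through a single-channel LPDDR1 bus? Both constants below are rough assumptions
# for illustration, not datasheet values.

BYTES_PER_TRIANGLE = 36       # assume ~3 vertices x 12 bytes of position/attribute data
LPDDR1_BANDWIDTH_GBPS = 1.6   # assume a 32-bit LPDDR1 interface at ~400 MT/s effective

claimed_tps = 90e6
required_gbps = claimed_tps * BYTES_PER_TRIANGLE / 1e9

print(f"Vertex traffic needed: ~{required_gbps:.1f} GB/s")  # ~3.2 GB/s
print(f"Assumed bus budget:     {LPDDR1_BANDWIDTH_GBPS} GB/s")

# The claimed rate needs roughly double the assumed bus, before textures and the
# framebuffer even enter the picture -- so the rated peak was never going to show up
# real-world on that memory subsystem, TBDR or not.
[/CODE]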
 
Software? I know the Adrenos are somewhat related to AMD's VLIW5 arch, so software could definitely change things up there (also CPU speed).

EDIT: that, and the SGXes are very powerful tile-based renderers (a good read-up on that: Microsoft's Talisman project).
 
Software? I know the Adrenos are somewhat related to AMD's VLIW5 arch, so software could definitely change things up there (also CPU speed).

EDIT: that, and the SGXes are very powerful tile-based renderers (a good read-up on that: Microsoft's Talisman project).

I'll check it out, thanks. I actually added the part about TBDRs to my post before I noticed you had replied. I have a bad habit of going back and adding info to my post about 20 times.

Also, I've seen your name in comments on AnandTech articles; it's good to talk with someone who can contribute to intelligent discussion about ARM hardware.
 
I am interested to see how the IVA3 is used. It's supposedly a native hardware decoder for H.264, MPEG-4, and MPEG-2 video plus some popular audio codecs. The interesting bit is that it also has a programmable DSP to keep it compatible with future codecs. I couldn't find any information on how you program that DSP, though. I would love to see some video players take advantage of that hardware decoder to get some nice, power-efficient video playback.
 
The screen of the Nexus Prime sucks. They "cheated" the resolution by using a PenTile subpixel layout.
 
The screen of the Nexus Prime sucks. They "cheated" the resolution by using a PenTile subpixel layout.

Plenty of Samsung's AMOLED screens use PenTile; only the Plus ones do not. I don't see how it's that big of an issue, though.
 
Plenty of Samsung's AMOLED screens use PenTile; only the Plus ones do not. I don't see how it's that big of an issue, though.

With subpixel rendering, I could technically count my 2560x1440 desktop display as a 7680x1440 display :D

It's basically bullshitting the specs, IMO. Even if it's using the newer RGBW variant of PenTile (which I seriously doubt Samsung is), it's still not much more than a subpixel-rendered display that counts subpixel pairs (which contain only two colors each) rather than proper RGB triplets as a "pixel." In the older PenTile layout, it really showed up in vertical GUI elements and in text. The current form only really shows up in motion elements, where PenTile's subpixel rendering hasn't figured out a way to "fix" that yet (though it should, eventually).

In easier terms, 320PPI Pentile < 320PPI RGB Stripe.

And before certain people get fussy over Nokia's "AMOLED" not having a "Plus" at the end: only Samsung, one of the two major AMOLED suppliers (the other is Chi Mei EL), uses that naming convention. Nokia doesn't source its panels from Samsung.
 
With subpixel rendering, I could technically count my 2560x1440 desktop display as a 7680x1440 display :D

It's basically bullshitting the specs, IMO. Even if it's using the newer RGBW variant of PenTile (which I seriously doubt Samsung is), it's still not much more than a subpixel-rendered display that counts subpixel pairs (which contain only two colors each) rather than proper RGB triplets as a "pixel." In the older PenTile layout, it really showed up in vertical GUI elements and in text. The current form only really shows up in motion elements, where PenTile's subpixel rendering hasn't figured out a way to "fix" that yet (though it should, eventually).

In easier terms, 320PPI Pentile < 320PPI RGB Stripe.

And before certain people get fussy over Nokia's "AMOLED" not having a "Plus" at the end: only Samsung, one of the two major AMOLED suppliers (the other is Chi Mei EL), uses that naming convention. Nokia doesn't source its panels from Samsung.

It is still a denser array of subpixels than the GS2, which is a true RGB stripe.
 
It is still a denser array of subpixels than the GS2, which is a true RGB stripe.

GN= 571 subPPI 1280x720 4.65"

GSII = 681 subPPI 960x540 4.3"


Just saying. Remember, PenTile counts each subpixel pair as a "pixel," while it takes 3 subpixels to make a pixel on an RGB stripe.
 
Friends of mine who got to use the phone in HK were very disappointed in the screen. Just saying.
 
I don't think I can get this question answered pre-release, but...

I currently have a HTC Incredible. I have a handful of disappointments, all are with the hardware, none are with the CM7.1 it's running. :)

How bright is this screen going to be? Visible in daylight no problem? I'm looking for something equal to my ASUS Transformer or better.
 
GN= 571 subPPI 1280x720 4.65"

GSII = 681 subPPI 960x540 4.3"


Just saying. Remember, PenTile counts each subpixel pair as a "pixel," while it takes 3 subpixels to make a pixel on an RGB stripe.

GSII is 800x480, just sayin....

4.3" GSII = 216 PPI x 3 sub pixels per pixel = 648 subPPI
4.5" GSII = 207 PPI x 3 sub pixels per pixel = 622 subPPI
4.65" GN = 316 PPI x 2 sub pixels per pixel = 631 subPPI

Remember, math matters...
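
If anyone wants to check the numbers themselves, here's the arithmetic as a quick Python sketch: linear PPI from the resolution and diagonal, then multiplied by subpixels per "pixel" (3 for an RGB stripe, 2 for PenTile). The small differences from the figures above just come from where you round:

[CODE]
# Subpixel-density check: linear PPI from resolution and diagonal, times subpixels
# per "pixel" (3 for RGB stripe, 2 for PenTile).
import math

def ppi(width_px, height_px, diagonal_in):
    return math.hypot(width_px, height_px) / diagonal_in

displays = [
    # name,                       w,    h,   diag, subpixels per pixel
    ("4.3\" GSII (RGB stripe)",   800,  480, 4.30, 3),
    ("4.5\" GSII (RGB stripe)",   800,  480, 4.50, 3),
    ("4.65\" GN (PenTile)",      1280,  720, 4.65, 2),
]

for name, w, h, diag, spp in displays:
    p = ppi(w, h, diag)
    print(f"{name}: {p:.1f} PPI -> ~{p * spp:.0f} subPPI")
# ~216.9 -> ~651, ~207.3 -> ~622, ~315.8 -> ~632 (vs 648/622/631 above, depending on
# whether you round the PPI before multiplying).
[/CODE]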
 
GSII is 800x480, just sayin....

4.3" GSII = 216 PPI x 3 sub pixels per pixel = 648 subPPI
4.5" GSII = 207 PPI x 3 sub pixels per pixel = 622 subPPI
4.65" GN = 316 PPI x 2 sub pixels per pixel = 631 subPPI

Remember, math matters...

Oops, I forgot the GSII was 800x480 :p

The GSII is still higher PPI, even at the subpixel level, than the GN. Well, unless you want to count the fat American GSII :D (in which case, the HTC Titan is not that far off :eek:)

Besides...

4.3" 800x480 is 216.9 PPI while 4.65 1280x720 is 315.8 PPI. Yet they are rounded differently (probably wiki specs?)
 
Honestly, as soon as the Galaxy Nexus shows up on Sprint, I'm going to walk into a Sprint store, ask them to hand me an Epic 4G Touch and a Galaxy Nexus, and let my eyes do the deciding.

I'm guessing that PenTile won't mean much at such high resolution, screen size aside. Hell, I hardly notice PenTile on my Galaxy S (Epic 4G).
 
Previous message edited to be WAY less insulting, imma look like a conspiracy theorist before long :eek: thinkin' everyone is out to git me an stuff...


EDIT: and I am the unlucky owner of an Atrix and a Sammy Focus. Both types of Pentile, both at 4" :p
 
Honestly, as soon as the Galaxy Nexus shows up on Sprint, I'm going to walk into a Sprint store, ask them to hand me an Epic 4G Touch and a Galaxy Nexus, and let my eyes do the deciding.

I'm guessing that PenTile won't mean much at such high resolution, screen size aside. Hell, I hardly notice PenTile on my Galaxy S (Epic 4G).

My Incredible is a PenTile as well, and I don't really notice it to be honest. My only complaint is the screen is not bright enough at full brightness.

With the kind of math I see being thrown around here, it's basically saying that my Inc with a 3.7" PenTile at 800x480 would be a higher PPI than the Galaxy Nexus, and therefore the Nexus' display would be inferior.

Maybe I'll have to see it to believe it, but I have a hard time thinking that this screen is going to be worse than what I have now.
 
My Incredible is a PenTile as well, and I don't really notice it to be honest. My only complaint is the screen is not bright enough at full brightness.

With the kind of math I see being thrown around here, it's basically saying that my Inc with a 3.7" PenTile at 800x480 would be a higher PPI than the Galaxy Nexus, and therefore the Nexus' display would be inferior.

Maybe I'll have to see it to believe it, but I have a hard time thinking that this screen is going to be worse than what I have now.

lol, many different kinds of Pentile float around, now o.0

Would the Nexus display be inferior? Of course not! However, it's not God's gift to mankind that some would like to believe it is, especially since it's PenTile. My main point in bringing that up was that a 4.3" RGB stripe display at 960x540 (and apparently 800x480) has a slightly higher subpixel PPI than a PenTile display (either the RGBG with the compressed greens or the RGBW with the square elements), so some of the sharpness of 1280x720 on a 4.65" display is lost because it's PenTile.
 
I think the main thing everyone is upset about is the fact that a so-called flagship phone is about to be released with an 18-month-old GPU. I don't care if it's been upclocked or not. It's an old architecture.

Personally I've decided to pass on this one. ICS via CM9 will hold me over on my Evo until the next gen SoCs come to market next year.
I'm starting to think the same thing, with the caveat of wanting a cheaper plan ASAP. Maybe a galSII. Not sure yet.
 
looks like I'll be sticking with my HTC Incredible till something worthwhile comes along
 
Hopefully I can snag one from my friends before launch and do a little comparison to the 4S and then make up my mind.
 
lol, many different kinds of Pentile float around, now o.0

Would the Nexus display be inferior? Of course not! However, it's not God's gift to mankind that some would like to believe it is, especially since it's PenTile. My main point in bringing that up was that a 4.3" RGB stripe display at 960x540 (and apparently 800x480) has a slightly higher subpixel PPI than a PenTile display (either the RGBG with the compressed greens or the RGBW with the square elements), so some of the sharpness of 1280x720 on a 4.65" display is lost because it's PenTile.

I agree. I think that a decent number of people on technology forums have unrealistic expectations of the next big thing and end up disappointed every time. There is no holy grail of phones because, in the end, they're just phones. No matter what phone you decide on, it is a compromise in some way, shape, or form.

Sure, everyone would like a phone with the build quality of the RAZR, a removable battery, a 5" edge-to-edge sAMOLED+ 720p screen, easy rooting like the Nexus, vanilla Android like the Nexus, and both a microHDMI port and a USB port with an adapter to plug a flash drive or USB controller into. It doesn't exist, and it's not going to exist.

I think the actual frustration over the PenTile screen isn't just that it's PenTile; it's that the technology is there to make a small sAMOLED 720p screen. The Galaxy Note is coming out soon. Edit: I thought the Note wasn't PenTile, my bad.
In that case, there's a good chance we're looking at a year until we have 720p phones without PenTile.

For me, it's really between this and whether I want to wait for the next Google phone, assuming it even makes it to Verizon. I won't be switching because I am completely satisfied with the network and I have a decent discount on service. I'm not really that interested in spending another year with my Inc.

Hopefully I can snag one from my friends before launch and do a little comparison to the 4S and then make up my mind.

I would be more open to the 4S if it had a bigger screen and the ability to easily back up/restore/sync my text messages and call log like I can with Android. iOS 5 is definitely a step in the right direction to me, but there is a slew of apps to handle such a basic feature in many different ways on Android, yet it seems like pulling teeth on iOS.
 
I'll probably end up getting a Galaxy Nexus. I'm not exactly thrilled with it, but I can't stand my OG Droid anymore and I don't want to hold out for who knows how much longer. It's true with all technology: there is always something better coming, but you can't wait forever.
 
I'll probably end up getting a Galaxy Nexus. I'm not exactly thrilled with it, but I can't stand my OG Droid anymore and I don't want to hold out for who knows how much longer. It's true with all technology: there is always something better coming, but you can't wait forever.
I think I'm going to go with it as well. I've had enough of problems with phones that manufacturers refuse to update. It will be nice to have a somewhat future-proof phone this time around. Hopefully T-Mobile gets it sooner rather than later.
 
Well, what we'd truly need to know is the amount of power it takes to run an SGX543MP2 at 200 MHz vs an SGX540 at 384 MHz.

What I don't know is how the SGX543MP2 is able to perform about twice as well as the SGX540 at half the clock speed and subject to the same bandwidth constraints. It may be that the SGX543MP2 is able to process a tile (remember that the SGX5xx are TBDRs, not IMRs like Qualcomm and NVIDIA's GPUs) on each GPU core simultaneously while accessing separate memory channels to alleviate bandwidth issues.

But this is just some guesswork on my part.

The Droid 3 review from Anand has a simple but useful GPU chart.

It shows that even a single SGX543 has more computational power than a single SGX540, which suggests that it's unlikely that the entire lead the 4S has is from software.

Even if we just assume an SGX540 running at 400MHz, we'd be left with 6.4 GFLOPS, so it's basically equivalent to a single SGX543 at 200MHz.

Taking a peek at the 4S review on Anand shows us numbers for GL Pro like:

SGX543MP2 - 122.7
SGX540 @ 300MHz - 42.6

So even if we throw out an estimate of what it would be at 400MHz, based on an increase of 1.6 GFLOPS, and bump that score up by 25%, we're at 53.25, which is not even half. Even if we assumed a best-case scenario for a performance improvement from ICS, we'd be looking at the low 60s.
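
Spelled out as a quick Python sketch (the two benchmark scores are the GLBenchmark numbers from the AnandTech reviews quoted above; the scaling factors are just my assumptions):

[CODE]
# Back-of-envelope behind the numbers above. The two benchmark scores are the quoted
# GLBenchmark results; the scaling factors are guesses, not measurements.

sgx540_300mhz_score = 42.6    # SGX540 @ 300 MHz (OMAP 4430)
sgx543mp2_score     = 122.7   # SGX543MP2 in the iPhone 4S

# GFLOPS side: an SGX540 at 400 MHz would be ~6.4 GFLOPS, i.e. about one SGX543 @ 200 MHz.
sgx540_gflops_200mhz = 3.2
print(sgx540_gflops_200mhz * (400 / 200))   # 6.4

# Benchmark side: scale the 300 MHz score up 25% for the clock bump, then allow
# another ~20% for a best-case ICS/driver improvement (pure guesswork).
clock_scaled = sgx540_300mhz_score * 1.25   # 53.25
optimistic   = clock_scaled * 1.20          # ~63.9, i.e. "low 60s"
print(clock_scaled, optimistic, sgx543mp2_score / optimistic)  # still roughly 2x behind
[/CODE]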

While I understand the [H] logic of benchmarking and real-world gameplay, this is such a huge disparity that all of that is irrelevant.

To be honest, I feel like the hardware manufacturers are not taking Android seriously as a mobile gaming platform. If the hardware isn't there, then I doubt software developers will either.

It's literally coming down to a choice between OS functions that I would greatly miss and future gaming performance.

My Incredible couldn't play a game released at the end of last year, Dungeon Defenders. I don't want to be unable to play games released in the same year of my phone, but I don't want to lose the things I love about Android either.
 
I suggest that all the people who prefer RGB stripe on a higher-DPI phone read this:
http://pentileblog.com/

I like this line:

"I never ceases to amaze me what people can become used to. One example would be orthodonture. How can a mouth full of brackets and wires be tolerated by so many teenagers? What begins as discomfort soon become part of the background experience that is barely noticed."

Basically, we have to put up with PenTile (TM) so they can "correct" our eyes to fit their artificial, poorly rendered world. Regardless of how Nouvoyance (which writes and owns this "blog," btw) spins it, it's still fewer subpixels per pixel vs an RGB stripe, and fewer colors per pixel as a direct result, too.

And in case you really didn't know, Nouvoyance is owned by Samsung.

PenTile has one good aspect: it saves battery. Why? Because it's a low-resolution display propped up as a high-res display. Fewer subpixels per inch = larger subpixel elements = less light impedance.

If PenTile were so great, Samsung would have stuck with it and not gone ahead and produced a 4.3" 800x480 RGB stripe AMOLED display for the Galaxy SII (after creating a smaller PenTile display for the Galaxy S & Nexus S). One part of the PenTile corporate "blog" says it best: they claimed PenTile was the only way to create the Galaxy Nexus display, stating that a 720p display would otherwise have to be much larger (a "one third larger diagonal"), while ignoring a display with an even higher subpixel-per-inch count already made by Samsung, their parent company. Not just a "corner case, niche market" display, either; one on Samsung's (currently selling) flagship, the Galaxy SII. So AMOLED subpixel element size (which in theory increases their longevity) clearly isn't the issue here...
 
jeremy, you didn't understand the readings.
The GS2 can't use a PenTile matrix since its DPI is too low.

PenTile is good only on high-DPI displays, and on those it is better than RGB stripe because it saves battery and creates sharper images.

[Image: light-text.png]


Please look at this image: up close, PenTile is clearly worse.
Now step back from your monitor to the point where you can no longer see the matrix (this reproduces a real-life scenario).

Which is the sharper image?

PenTile gives you the sharper image ;)
 
So AMOLED subpixel element size (which in theory increases their longevity) clearly isn't the issue here...

I keep seeing this, and I'm not sure if I really understand. Why does a reduced subpixel size have a negative impact on lifespan? :confused:
 