Apple going full Polaris?

It is not surprising that Apple is continuing to stick with AMD. AMD has supplied higher-end GPUs to Apple for several generations now, both because of OpenCL performance and, more importantly, price.

I think it has little to do with efficiency, since the M370 and M380 were not as efficient as their NV counterparts.

Apple likes the products AMD comes out with primarily due to price. AMD offers their GPUs for a substantially lower price than NV. It is also why Apple has been looking at Zen instead of Intel. It is all about margins.
 
It's true that they like the price, but Apple is really good at stuffing a PC into as small a form factor as possible. This fits their mold perfectly.
 
It is also why Apple has been looking at Zen instead of Intel. It is all about margins.

[image: wikipedian_protester.png ("citation needed")]
 
Apple likes the products AMD comes out with primarily due to price. AMD offers their GPUs for a substantially lower price than NV. It is also why Apple has been looking at Zen instead of Intel. It is all about margins.

Somehow I doubt that very much. If you look at the percentage of markup on Apple hardware, the one thing it does not scream is low margins. Calling this another move to improve margins is very doubtful. Apple's hardware margins are already at "fucking amazing" status; are they going for "orgasmic"?

Look at what Intel does: they sell so many CPUs that they went from soldered lids to cheap goo, and from durable, sturdy lids to shabby ones that will break when you've got a large tower HSF. They do these things because they sell a lot of volume on these parts.
 
Somehow I doubt that very much. If you look at the percentage of markup on Apple hardware, the one thing it does not scream is low margins. Calling this another move to improve margins is very doubtful. Apple's hardware margins are already at "fucking amazing" status; are they going for "orgasmic"?

Look at what Intel does: they sell so many CPUs that they went from soldered lids to cheap goo, and from durable, sturdy lids to shabby ones that will break when you've got a large tower HSF. They do these things because they sell a lot of volume on these parts.


What I think is that Apple didn't like the Nvidia issues; they had to recall some products a few years ago, if I'm not mistaken. Since then they started to look more seriously at AMD.
 
you NEVER go full Polaris!
hehe anyways... As much as I hate Apple, I hope AMD does get more of its products into everything and everywhere they can. I want to see them turn things around!
 
I don't see Apple for games, more for the professional content creation market. Of course Apple wants to squeeze the most money out of their suppliers, but in the case of AMD it also helps them out big time in content creation. Ever since the Mac Pro has had Radeon cards, companies like Adobe have naturally started to migrate their products to open standards for the GPU. It's actually a good deal for both companies IMO.
 

Rumor: AMD Making Custom x86 SOC For Apple's 2017 And 2018 iMac Designs - Nintendo NX Likely Powered By AMD As Well
Apple to use AMD Zen SoCs for future iMacs?
http://seekingalpha.com/article/3582936-amd-inside-apple-can-beat-intel-inside

And a bunch more if you Google. Even articles going back to 2010 point to Apple trying to ditch Intel due to pricing.

It all depends on AMD's performance per dollar getting near Intel's. Until now, Intel's performance has been too far ahead for Apple to switch.
 
Calling this another move to improve margins is very doubtful.
Being a year or two off, it might be about a small form factor where it's required, as opposed to just margins. It may still lower costs simply because it's tiny and uses fewer materials.
 
Rumor: AMD Making Custom x86 SOC For Apple's 2017 And 2018 iMac Designs - Nintendo NX Likely Powered By AMD As Well
Apple to use AMD Zen SoCs for future iMacs?
http://seekingalpha.com/article/3582936-amd-inside-apple-can-beat-intel-inside

And a bunch more if you Google. Even articles going back to 2010 point to Apple trying to ditch Intel due to pricing.

It all depends on AMD's performance per dollar getting near Intel's. Until now, Intel's performance has been too far ahead for Apple to switch.

And performance/watt.

You can't expect them to put more than a 150W processor inside that trash can, can you? Apple's current top-end Xeon from Intel is a 12-core at 130W, based on old 22nm Ivy Bridge-EP.

Its closest replacement from Broadwell-EP has 16 cores at 145W for nearly the same tray price.

Intel® Xeon® Processor E5-2697A v4 (40M Cache, 2.60 GHz) Specifications

So AMD has to offer 16 cores at the same 145W TDP at reasonable core clocks to even be in the running. I can't see them charging significantly less for such a complex processor, one they're betting the company on. The 8-core versions will sell for mainstream prices, but the multi-chip modules won't.

Just because you're being CONSIDERED for a computer design win doesn't mean you HAVE the design win. But it is impressive to see someone actually care about AMD's high-end chips again. That hasn't happened since the Phenom II.
 
I don't see Apple for games, more for the professional content creation market. Of course Apple wants to squeeze the most money out of their suppliers, but in the case of AMD it also helps them out big time in content creation. Ever since the Mac Pro has had Radeon cards, companies like Adobe have naturally started to migrate their products to open standards for the GPU. It's actually a good deal for both companies IMO.

Be careful when you state "open standards."

Apple created OpenCL purely for the benefit of Apple. The rest of the world is just helping themselves to the open standard.

As a result, the OpenCL subsystem on OS X is completely controlled by Apple. Whereas you can load different drivers on Windows or Linux, if you hit a compatibility problem on OS X there is nothing you can do:

Open source devs criticise Apple’s OpenCL support | CG Channel

Sounds like they just exchanged one control freak (Nvidia hardware only) for another (one software stack to rule them all). And that's a problem for Apple since the majority of OpenCL apps were created for other platforms.

I haven't heard of any major waves of OpenCL acceleration. The latest Adobe Lightroom only supports a few GPU-accelerated options, and some of them are FASTER with acceleration turned off.
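(Side note: if you want to see that lock-down for yourself, here is a minimal sketch, assuming nothing beyond the stock OpenCL headers and loader, that just lists whatever platforms the OS exposes. On OS X you only ever get the single Apple-supplied platform, no matter whose GPU is fitted; on Windows or Linux you get one entry per installed vendor driver.)

/* list_cl.c -- minimal OpenCL platform enumeration (illustrative sketch).
 * Build on OS X:  clang list_cl.c -framework OpenCL -o list_cl
 * Build on Linux: gcc list_cl.c -lOpenCL -o list_cl
 */
#include <stdio.h>
#ifdef __APPLE__
#include <OpenCL/opencl.h>   /* Apple ships OpenCL as a system framework */
#else
#include <CL/cl.h>
#endif

int main(void)
{
    cl_platform_id platforms[8];
    cl_uint count = 0;

    if (clGetPlatformIDs(8, platforms, &count) != CL_SUCCESS) {
        fprintf(stderr, "clGetPlatformIDs failed\n");
        return 1;
    }

    for (cl_uint i = 0; i < count; ++i) {
        char name[256] = "", version[256] = "";
        clGetPlatformInfo(platforms[i], CL_PLATFORM_NAME, sizeof(name), name, NULL);
        clGetPlatformInfo(platforms[i], CL_PLATFORM_VERSION, sizeof(version), version, NULL);
        /* OS X: always exactly one platform ("Apple"), with Apple's own compiler
         * and runtime behind it. Windows/Linux: one platform per vendor driver. */
        printf("Platform %u: %s (%s)\n", i, name, version);
    }
    return 0;
}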
 
Nah, they'll just continue to use the generation behind the one we're currently using and still price their products astronomically high.

Because after all, what does the common Apple follower know about graphics hardware anyway? "It's Apple, sir, that's all you need to know O_O".

Example
13-inch MacBook Air
 
I think AMD's main goal going forward is market share and design wins. If they can make a product that is fast enough, efficient, and can be better integrated into Apple's designs, then it's a win. It's a win for everyone. If Polaris is as efficient as they say it is, I am not surprised Apple is going for it. If it can perform like a 390X while staying under 150W, that is pretty damn impressive.
 
Nah, they'll just continue to use the generation behind the one we're currently using and still price their products astronomically high.

Because after all, what does the common Apple follower know about graphics hardware anyway? "It's Apple, sir, that's all you need to know O_O".

Example
13-inch MacBook Air

Yup, this is why when they ditched the GTX 650M, they went with the same performance as the HD 7750. You know, INSTEAD of the Maxwell GTX 860M, which has TWICE THE PERFORMANCE in the thin-and-light gamer (i.e. the MacBook Pro Retina) and was quite common in notebooks in 2014, a year before the 2015 MacBook Pro with AMD graphics was released.

That was all about money, because there was no other technical reason to exclude it. After Nvidia burned AMD's notebook lineup with GM107, Apple accepted the fire sale.

Who cares if it's only 30% faster than the GTX 650M that once graced Apple's lineup? It's not like their users play anything demanding, or even expect games to run at native resolution anyway.
 
The bigger concern with market share is everything turning into a tablet, notebook, or cigar-box-sized PC for mainstream parts. Discrete cards likely aren't even an option there. Getting an SoC with a reasonably large, low-power Polaris GPU and HBM on the package is the best solution for performance.
 
Be careful when you state "open standards."

Apple created OpenCL purely for the benefit of Apple. The rest of the world is just helping themselves to the open standard.

As a result, the OpenCL subsystem on OS X is completely controlled by Apple. Whereas you can load different drivers on Windows or Linux, if you hit a compatibility problem on OS X there is nothing you can do:

Open source devs criticise Apple’s OpenCL support | CG Channel

Sounds like they just exchanged one control freak (Nvidia hardware only) for another (one software stack to rule them all). And that's a problem for Apple since the majority of OpenCL apps were created for other platforms.

I haven't heard of any major waves of OpenCL acceleration. The latest Adobe Lightroom only supports a few GPU-accelerated options, and some of them are FASTER with acceleration turned off.

Apple heavily optimizes code libraries for their hardware configurations. It enables them to extract *a lot* more performance from their hardware compared to Windows-based systems, much like recompiling a kernel or like game consoles. Still, these optimizations are partly based on open standards, for a very good reason: they do it to avoid being vendor-locked. My guess, for example, is that alarm bells probably started to go off when too much proprietary CUDA code was being inserted by Nvidia into major Apple applications, which would lessen Apple's negotiating power at contract renewals, not to mention licensing issues. Staying based on open standards gives them the option to switch vendors relatively painlessly, with just a few updates possibly needed later for optimizations.
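(To make the lock-in point concrete: an OpenCL kernel is plain source that the host hands to the driver at runtime via clCreateProgramWithSource() and clBuildProgram(), so the exact same code runs on an AMD, Nvidia, or Intel GPU. A trivial, hypothetical example:)

/* saxpy.cl -- a vendor-neutral OpenCL C kernel (hypothetical example).
 * The host passes this source to clCreateProgramWithSource() and
 * clBuildProgram() at runtime, and the vendor's driver compiles it for
 * whatever GPU is actually in the machine. The CUDA equivalent would be
 * compiled ahead of time and only ever run on Nvidia hardware. */
__kernel void saxpy(const float a,
                    __global const float *x,
                    __global float *y)
{
    size_t i = get_global_id(0);
    y[i] = a * x[i] + y[i];
}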
 
If Zen performs, I would love a full-on AMD-based MacBook Pro. I'd buy one in a heartbeat. I'm waiting for someone to create a nice AMD laptop; Carrizo and its upcoming refresh could have made for a nice mid-to-high-end ultrabook, but instead OEMs decided it was better served being gimped, set up with single-channel memory and stuffed into cheap plastic chassis with 10x7 screens.

Give me something like the Asus UX305 or whatever their $500-600 ultrabook was and it'd sell like crazy, imo.
 
Apple heavily optimizes code libraries for their hardware configurations. It enables them to extract *a lot* more performance from their hardware compared to Windows-based systems, much like recompiling a kernel or like game consoles.

Prove it.

Outrageous claims require outrageous proof. In the past, OS X has been notable for having SLOWER OpenGL performance than Windows using the exact same card and professional applications. So it's the un-optimized driver stack that OS X is widely known for.
 
Well, it might make some sense that they're pushing out the lower-end lower-power chips first if they already have a commitment from Apple to buy them.

P.S. LOL at OS X outperforming Windows at anything; it's always slower, and sometimes massively slower. This is partly because Apple doesn't optimize as heavily, but it's also because of the hybrid microkernel design that Mach uses. Having to constantly communicate between the kernel and non-kernel drivers is computationally expensive. Source: every benchmark ever.
 
Prove it.

Outrageous claims require outrageous proof. In the past, OS X has been notable for having SLOWER OpenGL performance than Windows using the exact same card and professional applications. So it's the un-optimized driver stack that OS X is widely known for.

I have nothing to prove. Save for some exceptions, Apple demonstrates it very consistently by keeping performance at a level their user base seems satisfied with, while trading lesser-spec'd hardware for better power efficiency and the ability to switch vendors each refresh cycle if needed.
 
I have nothing to prove. Save for some exceptions, Apple demonstrates it very consistently by keeping performance at a level their user base seems satisfied with, while trading lesser-spec'd hardware for better power efficiency and the ability to switch vendors each refresh cycle if needed.


No they don't. They prove that their professional users are stupid idiots who will pay for a whole computer just to get a nice-looking screen. Don't deny it, I can point you to a thread on the [H] itself. Then there are the trendy young things who want to look good at the coffee shop or the big tech meeting. And the rest of their market is old people who want to have AppleCare.

LONG STORY SHORT: they lock their customers in with Final Cut X and pretty screens that are only slightly cheaper than a Dell screen. Anyone who believes otherwise is a fool.

Here is the Mac Pro 2013's performance compared to Windows on the same hardware. Notice that massive OpenGL performance gap?

A pro with serious workstation needs reviews Apple’s 2013 Mac Pro

[image: CY2beUh.png — OS X vs. Windows OpenGL benchmark chart from the linked review]


That doesn't translate to EVERY app, but it does make it painfully clear that the Mac platform is no performance demon.

SO YES, YOU HAVE SOMETHING TO PROVE. Your original post made the claim that Apple has optimized libraries that extract a lot more performance out of the same hardware.

And now you're just trying to wave me off and change your claim to "great battery life." As if that's hard to do when the hardware is so pathetic.
 
And the rest of their market is old people who want to have AppleCare.

I don't necessarily agree their target market is solely idiots and old people, but I do agree Apple computers are moving to form over function.

I was going to buy a 5K 27" iMac until I saw a LinusTechTips video showing the CPU throttling under load due to high temps (the CPU cooler just was not able to keep up with the heat generated). On paper, the specs look good and there is a great overall aesthetic; however, they made the conscious decision to trade off performance to maintain their designs.

Oh, and the rumors that they are thinking about a touchpad-style keyboard for the MacBook... such a horrible idea. All to shave off those precious few mm so they can say the new model is "The smallest, lightest MacBook ever".
 
I don't necessarily agree their target market is solely idiots and old people, but I do agree Apple computers are moving to form over function.

I was going to buy a 5K 27" iMac until I saw a LinusTechTips video showing the CPU throttling under load due to high temps (the CPU cooler just was not able to keep up with the heat generated). On paper, the specs look good and there is a great overall aesthetic; however, they made the conscious decision to trade off performance to maintain their designs.

Oh, and the rumors that they are thinking about a touchpad-style keyboard for the MacBook... such a horrible idea. All to shave off those precious few mm so they can say the new model is "The smallest, lightest MacBook ever".

You're in the first market then (I identified three of them): people who want a nice screen and don't care what it costs. Some of them are professionals, and some of them are just screen junkies :D

You don't care about the lack of optimization of OS X, you just know how fucking expensive a high-end screen is. You're right, I shouldn't have said "stupid idiots," you are just focused on a single feature. Tunnel vision would be more appropriate :D
 
You're in the first market then (I identified three of them): people who want a nice screen and don't care what it costs. Some of them are professionals, and some of them are just screen junkies :D

You don't care about the lack of optimization of OS X, you just know how fucking expensive a high-end screen is. You're right, I shouldn't have said "stupid idiots," you are just focused on a single feature. Tunnel vision would be more appropriate :D

Though to be honest most people don't have the resources to calibrate their own screen.
 
LOL, to be honest, if I buy Apple it will be for the design, and I would probably be running Windows on it. LOL. I love the iMacs; as soon as they have a decent-power Polaris chip I will probably pick one up, throw Windows 10 on it, and occasionally switch over to the OS X side. OS X gets boring after a little use. Windows 10 is really smooth and the best-performing OS from Microsoft, stability- and speed-wise. Other than the privacy stuff people bring up, which I have no issue with, I don't think I could ever go back to previous versions of Windows.
 
You're in the first market then

As I didn't buy one, I'm pretty sure I'm not in any of these markets you mention for Apple computers. I chose not to buy a computer that was basically designed to throttle the CPU at high loads... regardless of the screen/specs/aesthetic. This throttling would be a problem in OS X and when using Boot Camp/Windows.

If we are just talking OS X, I have no issues with it in general. My wife has an old MacBook Pro that runs as quickly today on the latest OS X as it did when we bought it in 2009... not many 2009 laptops can say the same after making the jump from Vista to Windows 10. She is by far the opposite of a power user... in fact, OS X has saved her from installing viruses more times than I can count.
 