Apple's 'Let Loose' iPad Air and iPad Pro event: M4?

This at first glance looks quite impressive?

[Attached image: Geekbench results]

+22% over the 13-inch laptop form factor in multi-thread?

As mentioned, I imagine for most people this will only translate into much more battery life (as the iPad will usually use only a small fraction of the chip), which is not useless for an iPad...

That said, it is probably telling that after skimming four reviews, I have not seen a single actual application benchmark (a bit like we often see with new NVMe SSDs). Outside of battery life, maybe all that extra power feels different when you use it, but it is not easy to put a number on it for what people actually do with these devices...
 
Maybe they'll do something useful with iPadOS later this year? Because as of right now, the iPads can't even take advantage of all this excess power. The OS is really holding it back.
 
Same thinking here, my wife's Air 2 is probably getting replaced for her birthday. I've looked at Android tablets and owned them in the past, and I know they're better hardware and value for the money, but I can never get used to the Android OS/UI. I was looking at the comparison page for iPads (https://www.apple.com/ipad/compare/?modelList=ipad-air-2,ipad-10th-gen,ipad-air-11-m2), and I'm leaning toward either the base 10th-gen iPad or the new Air. The 10th-gen iPad is "only" $350, but the improvements over our Air 2 are pretty limited - mostly the new processor, a fresh battery, and the ability to run new iOS versions.

I was considering a Galaxy Tab for the included stylus and OLED screen for media consumption. This upgrade for the new Air really isn’t exciting me a whole lot considering what I can get with the Galaxy, but I do agree with you: I just prefer the feel of iOS to Android, and I have used both. Tough decision to make!
 
This at first glance looks quite impressive?

[Attached image: Geekbench results]

+22% over the 13-inch laptop form factor in multi-thread?

As mentioned, I imagine for most people this will only translate into much more battery life (as the iPad will usually use only a small fraction of the chip), which is not useless for an iPad...

That said, it is probably telling that after skimming four reviews, I have not seen a single actual application benchmark (a bit like we often see with new NVMe SSDs). Outside of battery life, maybe all that extra power feels different when you use it, but it is not easy to put a number on it for what people actually do with these devices...
The issue with the iPads and the benchmarks is that very few things push the chips to 100%, so what it mostly translates to is better battery life.

For most tasks we’re talking about going from 7 s down to 6.2 s, or 8% CPU/GPU utilization instead of 12%, which nets you an extra hour over a day.
Or maybe it’s things like skipping ahead in a Netflix or Disney movie and the movie just picking up right away and not stuttering.
If you are using the iPad for Adobe apps and such, I would expect it to be somewhat meaningful; that said, the M1 variants we have are getting the jobs done in their allotted time slots.
I’m tempted by the M4, but I also have some $900 loaded up in the GW shopping cart, and I’m pretty sure I’ll get in trouble if I do both…
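To put those kinds of numbers in perspective, here is a rough back-of-the-envelope sketch in Python. The task times and utilization figures come from the post above; the chip's share of total device power and the 10-hour baseline are made-up placeholders, so treat the output as an illustration of the reasoning rather than a measurement.

```python
# Rough back-of-the-envelope numbers from the post above (illustrative only,
# not measurements): how much faster is 6.2 s vs 7 s, and what might dropping
# average utilization from 12% to 8% mean for battery over a day of use?

old_time, new_time = 7.0, 6.2          # seconds for the same task
speedup = old_time / new_time          # ~1.13x, i.e. ~11% less waiting

old_util, new_util = 0.12, 0.08        # average CPU/GPU utilization
# Crude assumptions: the chip's power draw scales with utilization, and the
# chip accounts for roughly half of total device power. Both numbers are
# placeholders and vary wildly by workload and device.
chip_share_of_power = 0.5
battery_hours = 10.0                   # nominal battery life at the old utilization

power_ratio = (1 - chip_share_of_power) + chip_share_of_power * (new_util / old_util)
new_battery_hours = battery_hours / power_ratio

print(f"Task speedup: {speedup:.2f}x ({(1 - new_time / old_time) * 100:.0f}% less time)")
print(f"Estimated battery: {battery_hours:.1f} h -> {new_battery_hours:.1f} h")
```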
 
I was considering a Galaxy Tab for the included stylus and OLED screen for media consumption. This upgrade for the new Air really isn’t exciting me a whole lot considering what I can get with the Galaxy, but I do agree with you: I just prefer the feel of iOS to Android, and I have used both. Tough decision to make!

I've also used both the iPad and the Galaxy Tab. I'm currently using the Galaxy Tab since, for work, I can use it as a second monitor with my Windows laptop. The aspect ratio is also better optimized for media consumption. For other general use, it doesn't make a difference between the Tab and the iPad for me.
 
This at first glance looks quite impressive?

[Attached image: Geekbench results]

+22% over the 13-inch laptop form factor in multi-thread?

As mentioned, I imagine for most people this will only translate into much more battery life (as the iPad will usually use only a small fraction of the chip), which is not useless for an iPad...

That said, it is probably telling that after skimming four reviews, I have not seen a single actual application benchmark (a bit like we often see with new NVMe SSDs). Outside of battery life, maybe all that extra power feels different when you use it, but it is not easy to put a number on it for what people actually do with these devices...
Am I reading this right? The M4's single-core score of 3682 beats the Intel 14900K's 3098?! wth...?
 
Geekbench is known to heavily favor Apple. Even Cinebench is questionable. Always go by real world applications for testing.
Yep. Geekbench is mostly meaningless for real world use. The testing in reviews will paint a clearer picture.
 
Geekbench is known to heavily favor Apple. Even Cinebench is questionable. Always go by real world applications for testing.
Microsoft is claiming that their latest Surface beat the Apple M3 in Geekbench by 15%... we will have to see whether they played some tricks with the power levels.

Yep. Geekbench is mostly meaningless for real world use. The testing in reviews will paint a clearer picture.

A lot of what Geekbench runs is just super common real-world usage: it compiles with Clang, decompresses JPEGs, does SQLite inserts, makes preview thumbnails, opens and renders big PDFs, browses HTML5 websites, compresses files, blurs the background on a voice call, and does common CAD/video-game physics work. Is there really a disconnect between the two? It would be surprising if a test that includes so many of the most common real-world tasks ended up meaningless for real-world use.

The 7900X seems to score 33% higher than the 5900X; TPU has it at 33% for applications, and Geekbench's multi-thread jump is quite a bit more generous (+50%).
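For anyone curious what "benchmarking everyday operations" looks like in practice, here is a minimal sketch using only the Python standard library. It is not Geekbench's actual harness or data; the payload size and the 70,000-row insert count are arbitrary placeholders echoing the kinds of workloads mentioned in this thread.

```python
# Minimal sketch, NOT Geekbench's harness: time two everyday operations of the
# same flavor mentioned above (file compression, SQLite inserts) with the
# Python standard library. Payload size and row count are arbitrary placeholders.
import os
import sqlite3
import time
import zlib

def timeit(label, fn, repeats=5):
    """Run fn() several times and report the best wall-clock time."""
    best = float("inf")
    for _ in range(repeats):
        start = time.perf_counter()
        fn()
        best = min(best, time.perf_counter() - start)
    print(f"{label}: {best * 1000:.1f} ms (best of {repeats})")

# ~2 MB of mixed data: half incompressible, half highly compressible text.
payload = os.urandom(1 << 20) + b"some compressible text " * 50_000

def compress():
    zlib.compress(payload, 6)

def sqlite_inserts():
    con = sqlite3.connect(":memory:")
    con.execute("CREATE TABLE photos (id INTEGER PRIMARY KEY, name TEXT, size INTEGER)")
    con.executemany(
        "INSERT INTO photos (name, size) VALUES (?, ?)",
        ((f"img_{i}.jpg", i % 4096) for i in range(70_000)),
    )
    con.commit()
    con.close()

timeit("zlib compress ~2 MB", compress)
timeit("70,000 SQLite inserts", sqlite_inserts)
```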
 

It's actually a lot more durable than I was expecting. From the video:

"It holds up surprisingly well. Like suspicious black magic levels of structural integrity going on."

It looks like horizontal bending is fine. Vertical bending, not so much. But nobody is going to be bending their devices that way, with that amount of force, at those specific points of pressure, with the giant meat palms that Zack possesses. The iPad probably won't keep a straight profile if it's going into a school book bag for months at a time unless it has a case on it.
 
A lot of what Geekbench runs is just super common real-world usage: it compiles with Clang, decompresses JPEGs, does SQLite inserts, makes preview thumbnails, opens and renders big PDFs, browses HTML5 websites, compresses files, blurs the background on a voice call, and does common CAD/video-game physics work. Is there really a disconnect between the two? It would be surprising if a test that includes so many of the most common real-world tasks ended up meaningless for real-world use.

The 7900X seems to score 33% higher than the 5900X; TPU has it at 33% for applications, and Geekbench's multi-thread jump is quite a bit more generous (+50%).
More often we see Geekbench claiming Apple is faster when real-world tests show it isn't. Geekbench's GPU test is very generous to Apple when compared to RTX graphics cards, but in real-world tests we see that RTX destroys anything Apple has. The same goes for gaming, video rendering, etc. There's a reason nobody in the PC tech industry uses synthetic tests anymore. When did you last see Gamers Nexus, Hardware Unboxed, or even Linus Tech Tips use Geekbench?
 
synthetic tests anymore
Compiling a Lua interpreter with Clang, compressing files, blurring a background, or opening a PDF is as real-world as you can get. What percentage of the Geekbench score comes from synthetic benchmarks?

but in real-world tests we see that RTX destroys anything Apple has.
In the image processing, photography, computer vision, machine learning, physics simulation, and OpenCL types of tasks that Geekbench evaluates?

With an old 2080 Super beating an Apple M2 Ultra:
https://browser.geekbench.com/opencl-benchmarks
https://browser.geekbench.com/metal-benchmarks
Or a Radeon 6700 XT beating the M3 Max in Metal... is it really out of place?

It tries to calculate too many different things at once to stay interesting if you have time to test a lot of stuff individually, I imagine. They do not show PassMark scores either, I would imagine.
 
The problem with Geekbench is not what it does, it is the data it does it on. The data sets are too small, so repeated tests fit comfortably into the CPU caches. For example, decompressing one JPEG repeatedly is more cache-friendly than dragging enough pictures around to do each one only once.
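If anyone wants to see that cache effect for themselves, here is a rough sketch (assuming NumPy is installed and ~1 GB of RAM is free). Both loops update the same total amount of data, but the first reuses one small buffer that stays resident in cache while the second streams through a buffer far larger than any CPU cache; on most machines the first loop finishes several times faster, though exact numbers vary a lot by CPU.

```python
# Rough illustration of the cache-residency argument above (assumes NumPy).
# Both loops perform the same total amount of work, but the small buffer stays
# hot in the CPU cache while the large one must come from main memory.
import time
import numpy as np

small = np.ones(4 * 1024 * 1024 // 8)        # 4 MB: fits in a typical last-level cache
large = np.ones(1024 * 1024 * 1024 // 8)     # 1 GB: far bigger than any CPU cache

start = time.perf_counter()
for _ in range(256):                         # 256 x 4 MB = 1 GB of data touched in total
    small += 1.0                             # in-place add: essentially pure memory traffic
t_small = time.perf_counter() - start

start = time.perf_counter()
large += 1.0                                 # the same 1 GB touched exactly once
t_large = time.perf_counter() - start

print(f"4 MB buffer updated 256 times: {t_small:.3f} s")
print(f"1 GB buffer updated once:      {t_large:.3f} s")
```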
 
Decompression is tested with, for example, a 75 MB archive containing 9,841 files; the photo library has 64 photos; the database has 70,000 metadata entries; text processing runs on 190 files; the PDF test has 16 of them; and the Blender BMW scene they use is small now, but it's a classic. A lot of those will not reach max temperature on a modern 24-core CPU, but they are not unrepresentative of common real-world scenarios.

https://www.geekbench.com/doc/geekbench6-benchmark-internals.pdf
A lot of them still go over 1 GB of RAM usage.

Are those workloads not more representative of the most common real-world usage by average users than workloads that push a system to 100% for more than two minutes straight? It is subjective.
 