Nvidia begins developing Arm-based PC chips in challenge to Intel

Why would they ever remove said instructions? It seems to be a very small set of extra ones.


Even if we do not use any Apple library, the compilers that target Apple Silicon could use those extra instructions, no?

They could remove it because they can guarantee it won't be problematic: it's already abstracted away from mortals by whatever library took advantage of it under the covers.

As long as it's considered undocumented, a compiler will generally never emit it, even if you're deliberately targeting the exact architecture. It kinda causes havoc on multiple levels - imagine you're looking at the disassembly while optimizing something and something is just inexplicably wrong. There's just "garbage" data that the debugger decodes to nonsense - now you have one very confused programmer wondering if they're looking at a compiler bug, because there's no reference for whatever they're encountering.

If you happen to know the opcode, you could do it with inline assembly, but there's probably not even a mnemonic for it. You'll wind up needing to poke the literal bytes representing the instructions to force the assembler's hand. This might even be what their library does behind the scenes.
 
As long as it's considered undocumented, a compiler will generally never emit it, even if you're deliberately targeting the exact architecture.
Why say it is undocumented? https://opensource.apple.com/source...st/MC/ARM/arm-memory-instructions.s.auto.html

Clang seems to support them, and you do not need a special library that takes advantage of them; something like this:
char a, b, c;
c = a + b;
if (c < 0)
{
}

could use it.

How could GCC, Clang, etc. use undocumented instructions? What would it even mean? That would only be possible for a closed Apple compiler... Apple Clang is open source.
 
Why say it is undocumented? https://opensource.apple.com/source...st/MC/ARM/arm-memory-instructions.s.auto.html

Clang seems to support them, and you do not need a special library that takes advantage of them; something like this:
char a, b, c;
c = a + b;
if (c < 0)
{
}

could use it.

How could GCC, Clang, etc. use undocumented instructions? What would it even mean? That would only be possible for a closed Apple compiler... Apple Clang is open source.

I don't know what that file is even in reference to, nor where it came from in this context.

It looks like it's just the expected output for a unit test to verify that the assembler is actually barfing out what's expected. At a glance, it all looks like standard ARM with nothing Apple specific.

Like you can just go hurl those instructions at an online ARM assembler... and sure enough you will get the same string of bytes that is mentioned on that page.

I'm referring to something like this.

It looks like they have a proprietary extension for matrix math.
https://github.com/corsix/amx
 
Honestly, Intel and AMD need some competition in the server space.
ARM is popular for cloud-based servers, but what if you don't want to run something in the cloud? Your options are non-existent.
I would welcome an ARM option in an ATX form factor that wasn't a developer kit.
 
Honestly, Intel and AMD need some competition in the server space.
ARM is popular for cloud-based servers, but what if you don't want to run something in the cloud? Your options are non-existent.
I would welcome an ARM option in an ATX form factor that wasn't a developer kit.

350 Raspberry Pis in a server rack in your garage
 
I mean, Nvidia has had ARM based development boards around forever. A decade? I forget what they call them. Jetson? Is that it?

I think moving the industry to ARM is a mistake, for a few reasons.

1.) Instruction set is technically irrelevant.

You can build a power-efficient x86 chip, or you can build a bloated and inefficient ARM chip. Yes, x86 does have a built-in performance penalty compared to more RISC-like designs, because it needs a decode stage to turn its instructions into RISC-like micro-ops, but the impact of this decode stage is less and less relevant every day.

Anyone remember the Asus Zenfone 2?


It was a pretty damn decent mid range Android phone.

Oh, and it was powered by an Intel Moorefield Atom x86 quad-core SoC. I had one. It performed great, and power-wise it had just as usable battery life as its ARM-based brethren. I bought it as a disposable phone when I was in Brazil (just in case I got robbed; gringos tend to get robbed in Brazil :p ), but I wound up keeping it when I got home, as I liked it more than the Motorola Droid Turbo I had at the time. The software was so-so and bloated, but custom ROMs fixed that. The hardware was excellent (except for maybe the camera).

And this was Intel's first foray into phones. Had they determined that it was worth their investment, I bet they could have improved it even more, but it turns out they really liked the profit margins in the PC space, where they (at the time, in 2014) still more or less controlled the market and could charge whatever they wanted. It was a bit of a blunder on their part not to push harder, though knowing what we know now about the failure of their 10nm process, the effort would likely have died when they couldn't provide a competitive process anyway.

Instruction set pretty much only matters for binary compatibility these days. You can do almost anything, big or small, with almost any instruction set, provided you have enough bits for addressing.

What really matters is the underlying architecture of the chip design that supports that instruction set. That - more so than instruction set - is what determines whether a chip performs well or is efficient enough. From this technical perspective, instruction set (x86 vs ARM vs RISC-V vs others) is really just a distraction.


2.) So why be interested in instruction set then?

Because it is all about control.

Instruction set is legally important. Who owns the intellectual property? Can they use it to lock the competition out, or lock their customer base in, or do any other borderline illegal market manipulations?

Intel has been doing this for ages, suing everyone and everything in order to maintain control over the PC market. They got so obsessed with it that they got distracted and really missed the whole mobile (or well, mobile smaller than laptops) and embedded IoT markets.

If the market as a whole is open to a shift in instruction set, using that opportunity to move from x86 to ARM is a terrible waste of an opportunity, at least from the consumer's perspective. We'd be moving from one proprietary instruction set which has been the poster child for pseudo-legal market manipulations that harm consumers and competition, to another proprietary instruction set which, while on the surface it looks more free and open, is actually much worse.

Sure, ARM Holdings licenses their cores to anyone willing to pay. They have the potential to be a bad actor, but they haven't been thus far. Maybe we can trust them? As foolish and shortsighted as the concept of trust is when business and money are involved, the problem here isn't ARM itself, but its licensees. Let me explain.

The ARM license allows for customization. Licensees can and do customize the ARM designs in ways that break binary compatibility, and et voilà: instead of an Intel/AMD duopoly, now every single little ARM licensee can make their own proprietary chip design that only runs their software, over which the consumer has no control whatsoever. Do as we say/want or go pound sand.

And while I like the concept of RISC-V being an open-source instruction set, it enables the same thing. Anyone can customize the instruction set as they see fit, and that can be used to proprietarize hardware and harm users and customers by not letting them use the hardware they own as they see fit.


3.) What to do about it?

I don't know. Things get worse every year. Every year we go further down the path of proprietary locked bullshit.

Personally, I would like a law that requires unlocked boot-loaders and binary compatibility for every device made available to consumers or off the shelf for enterprise, with the only exception being for highly specialized devices that place additional requirements on the hardware, such that a large market volume binary compatible CPU isn't possible.

Unfortunately, this will never make it past the corrupt lobbyists who make their money from companies that rake in billions by manipulating markets and abusing customers.

So, things are just going to get worse. Rather than a general purpose computer you can do whatever you want with, more and more you are going to be restricted, locked down and limited until you have no choices left at all.

Don't like telemetry and data collection on your fancy brand new laptop? Get pissed off that the manufacturer now forces you to watch ads every 5 minutes? Tough luck.

In the past you could at least run Linux or some other community software/OS, but now either your boot loader is cryptographically locked, or you have a custom ISA that prevents binary compatibility with anything else.

This is the way we are headed. If - that is - you will even be offered a computer to buy in the first place, and it doesn't all just become a cloud based subscription model.

The dystopia is real.

This is a very interesting analysis, and not one that I would expect to find in most places. But to be clear, I'm not a hardware professional. I'm just an Average Joe hobbyist.

Zarathustra[H] I'm curious as to how you came to your conclusions here. (No shade intended, just honest curiosity.) In your "real life" outside of [H] are you some sort of analyst, maybe for a technology consulting or market research firm? Again, no shade intended.

Are there other well-informed people who agree with you, outside of this thread?

Dystopia? Sure. Just look around.
 
This is a very interesting analysis, and not one that I would expect to find in most places. But to be clear, I'm not a hardware professional. I'm just an Average Joe hobbyist.

Zarathustra[H] I'm curious as to how you came to your conclusions here. (No shade intended, just honest curiosity.) In your "real life" outside of [H] are you some sort of analyst, maybe for a technology consulting or market research firm? Again, no shade intended.

Are there other well-informed people who agree with you, outside of this thread?

Dystopia? Sure. Just look around.

I am not an industry insider, but I have read a lot about the topic over the years. My conclusion is based on lots of tests, reviews and technical documents I have read over 20 years, none of which I have read recently enough to be able to cite here as supporting evidence.

Could I be wrong? Certainly. There is always that possibility. While I probably get a few small details wrong here or there because I am not actually in the industry, I feel pretty confident that I get enough of the broad strokes right that it does not impact my conclusions.

I'm happy to be proven wrong if someone has contradictory information though. That's why I am into this shit, because I find it interesting and like to learn, so if I have something wrong, I'd like to learn from it.
 
Honestly, Intel and AMD need some competition in the server space.
ARM is popular for cloud-based servers, but what if you don't want to run something in the cloud? Your options are non-existent.
I would welcome an ARM option in an ATX form factor that wasn't a developer kit.
At this point I think ARM should be happy it corners the mobile market, as it has for decades. It's insane to see the lengths people will go to to promote ARM when there are no benefits and lots of incompatible software you can't run. Apple's biggest problem with its M-series is software. People are developing for it, but it's slow to catch up to what Apple x86 had, let alone what Windows x86 has. Switching processors is basically switching an entire software ecosystem. This wouldn't be such a problem if source code were released for every piece of software ever sold, but nobody does this. You can build the hardware all you want, but the software is the main issue.
You struggle to buy one from an actual Pi retailer at cost, let alone 350 of them at a 1000% Amazon markup. Probably cheaper to buy a second-hand Altera system.
I don't see the value of the RPi anymore. I own a few of them and I don't ever use them. Too slow for a full Ubuntu distro, and Raspbian is too Windows 95-like for me. The RPi is only good for people who want a device that specializes in a single task, and even then not very well. There's now a growing number of small x86 devices that cost about what a RPi does; actually, they're often cheaper, and they run Intel x86 CPUs that use less power and perform better than a RPi.

View: https://youtu.be/jjzvh-bfV-E?t=887
 
I don't see the value of the RPi anymore. I own a few of them and I don't ever use them. Too slow for a full Ubuntu distro, and Raspbian is too Windows 95-like for me. The RPi is only good for people who want a device that specializes in a single task, and even then not very well. There's now a growing number of small x86 devices that cost about what a RPi does; actually, they're often cheaper, and they run Intel x86 CPUs that use less power and perform better than a RPi.

View: https://youtu.be/jjzvh-bfV-E?t=887


I've never used a Raspberry Pi. I think I have a 3 in a drawer somewhere that someone gave me once, but I never even powered it on.

I do - however - have three HardKernel Odroid N2+ units I use as Kodi/MythTV frontends, and for that purpose they do the job very well.

I do recall reviews saying they were considerably more capable than at least the contemporary RPi models, though.

What I don't understand is why all of these small ARM-based single-board computers have to be handicapped storage-wise. Skip the slow and annoying eMMC/MicroSD/USB bullshit and just give them real SATA or M.2 slots, and then we'll be talking.
 
And this was Intel's first foray into phones. Had they determined that it was worth their investment, I bet they could have improved it even more, but it turns out they really liked the profit margins in the PC space which they (at the time in 2014) still more or less controlled the market and could charge whatever they wanted to. It was a bit of a blunder on their part not pushing that harder (though knowing what we know now about the failure of their 10nm process) it would likely have died when they couldn't provide a competitive process anyway.

Yeah, it's a real shame Intel killed development right before Microsoft demoed Continuum. x86 Windows 10 Mobile could have been really interesting.
 
I've never used a Raspberry Pi. I think I have a 3 in a drawer somewhere that someone gave me once, but I never even powered it on.
I have a RPi Model B, which means it has an amazing 512MB of RAM. I tried to use it for Kodi, but if you don't pay extra for the decoder stuff, it's slow. I did buy the decoder unlock codes, but it was still slow. I also have two RPi 3's, which have an eye-watering 1GB of RAM. I tried Ubuntu on those, but it was a painfully slow experience. I just installed Raspbian and forgot about them.
I do - however - have three HardKernel Odroid N2+ units I use as Kodi/MythTV frontends, and for that purpose they do the job very well.
I am in the market for a couple of small devices. Trying to convince my family to dump cable TV and use Roku/Netflix/Hulu/etc. What I need is a UI that works well with remotes and can actually use all these services. Maybe Kodi, but I certainly need hardware for it. The price of that device is more than an Intel-based Mini PC off AliExpress: more RAM, more storage, more everything. 4GB of RAM vs 6GB. No storage vs 128GB of storage, which I wouldn't be shocked is upgradable via M.2. No WiFi vs WiFi 5. No BT vs BT 4.2. Because it's x86, I can install anything I want. I'd probably have to worry about hardware acceleration on the Odroid, whereas the Intel-based machine will certainly work even at 4K with no dropped frames.
What I don't understand is why all of these small ARM-based single-board computers have to be handicapped storage-wise. Skip the slow and annoying eMMC/MicroSD/USB bullshit and just give them real SATA or M.2 slots, and then we'll be talking.
They're built on the cheap. Which means they're really cheap compared to the Mini PC's using Intel on Aliexpress.
 
Trying to convince my family to dump cable TV and use Roku/Netflix/Hulu/etc.
Roku devices have a pretty good UI (depending on the service; some of them write real shitty UIs), and they're easy for regular people to use.

Integrations into fun personal library stuff is kind of not great at the moment though. There's a lot of things that kind of work and kind of don't. And there's some stuff where someone claims to have had it working a few years ago, but it doesn't work now. I have a jellyfin client that mostly works, but doesn't want to play some content without transcoding that I think should be fine... And I never got anything to work well with MythTV.

There's some ads and tracking that people don't like, but if you turn off what you can and ignore some stuff on the home screen that you can't turn off, Roku gets out of the way when you load your app of choice.

My oldest Roku Ultra feels real slow now, but the newer ones are fine. They don't have GigE, so that's kind of sad; I want to play UHD Blu-ray rips with no transcoding, and that peaks above 100Mbps.
 
Roku devices have a pretty good UI (depending on the service; some of them write real shitty UIs), and they're easy for regular people to use.
Roku may be my last resort if I can't get everything working in Kodi. The problem is I'd rather have a full-blown PC with Linux+Kodi, if that can do things like Netflix+Hulu+Disney+ etc. And of course a decent remote.
Integrations into fun personal library stuff is kind of not great at the moment though. There's a lot of things that kind of work and kind of don't. And there's some stuff where someone claims to have had it working a few years ago, but it doesn't work now. I have a jellyfin client that mostly works, but doesn't want to play some content without transcoding that I think should be fine... And I never got anything to work well with MythTV.
I have Jellyfin and forgot about MythTV. I'm not even sure MythTV is relevant in 2024. It was a tool to watch live TV from a capture card, which I do have set up with Jellyfin for OTA TV stuff. I have everything I need; I just need a way to dumb it down.
There's some ads and tracking that people don't like, but if you turn off what you can and ignore some stuff on the home screen that you can't turn off, Roku gets out of the way when you load your app of choice.

My oldest Roku Ultra feels real slow now, but the newer ones are fine. They don't have GigE, so that's kind of sad; I want to play UHD Blu-ray rips with no transcoding, and that peaks above 100Mbps.
If Roku allowed me to install Android apps, I'd be all for it. I might go with YouTube TV because it's cheap and has over 100 channels for $73 a month, but I'd rather see if my parents can deal with Netflix/Hulu/etc. Fubo doesn't look too bad, but it's $90 per month, and that's approaching cable prices.
 
Roku may be my last resort if I can't get everything working in Kodi. The problem is I'd rather have a full-blown PC with Linux+Kodi, if that can do things like Netflix+Hulu+Disney+ etc. And of course a decent remote.

I have Jellyfin and forgot about MythTV. I'm not even sure MythTV is relevant in 2024. It was a tool to watch live TV from a capture card, which I do have set up with Jellyfin for OTA TV stuff. I have everything I need; I just need a way to dumb it down.

If Roku allowed me to install Android apps, I'd be all for it. I might go with YouTube TV because it's cheap and has over 100 channels for $73 a month, but I'd rather see if my parents can deal with Netflix/Hulu/etc. Fubo doesn't look too bad, but it's $90 per month, and that's approaching cable prices.
Yeah, I recently got my parents on Roku and they like it so far. The UI is fast and responsive. They already cancelled satellite and were going to go with YouTube TV to save money, but you might find there's so much stuff on Roku that you don't even need YouTube TV. Besides all the free stuff on there, they already get Netflix for free from some deal with their cellphones, plus Amazon Prime. But I guess that just depends on your household. It sucks hearing what Rossman said, and about them getting hacked, but it seems like there's hardly anywhere that's 100% safe these days.
 
Skip the slow and annoying eMMC/MicroSD/USB bullshit and just give them real SATA or m2 slots, and then we'll be talking.
The cheap processors they used to use prevented that. But the Pi 5 has a single exposed PCIe lane and supports NVMe now, as do other boards with more powerful SoCs, like the Radxa Rock 5B with its RK3588.
 
I am in the market for a couple of small devices. Trying to convince my family to dump cable TV and use Roku/Netflix/Hulu/etc
I'm also interested. In my house we have old Roku sticks on each (old) television. I'm also trying to convince my wife to upgrade to a 4K TV. I thought I would get new 4K Roku sticks, but I'm open to something else.


How would you configure these devices w/r/t TV, etc..
The price of that device is more than an Intel-based Mini PC off AliExpress.
Heard too many bad things about Aliexpress.
 
but I'm open to something else.
The Amazon ones can get really cheap for the hardware during Prime Day, and it's easy to "sideload" apps on them.

Why do you say that?
If it needs Windows on ARM to be good enough for it to work, it is a big risk; Qualcomm seems to have a list of issues going on.

You can execute really well and still lose; Windows could be good enough by now, they could execute well, and they could still lose. If they share enough with the ARM data center CPUs to benefit from scale in pricing and yield... they can maybe start a price war on entry... and create a virtuous circle of ARM app support (and consoles choosing them for the next generation). But that is a lot of moves all having to go well.
 
At this point I think ARM should be happy it corners the mobile market, as it has for decades. It's insane to see the lengths people will go to to promote ARM when there are no benefits and lots of incompatible software you can't run. Apple's biggest problem with its M-series is software. People are developing for it, but it's slow to catch up to what Apple x86 had, let alone what Windows x86 has. Switching processors is basically switching an entire software ecosystem. This wouldn't be such a problem if source code were released for every piece of software ever sold, but nobody does this. You can build the hardware all you want, but the software is the main issue.

I don't see the value of the RPi anymore. I own a few of them and I don't ever use them. Too slow for a full Ubuntu distro, and Raspbian is too Windows 95-like for me. The RPi is only good for people who want a device that specializes in a single task, and even then not very well. There's now a growing number of small x86 devices that cost about what a RPi does; actually, they're often cheaper, and they run Intel x86 CPUs that use less power and perform better than a RPi.

View: https://youtu.be/jjzvh-bfV-E?t=887

I have several PiZeroW's and W2's installed inside devices acting as network-enabled monitors, digital sign boards, or remote terminals. The Zero W's though have mostly been relegated to functioning as WiFi-enabled USB keys, so I can remotely push files to them for firmware updates, displays etc. Private little IoT network.
Then for the 3D print labs, I have some 4's that run Octoprint, and of course, I have one at home for emulators, just the classics, it does anything from the 90s down and well enough.

But they were all through the proper suppliers at MSRP or below, I wouldn't touch them at anything higher than that.

Server-side though, ARM makes a fantastic CPU for highly transactional databases; you need lots of little cores there, not a few big ones. So: telecom, SQL, web, etc. They run very cool and quiet.
Linux ARM64 is stable and abundant, so there isn't much in the way of downsides if that's what you're running. The ARM servers I have in the cloud don't even have a GUI; it's all web interface or command line. I have very few complaints there: they are cheap, and they do what they need to.
 
How would you configure these devices w/r/t TV, etc..
I'm still figuring this out. I'd have no problem putting in Linux and placing a bunch of icons on the desktop, but that's not dumb enough. I tried to see if Kodi with plugins would work, like the Slyguy and Netflix plugins. Not the user interface I was expecting; it's like browsing through a file explorer for services like Hulu and Netflix. Alternatively, I could find a way to link these to Jellyfin. Unfortunately, I don't see any plugins for that.
Heard too many bad things about Aliexpress.
It's better than any ARM based device. Nvidia does have their Shield TV devices, but in the end it just runs Android and costs at least $150.
 
I have several PiZeroW's and W2's installed inside devices acting as network-enabled monitors, digital sign boards, or remote terminals. The Zero W's though have mostly been relegated to functioning as WiFi-enabled USB keys, so I can remotely push files to them for firmware updates, displays etc. Private little IoT network.
Then for the 3D print labs, I have some 4's that run Octoprint, and of course, I have one at home for emulators, just the classics, it does anything from the 90s down and well enough.

But they were all through the proper suppliers at MSRP or below, I wouldn't touch them at anything higher than that.

Server-side though, ARM makes a fantastic CPU for highly transactional databases; you need lots of little cores there, not a few big ones. So: telecom, SQL, web, etc. They run very cool and quiet.
Linux ARM64 is stable and abundant, so there isn't much in the way of downsides if that's what you're running. The ARM servers I have in the cloud don't even have a GUI; it's all web interface or command line. I have very few complaints there: they are cheap, and they do what they need to.

We have a production database (well, and dev too) humming along on the AWS ARM chips.

They seem to perform very well and are quite aggressively priced.

There are actually no real blockers to moving more of our workloads onto ARM hardware, other than that we're simply not building ARM containers right now.
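For anyone in the same spot, closing that gap is often just a build-flag change if your base images support arm64. A sketch using Docker's buildx (the image name and registry are placeholders, not anything from this thread):

```shell
# One-time: create and select a builder that can target multiple platforms.
docker buildx create --use

# Build amd64 and arm64 variants of the same image and push them under one
# tag, so the same service can be scheduled onto x86 or ARM (e.g. Graviton)
# nodes. "registry.example.com/myapp" is a placeholder.
docker buildx build \
  --platform linux/amd64,linux/arm64 \
  -t registry.example.com/myapp:latest \
  --push .
```

The catch is usually not Docker itself but native dependencies in the image that only ship x86 binaries.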
 
We have a production database (well, and dev too) humming along on the AWS ARM chips.

They seem to perform very well and are quite aggressively priced.

There are actually no real blockers to moving more of our workloads onto ARM hardware, other than that we're simply not building ARM containers right now.
Back in 2022 I moved all our outward-facing web services off-site, with almost no budget to do it. The AWS ARM instances were priced where I needed them, so it was very much a case of "well, it's got to be better than what I've got these running on now."
My biggest headache was getting the VPN tunnel stable.
 
It's better than any ARM based device. Nvidia does have their Shield TV devices, but in the end it just runs Android and costs at least $150.
My comment was about AliExpress as an online store. According to comments, there are lots of crappy products and/or unethical manufacturers, the same things we often roast Amazon for.
 
My comment was about AliExpress as an online store. According to comments, there are lots of crappy products and/or unethical manufacturers, the same things we often roast Amazon for.
Which products aren't made in China? I don't think there's a product made in China that's ethical. There are many cases where the same product I find on Amazon is also found on AliExpress, just cheaper. Also, Amazon isn't exactly free of unethical products either. I've bought a number of fake SD cards from Amazon. The only reason I shop on Amazon is to get things sooner rather than later.
 
Which products aren't made in China? I don't think there's a product made in China that's ethical.

Manufactured in China, by a company based in Taiwan or elsewhere? Or designed in China by a company under the thumb of the CCP?
Also, Amazon isn't exactly free of unethical products either.
An understatement if I ever saw one!!!
 