Apple had already been on RISC prior to Intel, and they didn't go back - they went to ARM. I think that says something... Now, it's been a few years since Apple was on that RISC architecture, and I'm sure RISC-V has seen a lot of evolution since, but I'm certain Apple looked at it and they still went the way they went.
Well, IBM's Power RISC-based architecture isn't RISC-V. Remember that 'RISC' is just an approach to CPU design; ARM is RISC too, for example.
And I don't think that Apple would go RISC-V. They own the ARM IP they're using, they need no further license, and their designs are all their own as well. Of course, Apple's use of ARM does little to further or hamper ARM everywhere else. What is Apple's is Apple's. And aside from the ISA, Apple's ARM has very little in common with any other architecture out there. They have their own compilers, a customized OS with custom kernels and custom frameworks and custom drivers and... well, Apple isn't a barometer, at least not a useful one, for ARM in the rest of the industry.
That, and there's a lot invested in ARM right now - none of that translates right over to RISC-V without a lot of re-engineering.
Sure!
But there's also a lot invested in making software independent of the underlying hardware. Which means that a RISC-V CPU that runs Linux could be made to run just about anything with relative ease. It'd probably be dog slow, but it would run.
So I wouldn't exactly say you can just jump from ARM right to RISC-V in the event Nvidia closes the ARM door or ARM shrivels up and dies.
I wouldn't either, and I apologize if I implied that.
It's more that ARM isn't the only game in town for cheap CPUs. They're very good, and when well-tuned for a particular workload they're very, very effective, but in situations where the CPU doesn't have to be fast, it just has to work, Nvidia's influence could easily see ARM priced out of those markets. And in an era where high-end fab space is at an incredible premium, it makes sense for whoever is directing ARM to steer efforts toward the more profitable ventures, the same way Nvidia does with their GPUs.
Really, and this is absolutely off-topic except for the fab discussion, Intel's low-end GPU entry makes sense because those parts actually get built, while AMD and Nvidia simply cannot waste fab allocations on lower-end parts. Not when high-end parts are going for today's supply-constrained, extreme-demand pricing.