Surprise: Mac Studio with M1 Ultra Can’t Beat NVIDIA GeForce RTX 3090, Contrary to Apple’s Claims

Tsing (The FPS Review staff)
This may come as a major shocker, but it turns out that Apple had its Reality Distortion Field tuned a little too high when it claimed that the new Mac Studio with M1 Ultra chip could compete with, or even beat, NVIDIA's flagship GPU in performance.

 
I'm disappointed, not surprised.

I really want someone to come out and redefine the field, even if it is someone LIKE Apple. What it will take is someone redefining how to achieve performance. Maybe that will be with new quantum processors or whatever. I know it's coming; I just want to see a few more iterations in my lifetime. Gotta be able to upload my mind to the cloud. I wonder if I will be self-aware in that state... hummm...
 
Just breezed through the review. The short gist of it is that the Mac Studio is the uber-fast Adobe machine. If you use Adobe products you will love it. Point, click, it works, and it's blink-you're-done fast at it too. Beyond that, well, it could be another story depending on what you want to do.

Meanwhile, I have to laugh at the good/bad list: "Expensive" followed by "What you buy is what you're stuck with" cracked me up.

Edit: All the while, I'll look over at my MSI Suprim 3090 and be happy knowing that it's got years before it'll really need to be replaced for my needs.
 
I think the M1 Ultra chip is probably better on performance/watt.

Also, an ARM CPU that completely destroys any console is no small feat. We have yet to see any x86 APU laptop/desktop that has better performance than an Xbox Series X, particularly on the graphics front.

Having a chip that easily doubles the performance of last year's Apple top end is something that neither Intel nor AMD has achieved in decades.

At this point I even feel bad for NVIDIA for losing the ARM deal. Who knows what they would have come up with.
 
I think the M1 Ultra chip is probably better on performance/watt.
Hard to contest, and probably going to be hard to beat in the near future. Apple has access to the most advanced process available, tuned for their use case, and controls the entire software stack that runs on the processor designs produced on it.

It's a game they've been playing very, very well for the last decade.
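
To make the perf/watt point concrete, here's a back-of-the-envelope sketch in Python. The scores and wattages are purely illustrative placeholders, not measured results:

# Back-of-the-envelope performance-per-watt comparison.
# NOTE: the scores and wattages below are illustrative placeholders,
# not measured benchmark results.

def perf_per_watt(score: float, watts: float) -> float:
    """Benchmark score divided by power draw."""
    return score / watts

systems = {
    "M1 Ultra (hypothetical)": {"score": 100.0, "watts": 110.0},
    "RTX 3090 (hypothetical)": {"score": 130.0, "watts": 350.0},
}

for name, s in systems.items():
    print(f"{name}: {perf_per_watt(s['score'], s['watts']):.2f} points/W")

# Even when the 3090 posts the higher absolute score, its far higher
# power draw can leave it well behind on points per watt.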

Also, an ARM CPU that completely destroys any console is no small feat.
Reluctantly agree, though again, Apple is leveraging advantages here, including shorter release cycles vs. consoles. On even ground the comparison might not be that stark, and more like Apple vs. Android in phones, where there are tradeoffs more than outright advantages, with many of them coming down to individual opinion.

We have yet to see any x86 APU laptop/desktop that has better performance than an Xbox Series X, particularly on the graphics front.
This is entirely by choice, likely driven by a lack of need in the market. AMD could release such a product at any time. As could Intel, at this point.

But for all the investment in bringing a high-powered APU to market - which for the record, I'd love to see! - most either want something that's competent but efficient, or are willing to put up with a dGPU, regardless of whether laptops or desktops are the point of comparison.

Having a chip that easily doubles the performance of last year's Apple top end is something that neither Intel nor AMD has achieved in decades.
Can't say I agree here - Apple just very spectacularly glued two M1 Max dies together to produce the M1 Ultra, and that's something Intel and AMD do every time they double core counts (same for AMD and Nvidia in terms of GPUs). Apple's solution does appear to make it more likely that workloads will actually see that doubling of raw performance, but again, there's some selection bias involved, and we can't really test gaming on the M1 Ultra directly against an RTX 3090.
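
As a quick illustration of why doubling the silicon rarely doubles delivered performance, here's a minimal Amdahl's-law sketch; the parallel fractions are assumed values for illustration:

# Amdahl's law: speedup from n compute units when only a fraction p
# of the workload actually scales across them.

def speedup(p: float, n: int) -> float:
    """Theoretical speedup with parallel fraction p on n units."""
    return 1.0 / ((1.0 - p) + p / n)

# Going from one die to two (n=2) at assumed parallel fractions:
for p in (0.50, 0.90, 0.99):
    print(f"parallel fraction {p:.2f}: {speedup(p, 2):.2f}x from doubling")
# -> 1.33x, 1.82x, 1.98x: only near-perfectly parallel workloads
#    see anything close to the full 2x.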

At this point I even feel bad for NVIDIA for losing the ARM deal. Who knows what they would have come up with.
I was tossing a coin between appreciating them for how much they might be able to push the industry forward, and feeling disdain for the inevitable corporate bullying that would likely have accompanied their 'pushing'.

I'm betting that their failure to acquire ARM won't slow them down too much. Their CEO has an ego to keep inflated, after all!
 
This is entirely by choice, likely driven by a lack of need in the market. AMD could release such a product at any time. As could Intel, at this point.
I think the lack of APUs for gaming laptops has more to do, at least in the case of AMD, with their contracts with consoles. It's only now (in the upcoming months, actually) that RDNA2 APUs are being released. I do think there IS a market for gaming APUs that can rival consoles. And I see Intel going that route, as they are not limited by any console contracts.
 
Thing is, nobody "wants" a high-powered APU. Especially at the prices Apple is charging. If they are targeting people doing very specific tasks and workloads, that is a very niche market of people who want/need the ability to upgrade their GPU as often as they like. At least that's what I've heard from a number of architectural, structural, electrical, civil, nuclear, etc. engineers that my wife works with. None of those guys are even remotely interested in Apple products.

Then again, this is Apple, and there will be a lot of people who buy one just to brag about having one and not actually needing one.
 
I think the lack of APUs for gaming laptops has more to do, at least in the case of AMD, with their contracts with consoles. It's only now (in the upcoming months, actually) that RDNA2 APUs are being released. I do think there IS a market for gaming APUs that can rival consoles. And I see Intel going that route, as they are not limited by any console contracts.
It's a fine theory, but realistically it's just as easily explained by there not being a large potential market. A big part of that is based on what you already see - APU performance that would approach dGPU performance would need dGPU levels of memory bandwidth. DDR5 gets a lot closer to that, but it's still a fraction of what a decent entry-level dGPU can bring. Thing is, APUs generally go into entry-level systems, meaning that for now it's a hard business case to make, most especially when AMD is already supply constrained at TSMC.

Thing is, nobody "wants" a high-powered APU. Especially at the prices Apple is charging.
Actually? People are loving Apple's APU, even at the prices they're charging. Thing is, Apple's APU has gobs of memory bandwidth to share between the CPU and GPU, like AMD's console APUs do but unlike any desktop-class APU. Apple's GPUs are then able to put out dGPU-class performance numbers, whereas any APU using a standard desktop memory controller is going to be severely bandwidth limited in comparison.
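
To put rough numbers on that bandwidth gap: peak theoretical bandwidth is just transfer rate times bus width. A minimal sketch using commonly quoted specs (treat these as nominal figures, not measurements):

# Peak theoretical memory bandwidth = transfer rate * bus width.
# Figures are nominal/quoted specs, not measured numbers.

def bandwidth_gb_s(mt_per_s: float, bus_bits: int) -> float:
    """Peak bandwidth in GB/s from mega-transfers/s and bus width in bits."""
    return mt_per_s * (bus_bits / 8) / 1000

ddr5    = bandwidth_gb_s(4800, 128)    # dual-channel DDR5-4800: ~76.8 GB/s
gddr6x  = bandwidth_gb_s(19500, 384)   # RTX 3090, 384-bit GDDR6X: ~936 GB/s
m1ultra = 800.0                        # Apple's quoted unified memory bandwidth

print(f"Dual-channel DDR5-4800: {ddr5:6.1f} GB/s")
print(f"RTX 3090 GDDR6X:        {gddr6x:6.1f} GB/s")
print(f"M1 Ultra (quoted):      {m1ultra:6.1f} GB/s")
# A desktop APU on plain DDR5 has roughly a tenth of the bandwidth
# the M1 Ultra's GPU gets to draw on.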

If they are targeting people doing very specific tasks and workloads, that is a very niche market of people who want/need the ability to upgrade their GPU as often as they like.
Well, they're targeting content creators, from the consumer to amateur to professional level. For that 'niche' market, Apple has tuned their entire hardware and software stack exceptionally well. And when you think about it, from a broad computer market perspective, that's the heaviest work that most people do. Apple is definitely on point when it comes to shipping high-performance systems.

At least that's what I've heard from a number of architectural, structural, electrical, civil, nuclear, etc. engineers that my wife works with. None of those guys are even remotely interested in Apple products.
I mean, sure? When you get that niche, you're likely looking at what is best suited to your very specific workload. I'd imagine Windows or Linux and leaning heavy on CUDA and OpenCL. Apple's walled garden with custom APIs is likely a pretty big turnoff, as would be the inability to upgrade GPUs / Compute Engines.

Then again, this is Apple, and there will be a lot of people who buy one just to brag about having one and not actually needing one.
I'm sure that's prevalent, but the thing is, Apple is making the best laptops one can buy, and some of the best computers one can buy, assuming that desktop gaming isn't a priority.

I say that as I've been contemplating both a 14" MBP, because it'd run circles around my 15" XPS while being lighter, quieter, and having over twice the battery life, and also the latest 'Mac Studio' in a minimal configuration. Thinking about it now, I'd probably just go with the MacBook Pro, but having a 10Gbit interface on the Studio is tempting!
 
I don't dislike the Mac garden at all. They do make solid devices, and for content creators it's hard to get better bang for the buck over the device's life cycle.

And the prices for these Mac Studio devices are not crazy for content creation workstations.

If I were in this market I would be hard pressed to find a better value.

DIY... well, they are out for a multitude of reasons, but then we are not their target market either.
 
I'm not by any means an Apple or Mac fan for a multitude of reasons, not the least of which is usually the price. However, neither will I deny when they hit the mark. They shouldn't have tried to make comparison claims to the 3090 but meanwhile, this machine does what it's designed for and does it the best way possible. It's not a gaming machine. It's for content creation.
 
It's a fine theory, but realistically it's just as easily explained by there not being a large potential market. A big part of that is based on what you already see - APU performance that would approach dGPU performance would need dGPU levels of memory bandwidth. DDR5 gets a lot closer to that, but it's still a fraction of what a decent entry-level dGPU can bring. Thing is, APUs generally go into entry-level systems, meaning that for now it's a hard business case to make, most especially when AMD is already supply constrained at TSMC.

If you think there is no market for performance APUs, think again. Apple is just the tip. The rest of the ARM pack is right behind. Probably none of them will reach Apple's capabilities any time soon, but they could still deal Intel/AMD a huge blow, especially in the laptop market.
 
If you think there is no market for performance APUs, think again. Apple is just the tip. The rest of the ARM pack is right behind. Probably none of them will reach Apple's capabilities any time soon, but they could still deal Intel/AMD a huge blow, especially in the laptop market.
Well, no market for AMD at the moment, and no way to produce them.

The biggest impediment is that the low volume would keep unit prices higher than just tossing in a dGPU, which would also be faster. AMD would have to design a whole platform around it to give it legs, and then get OEMs to adopt it.

Cost / benefit isn't there, even if many of us would want to see a higher-performance APU.
 
If APUs were everything, and everyone wanted an inexpensive APU system, converted Xboxes would rule the home computing space and workstation space.
 
AMD could graft X number of RDNA3 units and X GB of DDR5 or HBM onto Infinity Fabric with X number of Ryzen 4 cores and have insane performance, beating any CPU/GPU combo. The question is, how much would it cost?

Edit: Hell, add a few TB of nonvolatile storage to that Infinity Fabric while we're at it.
 
AMD could graft X number of RDNA3 units and X GB of DDR5 or HBM onto Infinity Fabric with X number of Ryzen 4 cores and have insane performance, beating any CPU/GPU combo. The question is, how much would it cost?

Edit: Hell, add a few TB of nonvolatile storage to that Infinity Fabric while we're at it.
True... when your die grows to 30% of an ATX motherboard and your power delivery needs grow with it, at what point are you designing something JUST to win a number rather than to be marketable?

The die sizes coming out of these new APUs from Apple are, I am rather confident, HUGE. The question is: how big is too big? Do we want motherboards having to deliver 300, 500, 1000 watts of power to the CPU, to be handled by what... a dual-radiator loop?

With a nigh-unlimited number of CCXs, memory, and GPU processing cores of your choice, we could easily build something that is going to topple everything else. But the cost would be ASTRONOMICAL.
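
For a feel of why the cost balloons, here's a minimal Poisson yield sketch; the defect density and edge-loss factor are assumed values for illustration, not foundry data:

import math

# Simple Poisson yield model: the share of defect-free dies falls
# exponentially with die area. Defect density and edge-loss factor
# below are assumptions for illustration, not foundry data.
WAFER_DIAMETER_MM = 300
DEFECTS_PER_CM2 = 0.1

def dies_per_wafer(die_area_mm2: float) -> float:
    """Rough candidate dies per 300mm wafer, with ~15% edge loss."""
    wafer_area = math.pi * (WAFER_DIAMETER_MM / 2) ** 2
    return wafer_area / die_area_mm2 * 0.85

def yield_fraction(die_area_mm2: float) -> float:
    """Fraction of dies with zero defects under a Poisson model."""
    return math.exp(-DEFECTS_PER_CM2 * die_area_mm2 / 100)

for area in (100, 432, 864):  # small die, M1 Max-sized, M1 Ultra-sized
    good = dies_per_wafer(area) * yield_fraction(area)
    print(f"{area:4d} mm2 die: ~{good:5.0f} good dies/wafer "
          f"(yield {yield_fraction(area):.0%})")
# Doubling die area halves the candidate dies per wafer AND squares
# the per-die survival probability, so cost per good die climbs
# much faster than linearly.

That's also a big part of why both Apple and AMD reach for interposers and chiplets instead of one monolithic monster die.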
 
Can't say I agree here - Apple just very spectacularly glued two M1 Max dies together to produce the M1 Ultra, and that's something Intel and AMD do every time they double core counts (same for AMD and Nvidia in terms of GPUs). Apple's solution does appear to make it more likely that workloads will actually see that doubling of raw performance, but again, there's some selection bias involved, and we can't really test gaming on the M1 Ultra directly against an RTX 3090.
I think this is closer to SMP with multiple CPU packages on a motherboard, or SLI/CF - or the best example I can think of is those double-GPU cards like the 5970 or GTX 690, and to some extent the chiplets that AMD uses in its CPU packages.

When you add cores to a package, you usually still have common memory and I/O access, a common cache level, and a few other things that help keep things moving along nicely when you have work that needs to cross computational units.

Yeah, there are two dies glued together in the same package. But they don't share common cache or registers or anything - they have to work over that interposer. And that interposer is the real news story about the M1 Ultra.
 
The die sizes coming out of these new APUs from Apple are, I am rather confident, HUGE.
The M1 Max is 432mm2. I can't find the official size of the Ultra, but given that it's two M1 Max dies on an interposer, 864mm2 isn't going to be far from the mark. It's not fabbed as one big die, though; it's fabbed as two 432mm2 dies and then placed (very carefully) on an interposer (part of the reason I say it's different from just adding more cores to a die - more akin to AMD's chiplets). It clocks in as the commercially available processor package with the highest transistor count, at 114 billion transistors.

Compared to:
RTX 3090 - 628mm2, 28 Billion transistors
Core i9 12900k - 215mm2, transistor count not disclosed (believed to be approx 10.5 Billion)
AMD Epyc 7742 (64-core) - 74mm2 per CCD, 416mm2 IOD, 1,008mm2 package, 29.5 Billion transistors

Largest die I could find: Cerebras Wafer Scale Engine 2 - 46,225mm2, roughly 2,600 billion transistors
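
Dividing those transistor counts by the die areas gives a rough density comparison (a quick sketch using the figures above, with the M1 Ultra treated as ~864mm2 across both dies):

# Rough transistor density from the figures quoted above
# (die area in mm2, transistor count in billions).
chips = {
    "M1 Ultra (2x M1 Max)":  (864, 114),
    "RTX 3090 (GA102)":      (628, 28),
    "Core i9-12900K (est.)": (215, 10.5),
}

for name, (area_mm2, xtors_billion) in chips.items():
    mtr_per_mm2 = xtors_billion * 1000 / area_mm2  # millions of transistors/mm2
    print(f"{name}: ~{mtr_per_mm2:.0f} MTr/mm2")
# -> roughly 132 vs 45 vs 49 MTr/mm2. The gap mostly reflects process
#    node (TSMC N5 vs Samsung 8N vs Intel 7), not design skill alone.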
 
Yes, AMD could make an HBM-laden, Ryzen-laden, RDNA/CDNA-laden god-mode APU. Without Reality Distortion Field capability, it wouldn't sell. Intel, AMD, and Nvidia shouldn't have a problem optimizing up the wazoo for Adobe either; perhaps they should, just to screw Apple. I wouldn't put it past Nvidia to do it and include it in every new Nvidia card if ever someone in the company is bored/angry.
 
Intel, AMD, and Nvidia shouldn't have a problem optimizing up the wazoo for Adobe either
I don't have any sales figures, but it wouldn't be hard to imagine that Adobe sells more licenses for Windows than Mac, since something like 93% of all PCs run Windows.

So, I would think, it is already optimized out said wazoo, within the constraints of the existing designs.

That said, the design targets for those products are a lot different. There's a very good chance Apple optimized their hardware around tasks like Adobe and Final Cut workloads, whereas Nvidia/AMD are optimizing for gaming and AI, and Intel is optimizing for Passmark.

So yeah, you are right; now that I've talked myself around in a circle, there would be room for the 3 Stooges to optimize further, but I would think it might come at the expense of some of the optimizations they are implementing for more popular or general use cases.

(I kid with the Passmark part there - mostly)
 
What do you think would happen if Adobe asked Intel, AMD, or Nvidia to make custom, Adobe-optimized box hardware? The plan would be for Adobe to rent A-boxes for some amount of money a month.
Adobe could create a bare OS with all the needed goodies for running their stuff: a browser, some Adobe shop/store, and perhaps some office software and such.
If they are smart, they would allow the system to be open enough for general-purpose use, which would give them a headache, but would guarantee much more potential for sales (not saying they would have the sales, just more potential).
Of all three, I think Nvidia would be most capable of doing this relatively quickly, but I think all three could do it in at least the equivalent development time of a console. All of Apple's advantages, plus faster everything else if needed, would most likely be the result.
All I'm saying is Apple ain't doing miracles; they are unquestionably doing an excellent job in their, for now, safe and relatively tiny slice of the market.
 