Pacman: “Unpatchable” Flaw Discovered in Apple M1 Chips

Tsing
The FPS Review · Staff member · Joined May 6, 2019
Researchers with MIT's Computer Science and Artificial Intelligence Laboratory have discovered a flaw in Apple's ARM-based M1 chips, a seemingly significant one that has been described by the researchers as "unpatchable."

 
Oh, that's no good. Enterprises will have to rethink adoption of MacBook Pros.
 
Oh, that's no good. Enterprises will have to rethink adoption of MacBook Pros.
Please, what enterprise ever considered Macs in the past, oh, few decades?

The C-level execs because they can get whatever they want, and the creatives in marketing may get them, but rank and file - nah. No one is putting anything "critical" on a Mac. And this is coming from an OS X fan.
 
Please, what enterprise ever considered Macs in the past, oh, few decades?

The C-level execs because they can get whatever they want, and the creatives in marketing may get them, but rank and file - nah. No one is putting anything "critical" on a Mac. And this is coming from an OS X fan.
Plenty of our programmers use Macs... Why, I can't say, other than that Windows doesn't matter when you're doing SOAP projects.
 
...but Macs are inherently secure and can't be compromised or get malware or viruses.

That's how "magical" works!

:p
 
Plenty of our programmers use Macs... Why, I can't say, other than that Windows doesn't matter when you're doing SOAP projects.

I think it must be industry and/or location specific.

In my 20 years across 9 different companies in the medical device industry in New England I have never seen a Mac in a work setting.

Many of those included complex capital equipment with software components (thus requiring software development people).

In fact, all but one of them were Dell shops, using nothing but Dell laptops. The other was an HP shop. In all but one of those Dell shops they issued nothing but Latitude laptops.
 
...but Macs are inherently secure and can't be compromised or get malware or viruses.

That's how "magical" works!

:p
Apple doesn't seem too spooked. Of course they would seek to downplay it - saying that yes, the hardware vulnerability exists, but the OS has other security systems that prevent it from being an immediate issue.

Whether that's actually true or not, or whether that's even the best approach -- as opposed to, say, issuing a day-zero mitigation patch, which, even if it doesn't really do anything, shows the company did something other than say "it's fine" -- remains to be seen. If it does prove to be true, then it's another example of "magic" where having integration between both hardware and software is paying off again. If it doesn't, it's another example of Apple being ambiguous and nonchalant about security and updates to the detriment of their users.
 
Apple doesn't seem too spooked. Of course they would seek to downplay it - saying that yes, the hardware vulnerability exists, but the OS has other security systems that prevent it from being an immediate issue.

Whether that's actually true or not, or whether that's even the best approach -- as opposed to, say, issuing a day-zero mitigation patch, which, even if it doesn't really do anything, shows the company did something other than say "it's fine" -- remains to be seen. If it does prove to be true, then it's another example of "magic" where having integration between both hardware and software is paying off again. If it doesn't, it's another example of Apple being ambiguous and nonchalant about security and updates to the detriment of their users.

Honestly, I have absolutely zero faith in anything Apple says.

They have a very long history of being completely opaque in the face of vulnerabilities and being slow as molasses when it comes to patching, and that assumes that, contrary to what the article states, patching is even an option.

If the assessment that the vulnerability is "unpatchable," as the article suggests, is true, that would mean that no amount of "other security systems in the OS" would mitigate it either.

I guess it all comes down to who you believe more, and neither are in a particularly trustworthy position. Apple because - well - history, and security researchers because they of course want to make a name for themselves and hype their findings.
 
I think it must be industry and/or location specific.

In my 20 years across 9 different companies in the medical device industry in New England I have never seen a Mac in a work setting.

Many of those included complex capital equipment with software components (thus requiring software development people).

In fact, all but one of them were Dell shops, using nothing but Dell laptops. The other was an HP shop. In all but one of those Dell shops they issued nothing but Latitude laptops.
I’ve worked for two IT Fortune 100s where basically every developer uses a Mac, and I’ve been the oddball using Windows. I’ve also worked for non-IT companies of similar size where there are no Macs to be seen.
 
I’ve worked for two IT Fortune 100s where basically every developer uses a Mac, and I’ve been the oddball using Windows. I’ve also worked for non-IT companies of similar size where there are no Macs to be seen.

A good friend of mine from college is a software developer. He has worked all over the place in research at Boston Children's and the Broad Institute, and is now at Twitter.

We haven't spent much time talking about build environments lately, but from what I recall he mostly gets to pick his own, and has chosen to develop on PC hardware, but under Linux, not Windows.

I'd imagine the use of Macs in development is really just to get at some of the *nix style development abilities, and because they are shiny status symbols that make them feel important :p

I wonder if that changes at all as Apple moves away from x86. I'd imagine that makes it more difficult to test your own code unless you are developing specifically for the M1/M2 arch.

I don't know enough about it though. I've never developed software.
 
A good friend of mine from college is a software developer. He has worked all over the place in research at Boston Children's and the Broad Institute, and is now at Twitter.

We haven't spent much time talking about build environments lately, but from what I recall he mostly gets to pick his own, and has chosen to develop on PC hardware, but under Linux, not Windows.

I'd imagine the use of Macs in development is really just to get at some of the *nix style development abilities, and because they are shiny status symbols that make them feel important :p

I wonder if that changes at all as Apple moves away from x86. I'd imagine that makes it more difficult to test your own code unless you are developing specifically for the M1/M2 arch.

I don't know enough about it though. I've never developed software.
For my current job working on cloud services, it doesn’t matter whether you do the work on a Mac or on Windows, and the company pretty much sets you up to pick a Mac. Why is that, you might ask?

I had the following choices for my laptop:

Windows: Toshiba Portege 14” with an i5 (specific model not disclosed), 16 GB RAM, 1 TB SSD. Can request a replacement after 30 months.

Mac: 14” MacBook Pro w/ M1 Max, 64 GB RAM, 4 TB SSD. Replace after 72 months.

My boss specifically asked that I get a Windows box, as I’m very familiar with Windows and he wanted someone running Windows on the team. I got him to approve the executive Windows option - it was still a Toshiba, but it was more reasonably equipped.
 
I wonder if that changes at all as Apple moves away from x86. I'd imagine that makes it more difficult to test your own code unless you are developing specifically for the M1/M2 arch.
Seems like most code runs on some remote server in the cloud these days. Not a ton of local development that's too architecture-specific. If you wanted to, you could do your development on an HP-48G via telnet and FTP.

Apart from platform-specific development, the biggest exception I can think of is if you wanted to do mobile development. You ~can~ do it on a PC, but it's a heck of a lot easier on a Mac. For almost everything else, and even some cross-platform development, it doesn't really matter what you use as the development platform - just whatever you are comfortable with.

I like the native unix stuff, and it has all the standard unix servers and stuff (SSH, VNC, etc) that make it easy when dealing with other linux stuff. Not to say it can't be done on a PC too, it absolutely can. But I use that as an excuse; the real reason is I've been using Macs for decades, and when I have to get work done I just prefer them to Windows, or even a Linux GUI.
 
I have found I really like WSL. I get all the benefits of Linux without having to connect to a remote host.
 
I have found I really like WSL. I get all the benefits of Linux without having to connect to a remote host.

You know, you can install Linux bare metal locally, right? :p

I'm curious, what do you use it for? I've found WSL to be interesting, but I have yet to figure out a motivation for using it instead of having an actual distribution installed locally on bare metal.
 
You know, you can install Linux bare metal locally, right? :p

I'm curious, what do you use it for? I've found WSL to be interesting, but I have yet to figure out a motivation for using it instead of having an actual distribution installed locally on bare metal.
I want Windows for, among other things, Office, Paint, Slack, Zoom, Teams, and whatever I need to install for a client call. During a call, I can seamlessly screen share and move back and forth between Windows and Linux as needed. I need WSL to, at a minimum, add PKCS#11 support to my laptop so I can connect to production systems if necessary.

Once I started using WSL a little, I found I wanted to do my real performance work (a crazy mix of Bash / Python / Perl / Java / SQL) locally on the laptop too, to avoid always having to SSH to a Linux VM. The SSH-to-Linux-VM plan works fine at times, but it gets annoying to do things like use PyCharm when the VPN hits a batch of 750ms+ latency.
 
Every time I see "pacman" I think of the package manager from Arch-based Linux distros. And it took years for me to stop seeing Namco's character whenever I and others were talking about Arch's package manager.

It still places the yellow pie chart dude in my head.

I never used Arch. I like apt too much to ever switch to a non-Debian based distro.
 
It still places the yellow pie chart dude in my head.
Yyyeeeaaahhh, I lied. As a life-long video gamer, it's impossible not to see Namco's Pac-Man. I guess what I meant was that I and others started calling the thing by its full name, "Package Manager," and in those cases, yeah, I don't see Pac-Man. But most people still call it "pacman," and it's impossible to get away from the character at that point. The GUI version is called "pamac," and that one is also fine for not automatically envisioning the dot-eating yellow circle dude.
 