Windows 11 Users Are Reporting Significantly Worse NVMe SSD Performance

I went 95, 98, 2K, Vista, 7, and then 10 for my main machine. Vista was bad at launch but better after the first big update, and given that 7 was an updated Vista, it was fine.

Seems I'm one of the few people that did not get XP, as I was happy with 2K. I did use 8.1 on my laptop since it came preinstalled, but I have since upgraded it to 10; I did not find anything wrong with 8.1.

Probably will go 11 if I get an Alder Lake setup.
 
2K was a good, solid release. It was marketed more towards corporate use than home/mainstream use, so a lot of people never saw it.

8.1 fixed a lot that was troublesome with 8 - seems like they were really trying to push everyone to a touch interface, but it came at the expense of the desktop UI rather than in addition to it, which was pretty much how 8.1 "fixed" it.
 
seems like they were really trying to push everyone to a touch interface
Yeah, I think they believed that touchscreens would take over in every environment and hadn't fully realized that the format still just isn't efficient in a lot of them.

What I hated the most with 10 was how you never knew when the UI would remove or move something in an update unless you really tracked the release notes or news bites. I mean, sure, you could always use the cmd prompt or type it into the search bar, but it felt ridiculous that something like Control Panel or Device Manager could move all over the place.
 
I started with DOS 6.22 and never got into Windows until Windows 95; then I did them all. Amazing the power the command line still holds. Just look up the command takeown as an example; used with paths it is crazy powerful.
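
For anyone curious, a rough sketch of what that looks like (the folder path here is just a made-up example, run from an elevated Command Prompt):

takeown /f C:\Stubborn\Folder /r /d y

That takes ownership of the folder and everything under it (/f names the target, /r recurses, /d y supplies a default answer for subfolders you can't list), and it's commonly followed by something like

icacls C:\Stubborn\Folder /grant Administrators:F /t

to grant the Administrators group full control down the whole tree.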
 
Amazing the power the command line still holds
Indeed!

I got away from command line stuff on Windows, but Windows for me is more of a vehicle to play games, and when needed, run the occasional piece of software that won't run anywhere else.

That said, I'm in a Linux CLI every day, and do quite a bit in the OS X terminal as well. For ~most~ things, once you know the environment, the CLI is going to be faster, though I fully admit that a GUI does tend to be more intuitive and easier to pick up with little to no instruction.
 
I started with DOS 6.22
Been so long I can't even remember which version of MS-DOS I was using back in the day. I transitioned into Windows from my old Atari 400, which had been modded/upgraded enough (got it in 1980) that it lasted into the late 80s, and sometime back then my dad got me a Tandy 1000 EX to 'break me in' on Windows. At that point, I'd already been through several versions of Atari DOS and even gained a little familiarity with CP/M due to what they cloned from it. I used that Tandy into the early 90s, then took a break from PCs and gaming in general, and didn't get back to it until a Pentium II was handed down to me which had 95 or 98 on it.

Not long after, I bought and completely rebuilt a P4 system with XP, and that thing chugged along for almost ten years until I got a Core 2 Quad which came with Vista. That was my last pre-built, upgraded most parts on it too, and ever since I have built my own and fixed other people's, plus maintained the machines at my job. During these phases, I've had to deal with all the subsequent versions of Windows for one reason or another. I'm kind of taking a break with 11 since I'm simply burned out on OS upgrades between MS/Apple/Android at this point.

Edit: Just to add that I used Atari DOS 1.0 through 4.0
 
I guess I haven't skipped any since 3.0


Me sucked. 8 sucked. Vista was maligned, but I didn't think it was ~that~ horrible; there was just a huge buildup because we had been on XP forever and it felt like a letdown. XP was great for the time - it finally made Windows stable, and 7 was its best successor.

Seems like there have been 3 major generations of Windows... at least with respect to design philosophy: the original through 3.11, 95 through 7, and then 8 through today. 95 was the biggest jump, but it wasn't a great release - something as benign as an improper shutdown corrupting the Registry meant routine re-installs, among other issues. 8 was almost as big a jump as 95.

10 had promise - it was supposed to be the XP of the 3rd generation of Windows - something that finally brought stability to the design. But they could never seem to leave the design alone and kept changing the layout. I hated it for that.

That said, throughout all of these iterations, Windows hasn't really changed that much from the front face. An OS shouldn't do much other than facilitate applications to run, and apart from that it should stay out of the way. The best releases in my opinion have been the ones that do that the best.
You left out Windows 2000, which is IMO the most important OS, as it cemented the future of Windows with the NT kernel.

Vista was bad, really really bad, but the worst part is that it probably came too early. Vista sucked in large part because it needed much more RAM than the 512 MB it listed as a requirement. At the time 512 MB was about the most you could have in a PC, although 1 GB came out soon after.

But even with 1 GB+, Windows 7 felt much faster and more responsive. I recall upgrading dozens of Vista PCs to Win7 and the performance difference was noticeable.

IMO Win 8.1 was actually quite nice; we had several PCs at work with it. It's just that Windows 7 was as good or even better.
 
I started with DOS 6.22 and never got into Windows until Windows 95; then I did them all. Amazing the power the command line still holds. Just look up the command takeown as an example; used with paths it is crazy powerful.
I recall 6.22 being very popular back in the day, but that was not my first.
 
You left out Windows 2000, which is IMO the most important OS, as it cemented the future of Windows with the NT kernel.
And after a service pack (or two?), it could game. That cemented the future of the NT branch amongst enthusiasts as well!

Vista was bad, really really bad, but the worst part is that it probably came too early.
As with Windows 98 to 95, XP to... Me?, and more recently 10 to 8 / 8.1, Vista was chock full of features and a very different underlying approach that just flat out broke many things; on top of all that, it was just plain slow for the first few service packs.

Vista also brought the first mainstream 64-bit (that is, x86-64, not IA-64) desktop Windows OS. Once it was patched to parity with Windows 7, there was little to differentiate the two for average users.

And I still lament the demise of gadgets as well as the Sidebar. With ultrawide and multi-monitor setups, sidebar apps would be useful today, especially if Microsoft had continued to evolve the API to make interfacing gadgets with more sources of information possible.

But even with 1 GB+, Windows 7 felt much faster and more responsive. I recall upgrading dozens of Vista PCs to Win7 and the performance difference was noticeable.
Patch for patch (at least the last Vista patches next to the equivalent patches for Windows 7), they should be nearly indistinguishable. A fresh install of a late-vintage Vista build wasn't that bad IIRC; Microsoft just had to cut ties to the Vista name given the baggage it carried. I guess what I'm saying is that there's no technical reason they introduced Windows 7, it was all marketing.

IMO Win 8.1 was actually quite nice; we had several PCs at work with it. It's just that Windows 7 was as good or even better.
I ran 8 and then 8.1 as upgrades from Windows 7, but I had a shell replacement running from the start with 8, so I never saw most of what the common misgivings with Windows 8.1 were. To me, they might just as well have been Windows 7, and indeed they really were, with most improvements very much under the hood and generally unneeded for most people.


Overall, the real stinker was Me, or the 'Millennium Edition' for those not aware. It was based off of Windows 98 SE (that is, 'Second Edition'), which was a pretty good base for a consumer OS at the time, but Microsoft over-bloated it and managed to ruin what was otherwise a good thing.
 
You left out Windows 2000, which is IMO the most important OS, as it cemented the future of Windows with the NT kernel.
Too true.
Vista sucked in large part because it needed much more RAM than the 512 MB it listed as a requirement. At the time 512 MB was about the most you could have in a PC, although 1 GB came out soon after.
Mine had 4 GB. I later upgraded it to 8 GB. I'd bought a floor model back in the day for around $700-$800 or something along those lines. This thing was a beast. The 640 GB HDD was actually 2x 320 GB drives in RAID0.

 
You left out Windows 2000
Well I wasn’t trying to list them all, just kinda bookend the generational breaks and hit the highlights. But you are right, 2K had big changes under the hood.
 
2000 was a server-class OS, not a desktop-class OS. Yeah, a lot of us ended up running it... myself included... but it was never intended for desktops or really workstations; it just ended up in those roles as well. I wouldn't say it was left out, but it does warrant a special mention. ;)
 
I never used SE (that I can remember) but definitely have to agree on the rest.
 
This obviously isn't ideal, but honestly, at the speeds these things currently operate, I doubt anyone would even notice +/- 100 MB/s without benchmarking.

The term "significantly" in the title seems maybe a bit hyperbolic.
 
That said, throughout all of these iterations, Windows hasn't really changed that much from the front face. An OS shouldn't do much other than facilitate applications to run, and apart from that it should stay out of the way. The best releases in my opinion have been the ones that do that the best.

This right here. I hate the modern era when operating systems want to be a complete ecosystem complete with cloud features.

I just want an operating system with absolutely nothing else provided, where I make the decisions of what software to install.

I don't want a freaking weather app, or an email client or even a browser preinstalled. I want the OS to contain nothing. It should be an empty canvas consisting simply of a menu structure and system settings on which I can build.
 

Sounds like a Linux Distro is in your future
 