I think the monitor HDR standard is somewhat ridiculous.
Yes, the 2000 refers to peak nits, but I don't think that's the right thing to be measuring. It should be closer to contrast ratio: the difference between a screen's blackest black and brightest white as shown simultaneously in a single picture. It's not quite the same as contrast ratio, though. CR is the darkest black vs. the brightest white, but those two values don't have to be simultaneous for that measurement; you can go from an all-black screen to an all-white screen. The HDR level should be measured with both in the same image.
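To make that concrete, here's a rough back-of-the-envelope sketch (plain Python, with made-up luminance numbers since I don't have real measurements in front of me) of how the two measurements differ:

```python
# Rough sketch of the two measurements, with hypothetical example numbers.
# All luminance values are in nits (cd/m^2).

# Sequential ("full-on / full-off") contrast ratio:
# black measured on an all-black frame, white measured on an all-white frame.
full_off_black = 0.0005   # hypothetical black-frame luminance
full_on_white  = 1000.0   # hypothetical white-frame luminance
sequential_cr = full_on_white / full_off_black

# Simultaneous contrast: both patches measured on the SAME frame
# (e.g. a checkerboard, or a bright star on a black field). On a zone-dimmed
# LCD the black patches pick up bloom from the lit areas, so the black rises.
simultaneous_black = 0.05   # hypothetical black-patch luminance with whites on screen
simultaneous_white = 950.0  # hypothetical white-patch luminance on the same frame
simultaneous_cr = simultaneous_white / simultaneous_black

print(f"sequential contrast:   {sequential_cr:,.0f}:1")
print(f"simultaneous contrast: {simultaneous_cr:,.0f}:1")
```

Same panel, wildly different numbers, and it's the second one that actually tells you what a starfield scene is going to look like.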
2000 nits is absurdly bright as it is, so maybe that makes this moot. But let's step back a bit and make a sample image: say a flare, or a bright star against totally black empty space. Something where just a small portion of the screen is at max brightness and the rest is pitch black. And if you want to be a stickler, you could also test the inverse image: majority white, with a small portion of the screen black.
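If you want to actually generate that test pattern, something like this would do it (a quick sketch assuming numpy and Pillow are available; the resolution and patch size are arbitrary):

```python
import numpy as np
from PIL import Image

# Arbitrary resolution and patch size, just for illustration.
W, H = 3840, 2160
PATCH = 100  # side length of the small square, in pixels

def small_patch_pattern(fg=255, bg=0):
    """A mostly-`bg` frame with a small centered `fg` square."""
    img = np.full((H, W), bg, dtype=np.uint8)
    y0, x0 = (H - PATCH) // 2, (W - PATCH) // 2
    img[y0:y0 + PATCH, x0:x0 + PATCH] = fg
    return Image.fromarray(img)

# Bright star on black, and the stickler's inverse: black dot on white.
small_patch_pattern(fg=255, bg=0).save("star_on_black.png")
small_patch_pattern(fg=0, bg=255).save("hole_in_white.png")
```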
An OLED is going to be able to render that, even with just HDR400. The difference between its darkest black and brightest white, combined with the fact that it doesn't need FALD to keep the rest of the screen from washing out while it delivers that brightest white, makes for a totally different-looking picture than, say, an IPS with edge lighting, notorious backlight bleed, and no FALD zones.
I would say that, in that race, the biggest advantage OLED brings isn't the perfect black it offers, but that it doesn't need FALD at all: it has perfect per-pixel brightness control. The peak brightness doesn't really matter in that case. With a backlit LCD, on the other hand, you need sheer overwhelming brightness to overcome that lack of control and whitewash (for lack of a better term) the fact that backlight bleed is washing out the contrast in some areas of the screen.
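To put some numbers on that, here's a toy model I threw together (my own simplification with made-up leakage and zone figures, not how any certification measures anything): every backlight zone gets driven to the brightest pixel it covers, and black pixels sharing a zone with the star leak a fixed fraction of that zone's light. Per-pixel control is just the degenerate case of a 1-pixel zone:

```python
import numpy as np

# Toy model: the frame is a luminance target in nits, the backlight is split
# into square zones, each zone is driven to its brightest pixel, and the LC
# panel leaks a fixed fraction of the zone's backlight through "black" pixels
# (a crude stand-in for bloom / backlight bleed).
H, W = 480, 960          # downscaled frame so the zone sizes divide evenly
LEAKAGE = 0.001          # hypothetical panel leakage fraction
PEAK = 1000.0            # hypothetical peak nits

def zoned_black(target, zone_size):
    """Worst leaked black level and fraction of the frame washed out."""
    zh, zw = target.shape[0] // zone_size, target.shape[1] // zone_size
    blocks = target.reshape(zh, zone_size, zw, zone_size)
    zone_max = blocks.max(axis=(1, 3))                      # backlight per zone
    washed = (blocks == 0) & (zone_max[:, None, :, None] > 0)
    worst = zone_max[washed.any(axis=(1, 3))].max() * LEAKAGE if washed.any() else 0.0
    return worst, washed.sum() / target.size

# Small bright star in the middle of an otherwise black frame.
frame = np.zeros((H, W))
frame[230:250, 470:490] = PEAK

for zone_size in (120, 20, 1):                # coarse zones, dense FALD, per-pixel
    worst, frac = zoned_black(frame, zone_size)
    print(f"zone size {zone_size:>3}px: black near the star ≈ {worst:.2f} nits, "
          f"{frac:.1%} of the frame washed out")
```

More zones shrink the halo, but the pixel right next to the star still never hits true black; only the per-pixel case does, and that's exactly the advantage no amount of extra peak brightness buys back.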