Acer Releasing 49-Inch Ultrawide VA Monitor with 2,048 Local Dimming Zones and DisplayHDR 2000 Support

Tsing

The FPS Review
Staff member
It wasn’t too long ago that Taobao unveiled the complete specifications for Samsung’s updated Odyssey G9 monitor, but the Chinese electronics retailer has now listed Acer’s equivalent: the EI491CRG9, a 49-inch ultrawide display that appears to use the same 5120 x 1440 VA panel as the Odyssey, complete with a 240 Hz refresh rate and a Mini LED backlight with 2,048 local dimming zones. Acer’s EI491CRG9 also boasts NVIDIA G-SYNC compatibility and VESA DisplayHDR 2000 certification, which means it can achieve peak brightness levels of up to 2,000 cd/m² with HDR content. U.S. pricing is unclear, but the display will cost 14,999 RMB ($2,300) in China...

Continue reading...


 
Relative to other monitor pricing, $2,300 would be a steal.

That said, I think 2000 nits is way too bright for a monitor.
 
That said, I think 2000 nits is way too bright for a monitor.
Thankfully, that's not what the monitor actually has to sustain; DisplayHDR 2000 means it can hit that level only briefly, and only on part of the screen.

Still, I'd rather not be in front of the screen when it tries, at least while gaming. I get the utility of the effect for cinematics, of course. I just don't want my opponents' (or my teammates' 😵) flashbangs to be that effective!
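
A toy Python sketch of why that distinction matters (my own illustrative model, not the VESA test spec; all the numbers are made up): with a fixed backlight power budget, a small window can be driven far brighter than a full white field.

FULL_SCREEN_NITS = 750     # hypothetical sustainable full-field level
PANEL_CAP_NITS = 2000.0    # hypothetical per-pixel hardware ceiling

def peak_nits(window_fraction):
    # Funnel the whole power budget into the bright window, then
    # clip at what the hardware can physically emit.
    return min(PANEL_CAP_NITS, FULL_SCREEN_NITS / window_fraction)

for frac in (0.01, 0.10, 0.50, 1.00):
    print(f"{frac:>4.0%} window -> {peak_nits(frac):4.0f} nits")
# 1% and 10% windows hit the 2,000-nit cap; a full field stays at 750.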
 
I think the monitor HDR standard is somewhat ridiculous.

Yes, the 2000 refers to peak nits. But I don't think that's the right thing to be measuring. It should be closer to contrast ratio: the difference between a screen's blackest black and its brightest white as shown simultaneously in one picture. It's not quite the same as contrast ratio, though; for the CR measurement, those two values don't have to be simultaneous -- you can go from an all-black screen to an all-white screen. The HDR level should measure a screen with both in the same image.
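
For what it's worth, the simultaneous measurement I'm describing is basically what's usually called ANSI (checkerboard) contrast, as opposed to full-on/full-off contrast. A quick Python sketch of the arithmetic, with made-up luminance readings:

# Made-up luminance readings (cd/m²) for a hypothetical edge-lit panel;
# only the arithmetic matters here.
white_full_field = 450.0   # all-white screen
black_full_field = 0.15    # all-black screen
# With a checkerboard on screen, bloom raises the black patches and
# power limiting pulls the white patches down a bit:
white_checker = 400.0
black_checker = 0.60

full_on_full_off = white_full_field / black_full_field   # sequential
simultaneous = white_checker / black_checker             # same frame

print(f"full-on/full-off contrast: {full_on_full_off:.0f}:1")  # 3000:1
print(f"simultaneous (ANSI):       {simultaneous:.0f}:1")      # 667:1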

2,000 nits is absurdly bright as it is... so maybe that kinda makes this moot. But let's step back a bit. Let's make a sample image -- say, a flare, or a bright star against totally black, empty space. Something where you have just a small portion of the screen at max brightness and the rest at pure black. And if you want to be a stickler, you could have the inverse image -- one that's majority white, with a small portion of the screen black.
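
If you wanted to build that test pattern yourself, here's a minimal NumPy sketch (resolution, patch size, and values are all arbitrary):

import numpy as np

def highlight_pattern(h=1440, w=5120, patch=64, inverse=False):
    # Small max-brightness patch centered on a black field,
    # or the stickler's inverse: a small black patch on white.
    bg, fg = (255, 0) if inverse else (0, 255)
    img = np.full((h, w), bg, dtype=np.uint8)
    y0, x0 = (h - patch) // 2, (w - patch) // 2
    img[y0:y0 + patch, x0:x0 + patch] = fg
    return img

star_on_black = highlight_pattern()                # flare in empty space
white_with_black = highlight_pattern(inverse=True) # the inverse image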

An OLED is going to be able to do that even with just DisplayHDR 400. The difference between its darkest black and its brightest white, combined with the fact that it doesn't need FALD to keep the rest of the screen from washing out while delivering that brightest white, makes for a totally different-looking picture than, say, an IPS panel with edge lighting, notorious backlight bleed, and no FALD zones.

I would say, in that race, the biggest advantage OLED brings isn't the perfect black that it offers, but rather that it doesn't need FALD; it has perfect per-pixel brightness control. It doesn't really matter what the peak brightness is in that case. Whereas with a backlit LCD, you need sheer overwhelming brightness to overcome that lack of control and whitewash (for lack of a better term) the fact that your backlight bleed is washing out the contrast ratio in some areas of the screen.
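
Here's a toy NumPy simulation of the difference (all numbers invented): displayed luminance is modeled as backlight times panel transmission, so a zone lit up for a small highlight leaks light into the black pixels that share it, while a per-pixel emissive display doesn't.

import numpy as np

NATIVE_CONTRAST = 3000.0       # VA-ish panel: it can't block all light
ZONE = 8                       # 8x8 pixels per dimming zone

target = np.zeros((32, 32))    # black field...
target[15:17, 15:17] = 2000.0  # ...with a tiny 2,000-nit highlight

# FALD backlight: each zone lights up to the brightest pixel it holds.
zones = target.reshape(4, ZONE, 4, ZONE).max(axis=(1, 3))
backlight = np.kron(zones, np.ones((ZONE, ZONE)))

# The LCD layer dims each pixel toward the target, but its minimum
# transmission is limited by the panel's native contrast.
transmission = np.clip(target / np.maximum(backlight, 1e-9),
                       1 / NATIVE_CONTRAST, 1.0)
fald = backlight * transmission

print(f"black pixel next to the star, FALD: {fald[14, 15]:.2f} nits")    # 0.67
print(f"black pixel next to the star, OLED: {target[14, 15]:.2f} nits")  # 0.00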
 
I would say, in that race, the biggest advantage OLED brings isn't the perfect black that it offers, but rather that it doesn't need FALD; it has perfect per-pixel brightness control.
That's the alpha and the omega right there.

It's why I have zero interest in FALD, as it literally means 'no per-pixel backlighting', and I don't do anything on a desktop monitor where I'd prefer that to uniform backlighting (and contrast), even at a lower peak brightness.
 

Vincent from HDTVTest did a good video on brightness, showing how things you wouldn't really think of as bright actually measure pretty high in nits.

 