NVIDIA Drops DisplayHDR 1000 Requirement from G-SYNC Ultimate Certification Process

Tsing

The FPS Review
Staff member
Image: NVIDIA (G-SYNC chip)



NVIDIA’s G-SYNC Ultimate certification used to be reserved for gaming monitors capable of at least 1,000 nits of peak brightness (i.e., VESA DisplayHDR 1000), but that’s no longer the case. As spotted by PC Monitors, the requirement was removed sometime in November and replaced with something much vaguer: “Lifelike HDR.”



Old: Features the latest NVIDIA G-SYNC processors to deliver the very best gaming experience, including HDR, over 1000-nits brightness, stunning contrast, cinematic color, and ultra-low latency gameplay.



Revised: Features the top NVIDIA G-SYNC processors to deliver the very best gaming experience, including lifelike HDR, stunning...

Continue reading...


 
Probably because otherwise they wouldn't have any? I'm not sure, but HDR 1000 seems to be pretty rare in PC monitors with high refresh rates and other gamer-oriented features.
 
And it's almost impossible to achieve without stout FALD arrays. Even OLED isn't hitting "HDR 1000", which means there's no way to implement the technology without serious artifact potential.

The bigger problem of course is that 'HDR x' has very little overall meaning. "HDR 400" basically means "not HDR", "HDR 600" translates to "not the worst HDR", and "HDR 1000" to "expensive FALD array and all the assorted downsides".

And none of it really translates into what we really want: a monitor. You know, one that outputs the signal it is given as transparently as possible by default.
 

My Acer CG437K seems to be able to hit the required brightness, but the picture is **** when HDR is used. HDR on this monitor wasn't the foremost thing on my mind, fortunately, because it's not good at it. The display just gets so bright you can't look at it, but apparently it lacks the local dimming required to actually make HDR worth a ****. The image is great without HDR, though, and I don't have too many complaints. For working and gaming, it's awesome.
 