44 Comments
zanon - Monday, December 11, 2017 - link
I'm disappointed they don't give native 10-bit or even 12-bit and BT.2020 at least some level of certification and branding, even if it were an entirely different tier like DisplayHDR 1200 or "DisplayHDR Ultimate" or something. Having larger gamuts be true strict supersets of sRGB requires higher bit depth, and dithering was never good nor "standard" for PC displays. It's regrettable that, even from a simple branding perspective, there isn't a bit more pressure to push the industry forward harder now that we finally have major improvements within reach and can see the end of the road for the basic four corners of displays.
Gothmoth - Monday, December 11, 2017 - link
Yeah, it's just another certification logo that nobody really needs...
ddrіver - Monday, December 11, 2017 - link
VESA is trying to stay relevant. Next standard: "VESA CERTIFIED DisplayDR 100ish".
A5 - Tuesday, December 12, 2017 - link
I suspect there will be a future high-end level that is 12-bit and 90+% of Rec.2020. Probably an intermediate level that is 99% DCI-P3 and 10-bit only, too.
cheinonen - Monday, December 11, 2017 - link
The specs here effectively lock out OLED as a PC HDR display, since OLED cannot pass the full-screen flash requirement, even though that rarely if ever happens in HDR material. They also do not note what size patterns are required for testing (or whether companies can choose their own size, and local dimming systems often perform better with certain sizes), or what the time limit is for the long-duration test. It would have been better if they had chosen color volume as a metric over gamut coverage, but how to measure that effectively is still up for debate right now.
haukionkannel - Monday, December 11, 2017 - link
Well, at least there now is one more standard. 99% of today's HDR displays are pure crap... They can just take in HDR content, not really show it...
Chugworth - Monday, December 11, 2017 - link
The current OLED technology would be terrible for PCs anyway due to its high risk of burn-in.
jordanclock - Monday, December 11, 2017 - link
Current OLEDs do not burn in. They have image retention in some cases, but not burn-in. The difference is that retention is easily resolved with screen savers and leaving the display off for several hours, or in extreme cases by displaying an inverted image at high brightness for a short period.
Manch - Tuesday, December 12, 2017 - link
Yes, they can burn in. While there are several things you can do to keep it from happening, it's still quite possible. Also, a negative image at high brightness corrects it, but at the expense of overall brightness and a shorter lifespan for the screen.
imaheadcase - Tuesday, December 12, 2017 - link
It's also possible a person throws a remote at the screen and it shatters. It's not something you even consider, because it does not happen much. People get paranoid because of a report they saw and assume it's something common.
Every newer OLED screen auto turns off anyways.
yasamoka - Thursday, December 14, 2017 - link
How about personal experience with burn-in on several generations of a device? If you're in doubt, I can show you photos of my own S8+, my brother's S7, and my dad's S6 Edge+, ALL with burn-in (to various degrees, of course).
I don't get this denialism of OLED's inherent issues that I keep seeing on the Internet over the last few years. How about trusting people's experiences when they actually get burn-in and report it? It's not an isolated issue: almost every Samsung Galaxy flagship that I've seen over the years among family and friends has some degree of burn-in, some of it extreme (keyboard). And not only the old devices.
mode_13h - Friday, December 15, 2017 - link
Burn-in for OLED is a very real problem, and one of the main reasons it hasn't yet dominated the professional PC monitor segment. Dell has such a product on the market, but with numerous aids and features to help you keep from burning it in.
Manch - Friday, December 15, 2017 - link
I love the arguments in here.
Burn in is an issue.
No it's not
Argument refuted
Counter argument accepts original premise but dismisses it while still arguing against it.
LOL
Manch - Friday, December 15, 2017 - link
The fact that manufacturers implement a lot of features to prevent it means it's still very relevant and very possible, which multiple tests and user experiences have shown. Burn-in and IR are just something you have to deal with when using emissive displays of any kind.
StevoLincolnite - Tuesday, December 12, 2017 - link
Who the hell wants to go through all that trouble?
Manch - Tuesday, December 12, 2017 - link
Yeah, OLED is a no go for me. Image is great but not worth the IR issues or burn in.
imaheadcase - Tuesday, December 12, 2017 - link
I've had mine on for a month now; I play games on it with Steam Link, watch Netflix, etc. Zero worries about it. People just get scared 'cause they hear a buzzword. No different than someone saying a certain game sucks, don't play it... while the majority have zero issues.
Manch - Friday, December 15, 2017 - link
While anecdotal, I've had two displays from two manufacturers burn in on me. Both were taken back and replaced under warranty. I then sold them and got an LCD. Picture quality is amazing, but not worth it to me. It's not a buzzword. ALL emissive displays have this issue. Either you're willing to deal with what's required to preserve the display or you're not. I'm in the latter camp. I like to game for hours on end or leave my TV running with the news on so I can watch/listen while doing other things. OLED just ain't a good solution for me. The screens have gotten much better, no doubt, but the issue is still there; it can still happen.
06GTOSC - Tuesday, January 23, 2018 - link
I'm still using a plasma TV from 2010. No burn in. Haven't used it as a PC display either. But plenty of gaming.
yasamoka - Thursday, December 14, 2017 - link
Yes, they burn in. Please stop repeating the myth that they don't. When you own or are in the immediate vicinity of several devices with OLED displays, and you see that all of them have burn-in that does not go away, you cannot conclude that this is merely image retention.
piroroadkill - Friday, December 15, 2017 - link
Correct. I don't know why people pretend like degradation of OLEDs does not occur, when it absolutely does, and is part of the design.
imaheadcase - Tuesday, December 12, 2017 - link
That is not really an issue, though, unless you plan on leaving the display on all the time. I've played Dying Light for 6+ hours with no worries on my OLED. I doubt anyone has any issue of note except stores that run it in a loop display mode 24/7.
Ryan Smith - Monday, December 11, 2017 - link
"The specs here effectively lock out OLED as a PC HDR display, since OLED cannot pass the full-screen flash requirement, even though that rarely if ever happens in HDR material."
In fairness to the VESA, they are specifically stating that this standard is only meant to apply to LCDs. OLED displays will require a different specification due to their different properties.
BTW, have you had a chance to read the test specification itself?
romrunning - Monday, December 11, 2017 - link
However, if this rating system gains traction and makes it to the big-box stores and marketing materials, I do expect places like Best Buy to say "this is the best TV you can get with its HDR1000 rating!"
Meanwhile, OLED arguably has better picture quality, but it won't have the "HDR1000" rating. So consumers will be led to believe it can't be better if it doesn't have the "HDR1000" rating. You can't simply trust employees to know/explain the difference, and it doesn't help consumers.
Granted, the above applies to OLED specifically; I do like that VESA is defining the specs more for PC monitors. However, you can definitely see where it would add confusion in big-box stores if it were applied to all of the TVs.
Ryan Smith - Monday, December 11, 2017 - link
Realistically it's highly unlikely you're going to see TVs using this certification process. This will likely only be used for PC displays and laptops, where OLED isn't a factor yet.
cheinonen - Tuesday, December 12, 2017 - link
Unfortunately I can't download the test software as I'm not a VESA member, but some elements of the test process stand out.
"Luminance is measured at the screen's center once per minute for 30 measurements, over 30 minutes, using the same panel. The first measurement should be obtained within 5 seconds of when the white box begins to display."
That 5-second delay is a big thing. One issue with HDR is that highlights can be fleeting: some HDR films have things like fireworks and other fast events that last for fractions of a second but are meant to be really bright. While OLEDs handle this with near-instant pixel response times, some LCD backlights take 3-4 seconds to ramp up, making those HDR highlights impossible to render. By giving a display 5 seconds to ramp up to full white, it might pass the test, but it won't render that kind of content effectively at all. There is a rise-time test later in the document, but it only measures the ramp from 10% to 90%, and it allows 8 frames to get there, which is longer than some HDR highlights last. It's better than nothing, but it's not ideal.
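As a rough illustration of what an 8-frame allowance means in time (back-of-the-envelope numbers only, assuming a brief ~50 ms highlight; nothing here comes from the test spec itself):

```python
# How long an 8-frame rise-time allowance is at common refresh rates, compared
# with a hypothetical ~50 ms specular highlight such as a single firework burst.
ALLOWED_FRAMES = 8
HIGHLIGHT_MS = 50  # assumed duration of a brief HDR highlight

for hz in (24, 60, 120):
    budget_ms = ALLOWED_FRAMES / hz * 1000
    verdict = "slower than the highlight" if budget_ms > HIGHLIGHT_MS else "fast enough"
    print(f"{hz:>3} Hz: 8 frames = {budget_ms:5.1f} ms -> {verdict}")

# 24 Hz: 8 frames = 333.3 ms -> slower than the highlight
# 60 Hz: 8 frames = 133.3 ms -> slower than the highlight
# 120 Hz: 8 frames =  66.7 ms -> slower than the highlight
```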
It also requires 10-bit panels or pipelines, but only 8-bit DACs at the panel level. HDR signals range from 0-1023, but the final output levels might only be 0-255. So while HDR content should not have much, if any, dithering visible in gradients, it is more likely to be visible if you're using 8 bits at the end of the pipeline. I wish that for the higher-end 1000 requirement you had to have 10-bit DACs, so you'd know it's a true 10-bit pipeline all the way through.
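As a quick sketch of why that last 8-bit step matters (purely illustrative; this is not how any particular scaler or panel actually dithers):

```python
import numpy as np

# Feeding a 10-bit gradient (0-1023) through an 8-bit output stage collapses
# every 4 input codes into one output level, which is where visible banding in
# smooth HDR gradients comes from. Dithering hides the bands by spreading the
# rounding error around as noise instead of hard steps.
ramp10 = np.arange(1024)          # full 10-bit gradient
trunc8 = ramp10 >> 2              # straight truncation to 8 bits: hard 4-code-wide steps

rng = np.random.default_rng(0)
noise = rng.uniform(-0.5, 0.5, ramp10.size)               # ~1 LSB of dither noise
dither8 = np.clip(np.round(ramp10 / 4 + noise), 0, 255).astype(int)

print("step width in the truncated ramp:", np.unique(np.diff(np.flatnonzero(np.diff(trunc8)))))
print("output levels after dithering:   ", np.unique(dither8).size)
```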
I also don't like that they use metadata of 10,000 nits for MaxCLL and Mastering Luminance, since no HDR content uses that. All the mastering displays right now are 1,000 nits (Sony BVM) or 4,000 nits (Dolby Pulsar), and 10,000 nits is something with no real-world examples. Displays might react differently to this than they would to real-world content, which makes the data less useful. The corner box test uses completely different metadata, of 600 nits or 96 nits, which also doesn't exist in real-world content right now. This is more like simulating dynamic metadata, but while using HDR10, which doesn't have dynamic metadata.
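Just to make the mismatch concrete (hypothetical field names of my own, not any particular API or the spec's exact encoding):

```python
# HDR10 static metadata values the test patterns reportedly carry, next to the
# peak luminance of the mastering displays actual content is graded on today.
test_signal_metadata = {
    "max_cll_nits": 10_000,                  # MaxCLL in the DisplayHDR test patterns
    "mastering_max_luminance_nits": 10_000,  # likewise far beyond any real master
}

typical_masters = {
    "1,000-nit grade (e.g. Sony BVM)": 1_000,
    "4,000-nit grade (Dolby Pulsar)": 4_000,
}

for name, peak_nits in typical_masters.items():
    ratio = test_signal_metadata["mastering_max_luminance_nits"] / peak_nits
    print(f"{name}: test metadata claims {ratio:.0f}x the real mastering peak")
```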
It's better to have a standard than not to have one, but the issue is that the test parameters for the standard don't align with real-world content that well. At least they're requiring local dimming, but in the end this is likely to confuse people, since companies will still label monitors as HDR when they're only doing 250 nits.
alexdi - Thursday, December 21, 2017 - link
Can you explain how their tests verify the display's contrast ratio? In both the corner box and the tunnel test, they only seem to be measuring black luminance, not the whites. They say the corner boxes are coded to display at 600 cd/m². Does this mean they actually do display this brightness? And how does a 95 cd/m² border and 0.10 cd/m² blacks in the tunnel test equate to the 4000:1 contrast ratio they claim on page 22?
Xajel - Tuesday, December 12, 2017 - link
It does not, because this standard is still optional; an OLED manufacturer can just go for HDR10 or Dolby Vision, or even both, and I still think Dolby Vision is better than the rest.
HDR10 is the lowest HDR standard for me, and seeing that DisplayHDR requires HDR10, I find it okay too; it's better than just saying "HDR" in the spec without specifying anything else.
euskalzabe - Monday, December 11, 2017 - link
While it's not all that I would want it to be, give me a 32" 1440p DisplayHDR-600 monitor and I'll happily fork over $400-500. Price it at $700+ and it's DOA. There's a real risk manufacturers will keep pretending these technologies are amazing and cutting-edge. We all need to stop pretending that 10-bit/DCI-P3/600 nits is top notch. These have been achievable for a while now. It's time to accept that spec as mid-range.
mdriftmeyer - Monday, December 11, 2017 - link
It's rather clear the meat of the market is deliberately aimed at HDR 600. The 1000 tier seems pointless to the vast majority of consumers. If they had required BT.2020 for the HDR 1000 tier to separate it from 600, justifying HDR 1000 would be plausible. Without that, it seems more like a peeing contest between friends over nothing of added value.
Gothmoth - Monday, December 11, 2017 - link
wow... another HDR "standard"..... lol.... and one that sux.
skavi - Monday, December 11, 2017 - link
This is a pretty decent hardware standard IMO. What's your issue with it?
JoeyJoJo123 - Monday, December 11, 2017 - link
Yeah, I agree. It makes it easier for a layman to get an immediate point of reference and knowledge of what's better with a simple certification label when comparing TVs in store, under bad lighting conditions and poorly calibrated showroom colors.
euskalzabe - Monday, December 11, 2017 - link
This is not an HDR "standard", like HDR10, HLG, Dolby Vision, etc. This is a display certification specification. DisplayHDR-400 is certainly nothing interesting, but DisplayHDR-600 definitely brings a good amount of requirements. Why would you think that's wrong?
Alistair - Monday, December 11, 2017 - link
HDR 600 is not so bad. Would love to have a monitor like that!
Exodite - Monday, December 11, 2017 - link
Considering I'm forced to run 0-10 brightness levels on the majority of ~200-nit monitors today to avoid boiling my eyes in their sockets, I confess I struggle to imagine 600-1000 nit levels.
milkywayer - Monday, December 11, 2017 - link
Ditto.
Running my Dell UP2516D at anything over 20 percent brightness is overbearing. I'd love some HDR, but I don't know how 1000 nits up close would feel on a PC.
futrtrubl - Tuesday, December 12, 2017 - link
Remember, that's only for small areas or for short periods. The 350-nit long-term brightness for the 600 level seems reasonable to me.
Exodite - Wednesday, December 13, 2017 - link
That's a good point, thanks!
I'm not in desperate need of HDR on my desktop as it stands, but having access to the technology without putting my eyes under undue strain would be nice.
I suppose I'm just wary about HDR and the general focus on lifelike accuracy and representation in imaging potentially taking precedence over my own needs/preferences in displays over the long run.
invinciblegod - Monday, December 11, 2017 - link
They need a standard for projectors, since projectors have a much lower luminance value. I'm not sure how HDR would work for projectors though.
Frenetic Pony - Monday, December 11, 2017 - link
This is fucking pathetic, as should be expected of PC-centric displays. Phones and TVs get held to high specs, meanwhile this is just... nothing. This is literally nothing; they set it as low as possible just so everyone can make marketing claims.
You know what? The PC deserves to die. The OEMs don't give a shit about competition or price or anything but making a quick buck. I hope something new, some different platform, allows professional work on it so the PC can die completely. I'm tired of buying overpriced, lazily produced equipment just to do my job. Maybe AR glasses will allow you to do this. Sure, hook up your HoloLens wirelessly to some stick that's just a battery and compute power sitting in your pocket. That sounds awesome.
Alexvrb - Monday, December 11, 2017 - link
They have iPhones and Android *smartphones* that cost more than $1000, which people upgrade on a ~2 year or so cycle, and you're telling me that workhorse PCs that enable you to do your job and make money are overpriced. Wow, I hope drugs are involved in that thought process.
milkywayer - Monday, December 11, 2017 - link
I want some of whatever you've been smoking.
R3MF - Friday, December 15, 2017 - link
I read somewhere that the base 400 standard only requires dimming of the whole panel as a single zone (rather than array dimming or per-pixel lighting), but it didn't say anything about the 600 and 1000 standards.
If the 600 standard specified a minimum of 16 lighting zones, and the 1000 standard specified a minimum of 384, then this would be a valuable addition.
Can anyone confirm that the different standards differentiate on dimming levels/types?