The definition of sRGB, which is also given in css-color-4, says that the white luminance level is 80 cd/m².
In practice the level is typically higher, often significantly higher, such as the 160 cd/m² used by Adobe RGB (1998).
This impacts the black level, which is raised by viewing flare, when black point compensation is in use.
It also affects compositing SDR content (such as most web pages) onto HDR content (video, images), which is often done for information overlays and mixed HDR/SDR content in general. The result looks very dull if 80 cd/m² is assumed.
Is there a more modern, reference-able recommendation to use a higher white level for sRGB?
Related: SDR and HDR compositing
The sRGB spec was designed for someone in a very dim office, using decades-old hardware. Modern devices get used in bright rooms, dim rooms, completely dark rooms, outside under moonlight, outside in direct sun, ...
Screen brightness varies dramatically from one device to another, sometimes set to be about the same as a sheet of paper under room lighting; other times set to be much brighter than anything else nearby; other times quite dim relative to outdoor light.
Ideally authors would figure out exactly what the viewing conditions were, and tailor their content to each class of viewers.
Realistically though, it’s hard to pick any kind of sane default, given the wide range of common conditions.
You should try asking some real color scientists for advice about this one. And maybe do some sociological research about device users.
You should try asking some real color scientists for advice about this one.
You mean like @svgeesus?
The sRGB spec was designed for someone in a very dim office, using decades-old hardware.
Well, no. sRGB started from the HDTV standard, ITU-R BT.709, which is designed for a very dim (dark) viewing condition; it kept the same primaries but used a slightly different transfer function and increased viewing flare to suit a typical office environment.
I know, I discussed the expected viewing environment with the authors of sRGB when they suggested it at the W3C Print workshop in 1996.
We are fortunate to have obtained in April 1990 unanimous worldwide agreement on a calibrated nonlinear RGB space for HDTV production and program exchange: Rec. ITU-R BT.709. This recommendation specifies the encoding of real world scene tristimulus values into a standard monitor RGB color space assuming a dark viewing condition. HP and Microsoft suggest using these parameters as the basis for the sRGB color space but with a dim viewing condition which is closer to most typical viewing environments for computer displayed imagery.
https://www.w3.org/Graphics/Color/sRGB.html
However, you miss the point of this issue, which is primarily compositing SDR content (like sRGB web content) onto HDR video as an overlay. Assuming 80 cd/m² gives very bad results, as engineers from, for example, Netflix or the BBC have frequently pointed out.
Okay fair enough. Is there any HDR video getting used in practice on the web?
Do you have a concrete example (say a link) of sRGB content getting composited onto HDR video, and what it looks like?
Does black point compensation get used ever on the web?
Is CSS compositing going to get more complicated/capable than expressed in https://www.w3.org/TR/compositing-1/? Is there a link to that somewhere?
Anyhow, you probably want to assume the same brightness and context for both the HDR and SDR content.
If your HDR content has some extreme specular highlights, you could figure out what brightness is used for a diffuse reflector (e.g. a piece of white paper in the HDR video) and use that as the brightness for your SDR content.
engineers from, for example, Netflix or the BBC have frequently pointed out
Do you have a link?
Okay fair enough. Is there any HDR video getting used in practice on the web?
In-TV browsers and apps are using it, and getting that content onto the open web is an area of active current development. HDR video players which use TTML captions are also doing SDR onto HDR compositing.
Does black point compensation get used ever on the web?
It doesn't (except in WCAG contrast calculations, which assume a fixed 5% viewing flare), and probably should, especially once color-managed CMYK and other ink profiles get used in Web-to-PDF content.
Anyhow, you probably want to assume the same brightness and context for both the HDR and SDR content.
That would be a) physically impossible, since the screen can't display that luminance level across the whole screen, and b) highly undesirable, because it would burn out your eyes.
If your HDR content has some extreme specular highlights
that is kind of the point of HDR
you could figure out what brightness is used for a diffuse reflector (e.g. a piece of white paper in the HDR video) and use that as the brightness for your SDR content
what you are describing is called the paper white or, more generally, the media white. And knowing that level is precisely why I opened this issue.
Do you have a link?
I primarily meant that they had pointed this out in in-person discussions. But see for example https://downloads.bbc.co.uk/rd/pubs/papers/HDR/BBC_HDRTV_FAQ.pdf and https://www.w3.org/2017/11/07-colorweb-minutes.html#meanings and https://www.w3.org/2019/09/17-colorweb-minutes.html
Also (in progress) https://w3c.github.io/ColorWeb-CG/#goals
physically impossible, the screen can't display that luminance level on the whole screen
I’m not really clear which display you are talking about. This seems like an entirely display-/context-dependent question.
My main familiarity is with using image processing to display high-dynamic-range scenes on a standard display, and with editing photographs so that a display of above-average brightness (e.g. mobile displays of the past decade, but still using an otherwise standard output pipeline) shows the scene's "media white" dimmer than usual. That gives me more headroom for higher brightness/colorfulness in particular areas of the image, without necessarily having any extreme specular highlights. (This is for intended display in well-lit environments.)
I guess non-TV displays with explicit "HDR" support are now starting to appear? If so, they probably list an intended max brightness for "media white". I would expect those specs to vary widely and to be changing quickly from year to year.
My impression is that ideally the media white in a well lit setting (like a well lit office or outside in the shade) should be set to roughly comparable to a white paper in the same lighting. The film people probably have some guidance for what to do in a very dark theater type setting.
You can probably get a decent default guess with something in the 200–500 nit range. But I’d expect for practical use you’d want to always composite SDR with HDR with the expectation that the SDR content uses the media white for the specific display settings at viewing time.
Can that just be declared in the specification?
I’m not really clear which display you are talking about. This seems like an entirely display-/context-dependent question.
I suggest you read up on standards for HDR screens before commenting further. The difference between peak full-screen luminance and peak small-area specular highlight luminance is fairly crucial to understanding how HDR works.
Here’s what Poynton’s thesis says:
The second development is high dynamic range, HDR [Daly 2013]. Conventional HD is approved at a contrast ratio of about 1000:1; diffuse white is portrayed at about 100 nt; and the blackest black is about 0.1 nt. Consumers prefer brighter pictures than those displayed at program creation: Today’s consumer experiences diffuse white at between 300 and 500 nt; black level is typically between 0.3 and 2 nt. For this contrast range, at consumer quality level, eight-bit components coded using a 2.4-power function, as defined in BT.1886, are sufficient. Ten-bit components are used in the studio, and ten bit components would deliver somewhat better performance to consumers than today’s eight bit components.
[...] we expect HDR displays to have gamut approximating that of the DCI P3 standard. Luminance of the portrayal of diffuse white need not be higher than about 500 nt, but we seek to portray specular highlights and directly light sources using luminance levels perhaps ten times higher than diffuse white, a capability unavailable in today’s systems
This is now a few years out of date (such displays are just barely starting to now be available), but seems like a reasonable baseline to me.
From what I can tell searching around, both the display hardware and the expected display processing pipeline are changing significantly from year to year, and there are several competing specifications.
I don’t think you’ll be able to get a definitive source until the industry settles down a bit.
But irrespective of how hardware evolves, the appropriate brightness for diffuse white is going to depend substantially on viewing context. What is appropriate for looking at a TV in a dark room is not going to be appropriate for a phone display outside.
Most industry white papers I can find are pretty useless on this question, and it seems like vendors have been more concerned with peak luminance in highlights or short flashes than with specifying a target for diffuse white.
This one mentions:
Unfortunately, the Ultra HD Blu-ray standard does not specify a diffuse white luminance level, which some believe would help create more consistent image quality.
For viewing SDR content in the middle of an HDR program on a television, Report ITU-R BT.2390-7 recommends that SDR content first be converted to linear RGB, then scaled so that its peak brightness is comparable to HDR diffuse white, and then have the HLG or PQ inverse EOTF function applied.
A scaling factor of 2.0 is consistent with the HDR level guidance provided in Report ITU-R BT.2408, as that will map the 100 cd/m2 peak white level of SDR to the 200 cd/m2 level suggested for HDR or 58% PQ. Also noteworthy is that MovieLabs has recommended a scaling factor of 2.0 when converting for consumer displays, as MovieLabs has found this to provide a good match to the way such displays show SDR content in their “home cinema” viewing modes [19].
So by this standard 200 nits would seem to be the recommendation for viewing on a TV in dim lighting.
For displaying SDR content, it is common practice for the user to adjust screen brightness as they see fit and in response to (widely varying) viewing conditions. Thus the official 80 cd/m² has no impact on SDR usage.
It matters when SDR content is composited with HDR content that uses an absolute luminance scale (PQ); as far as I can see it does not matter with HDR content which uses a relative scale (HLG). And the important thing is to avoid some obvious traps when doing so.
Looking at the Reference Level Guidelines for PQ (BT.2100), from Dolby Laboratories, Aug. 9, 2016:
reference 18% grey card, indoor scenes, 17 cd/m², PQ value of 34%
diffuse white, indoor scenes, 140 cd/m², PQ value of 54%
That seems enough of a recommendation to put in a future CSS Color specification which includes HDR (Rec. BT.2100 PQ, Jzazbz which also uses PQ, etc.); and to close the issue for CSS Color 4.
Hi Chris @svgeesus and Hi Jacob @jrus
I realize this is closed, but just wanted to mention that part of SAPC is a standardized observer environment, intended to supplant the 80 cd/m² white with something relevant. It's a work in progress, but essentially the idea is to set peak white at five times ambient, i.e. if ambient surround is 32 cd/m², then set white at 160 cd/m². This is obviously in keeping with the guideline that "ambient surround should be 20% of peak white".
In practice, per various surveys, people set their displays and devices somewhere between 140 cd/m² and 320+ cd/m².... not even considering high end phones that display over 1200 cd/m². The point being, it's not about mapping to an absolute level as much as mapping to a level appropriate for the display environment.
And also, the IEC standard for sRGB is largely irrelevant in regards to white luminance, as users and automatic brightness adjustment fully dismiss that aspect of the standard.
Setting diffuse white = 5 times ambient surround is going to be roughly comparable (for a typical room or outdoor setting) to setting the diffuse white to the same brightness as a piece of white paper under ambient illumination, which was my recommendation upthread.
Is there a clear spec / description somewhere of exactly what level devices set with automatic brightness adjustment turned on?
Is there a clear spec / description somewhere of exactly what level devices set with automatic brightness adjustment turned on?
Hi Jacob @jrus,
The standard of 20% ambient relative to white, aka 5x ambient, appears in a lot of places as a viewing condition: ITU, SMPTE, ICC, IEC...
If you don't have a copy you might like the ICDM display standard; it's a free download and it's over 500 pages.
https://www.icdm-sid.org/downloads/index.html
It covers everything, but I didn't see auto adjustment... I'll have to dig, but I know there have been some research papers out of Samsung and others. I'm not aware of a specific standard, though, and considering that devices with automatic brightness also have a luminance control that is very easy for the user to adjust, plus screen technologies with massively different peak white capabilities, I'm not too sure there is much potential for a standard beyond what each manufacturer is doing to outdo the others...
Then the question is, is there a useful API... or a not useful one...
https://developer.apple.com/documentation/uikit/uiscreen/1617821-wantssoftwaredimming