Migrated from https://github.com/w3c/csswg-drafts/pull/3129
Right now, Values and Units says:
Equal to the used advance measure of the "0" (ZERO, U+0030) glyph in the font used to render it.
It's pretty unfortunate that ch units can cause fonts to download.
WebKit doesn't do this; it just uses the primary font, and if the primary font doesn't support "0" then it uses that font's .notdef glyph. No downloads necessary.
https://trac.webkit.org/browser/webkit/trunk/Source/WebCore/platform/graphics/Font.cpp#L123
https://trac.webkit.org/browser/webkit/trunk/Source/WebCore/css/CSSPrimitiveValue.cpp#L661
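In pseudo-code, the behavior described above amounts to something like the following TypeScript sketch. All names (`PrimaryFont`, `zeroAdvance`, `notdefAdvance`) are invented for illustration; this is not the actual Font.cpp logic.

```typescript
// Invented font model; a real engine reads the cmap/hmtx tables instead.
interface PrimaryFont {
  zeroAdvance?: number;  // advance of the "0" glyph, if the font has one
  notdefAdvance: number; // advance of glyph ID 0 (.notdef), always present
}

// Only the primary font is consulted, so resolving ch never forces
// another subface of a composite font to download.
function chAdvance(font: PrimaryFont): number {
  return font.zeroAdvance ?? font.notdefAdvance;
}
```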
In this comment, @emilio says that Firefox does something similar.
Yeah, it's not intentional that ch should cause the "0"-containing subface of a composite face to be downloaded. I agree we should clarify that you can use the "not-defined" glyph if "0" isn't present.
(Is ".notdef glyph" a widespread concept we can refer to?)
.notdef is well-understood, well-defined, and common. (It's just glyph ID 0 in every font, commonly rendered as the tofu)
I'd propose making the spec somewhat flexible here, as I can think of a few reasonable behaviors off the top of my head, and we probably shouldn't prescribe any until we can do more research:
We can make the spec say something like "if the primary font supports the '0' character, UAs must use the width of that character, otherwise, UAs should use an approximation of it using metrics from the primary font or a fallback font" and we can link to css-fonts for a definition of "supports." Given that almost all fonts support the '0' character, this is probably strong enough to be interoperable but flexible enough that we can find the best fallback.
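Roughly, as a TypeScript sketch (all names invented; `supports` standing in for the css-fonts definition, and average advance used here only as one conceivable approximation):

```typescript
// Invented metrics model for illustration only.
interface FontMetrics {
  supports(ch: string): boolean; // placeholder for the css-fonts "supports"
  advanceOf(ch: string): number;
  averageAdvance: number;        // one possible basis for an approximation
}

function chWidth(primary: FontMetrics, fallback?: FontMetrics): number {
  // "UAs must use the width of that character" when it is supported...
  if (primary.supports("0")) return primary.advanceOf("0");
  // ...otherwise "an approximation of it using metrics from the primary
  // font or a fallback font". Which approximation is deliberately open.
  if (fallback && fallback.supports("0")) return fallback.advanceOf("0");
  return primary.averageAdvance;
}
```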
Per the links in #3129, looks like WebKit is using (1) and Gecko is using (2). So yeah, if it's not clear that one is superior to the other, making the spec a bit flexible makes sense.
Blink does (1) too.
Is "
.notdefglyph" a widespread concept we can refer to?
.notdef is well-understood, well-defined, and common. (It's just glyph ID 0 in every font, commonly rendered as the tofu)
.notdef is defined in Recommendations for OpenType Fonts.
Today I discovered that a "ch" unit in Font Awesome Pro (a popular icon font) is zero width in Safari. So, I think the .notdef glyph is not a reliable thing to fall back on. Average width doesn't seem right either, as icons in an icon font tend to be much wider than a zero would be. Maybe something like 0.4em? Or use the width of an actual space character as a fallback first, if it is available?
Frankly, I don't see why it can't just use the next installed font from the font list (skipping past fonts that aren't installed, or that would require a download).
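To make the space-then-guess suggestion concrete, a hedged TypeScript sketch (`SimpleFont` and `advanceOf` are invented names, not any engine's API):

```typescript
// advanceOf returns undefined when the font has no glyph for the
// character, as in a heavily subsetted icon font.
interface SimpleFont {
  advanceOf(ch: string): number | undefined;
}

// Fallback chain floated above: "0", then an actual space character,
// then a flat 0.4em guess as the last resort.
function chForSubsettedFont(font: SimpleFont, fontSize: number): number {
  return (
    font.advanceOf("0") ??
    font.advanceOf(" ") ??
    0.4 * fontSize
  );
}
```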
We shouldn't be designing around the requirements of icon fonts. People shouldn't be using them in the first place.
People do use them. That's the world we live in, and need to adapt to.
Also, I don't think this is unique to icon fonts. I suspect that the one I'm connecting to (which I don't have total control over) has been subsetted to save bandwidth, removing characters that are not actually used in the content.
In those conditions, why not just...
use the next installed font from the font list (skipping past fonts that aren't installed, or that would require a download).
It seems a perfectly reasonable thing to do, as it will always eventually find an already loaded (possibly system) font that has a zero in it.
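A sketch of that approach (invented names; `isInstalled` here means "already loaded or local, no network fetch needed"), reusing the spec's existing 0.5em fallback for the unlikely case where nothing in the list has a zero:

```typescript
// Invented model of the computed font-family list.
interface ListedFont {
  family: string;
  isInstalled: boolean;          // available without triggering a download
  hasGlyph(ch: string): boolean;
  advanceOf(ch: string): number;
}

function chFromFontList(fonts: ListedFont[], fontSize: number): number {
  for (const font of fonts) {
    if (!font.isInstalled) continue;          // never trigger a download
    if (font.hasGlyph("0")) return font.advanceOf("0");
  }
  return 0.5 * fontSize; // the spec's existing fallback
}
```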
I agree there are some such fonts. I've seen them before too. I don't think ch needs to take care of such fonts.
I'd strongly say that calculating a ch unit (or ex, ic, etc.) _shouldn't_ cause a font to download, but I'm not sure that should mean giving up the use of the 0 as the reference glyph.
Consider the case of a custom font with just a few special characters (e.g., fancy ampersand) as the first font in the list, with a regular web font or system font as the next font in the stack. Using an artificially calculated value from the first font seems a poor choice if there is a font available (already downloaded or on the system) with a 0.
In most CSS environments, there will be at least a default font with a 0 in it. And if there really isn't, the spec already defines a fallback (0.5em).
Finding a system fallback font is not a cheap operation; it requires an RPC call in Blink due to sandbox, and the style recalc pauses during the RPC. I'm not very happy to bake it into the spec when no browsers do it today.
@kojiishi Is it a problem only for system fallback, or also for fallback through the list of specified fonts? E.g. if I have font-family: web font, local font, sans-serif, is it a problem to check "local font" for '0'?
Mainly, I don't understand why there is any issue. The worries seem to be extra font downloads and the cost of finding fallback fonts.
ch is defined to use the actual font that is used for rendering. The two things noted above would only occur if:
- the font actually used has no glyph for "0", AND
- the font actually used has no .notdef (in this case, it would be a faulty font, or the font subsetter was faulty)
Also, falling back to another font for the width of "0" or .notdef misses the point of using the actual font.
So my preference would be an ordered list of things to try:
1) the width of the "0" glyph; if it doesn't have one,
2) the width of the .notdef glyph; if it doesn't have one (and thus, the font is broken),
3) use 0.5em
No additional webfont is ever downloaded, no additional local font is loaded either.
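As a sketch (invented names again), that ordered list is just:

```typescript
// undefined means "this glyph doesn't exist in the font".
interface UsedFont {
  glyphAdvance(ch: string): number | undefined;
  notdefAdvance?: number; // absent only in a broken font
}

function chPreferred(font: UsedFont, fontSize: number): number {
  return (
    font.glyphAdvance("0") ?? // 1) the "0" glyph
    font.notdefAdvance ??     // 2) the .notdef glyph
    0.5 * fontSize            // 3) the 0.5em fallback
  );
}
```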
@svgeesus That sounds like a good start, although your wording doesn't acknowledge that there may be 2 or more active fonts (meaning loaded & being used in the document) applying to the element. Would you search for first the 0 and then the .notdef glyph in this filtered font stack of active fonts, or only use the first active font?
PS, has anyone looked into the other font-relative units & confirmed whether browsers match the spec as far as the "first available font" being the first one with a space character available?
IIRC @florian had committed some tests to WPT checking for first available font with space.
I don't understand why a .notdef glyph is considered to be an acceptable fallback. Is it expected to be the same width as a zero? Isn't it commonly a square shape instead? Mightn't it also be zero width or just a single stroke wide?
If checking the font-family list is expensive, then how about just getting the system's default serif or sans-serif proportional font?
I don't understand why a .notdef glyph is considered to be an acceptable fallback
Because all fonts are required to have one.
Isn't it commonly a square shape instead?
No, it is typically a rectangle (and, I think, typically uses the default advance width)
It is recommended that the shape of the .notdef glyph be either an empty rectangle, a rectangle with a question mark inside of it, or a rectangle with an "X". Creative shapes, like swirls or other symbols, may not be recognized by users as indicating that a glyph is missing from the font and is not being displayed at that location.
your wording doesn't acknowledge that there may be 2 or more active fonts (meaning loaded & being used in the document) applying for the element
Well spotted and yes, it doesn't. Specs referring to "the font used to render" need to be tightened up.