Since the introduction of Shadow DOM, we've been struggling with what to do with name-defining things, like `@font-face`, which define global names for other things to reference.
Right now the answer is a collective shrug. I think I have an answer, however:
* `@font-face` is valid inside of shadow trees, and is scoped to the TreeScope that its defining stylesheet is in. Nested tree scopes can refer to the names/scopes explicitly, but won't see them by default.
* A `font-family` value (or anything else that references the font name) is implicitly a tuple of (name, defining scope), where the defining scope is the TreeScope the reference's stylesheet is in. (In other words, it's always a reference to the local thing defining the name, not something further up the scope tree.)
* There's a `scoped(val, <integer> | global)` function, where the `<integer>` says how many TreeScopes upward you should walk before resolving the name.
* Given `font-family: foo;`, when it inherits into a shadow tree, if you call `getComputedStyle()` on a shadow element you'll get `scoped(foo, 1)` back, because the defining scope is one scope up in the tree.

This has some implications. Since the defining scope is implicitly captured by a reference, it doesn't change as the value inherits. Thus, in this situation:
```html
<style>
@font-face { font-family: foo; ... }
body { font-family: foo; }
x-component::part(my-p) { font-family: foo; }
</style>
<body>
  <p>ONE
  <my-component>
    <::shadow>
      <style>
      @font-face { font-family: foo; ... }
      p.foo { font-family: foo; }
      p.bar { font-family: scoped(foo, 1); }
      </style>
      <p>TWO
      <p class=foo>THREE
      <p class=bar>FOUR
      <p part=my-p>FIVE
    </>
  </>
</>
```
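To make the resolution model concrete, here is a small executable sketch of the (name, defining scope) tuple and the `scoped(foo, N)` upward walk. All of the names here (`TreeScope`, `resolve`, the font-face map) are illustrative inventions, not proposed API:

```javascript
// Sketch: model TreeScopes as a parent-linked chain of name tables.
class TreeScope {
  constructor(parent = null) {
    this.parent = parent;
    this.fontFaces = new Map(); // name -> @font-face definition
  }
  define(name, def) { this.fontFaces.set(name, def); }
}

// A reference is a (name, definingScope) tuple; `hops` says how many
// TreeScopes upward to walk before looking the name up (scoped(foo, N)).
function resolve(name, referenceScope, hops = 0) {
  let scope = referenceScope;
  for (let i = 0; i < hops && scope; i++) scope = scope.parent;
  return scope ? scope.fontFaces.get(name) : undefined;
}

const outer = new TreeScope();       // document scope
const shadow = new TreeScope(outer); // shadow tree nested inside it

outer.define("foo", "outer-foo.woff2");
shadow.define("foo", "inner-foo.woff2");

resolve("foo", shadow);    // -> "inner-foo.woff2" (local definition)
resolve("foo", shadow, 1); // -> "outer-foo.woff2" (scoped(foo, 1))
```

The key property the proposal relies on: because the defining scope is captured in the value, the same tuple resolves to the same definition no matter where it has inherited to.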
Scripting is a slightly thornier problem here. When setting styles, we can use the rules I've already laid out: you're always setting the style in some stylesheet (perhaps the implicit one attached to an element and accessed via `el.style`), so there's a consistent notion of an associated TreeScope. (This may not always be obvious, but it's clear: a script that pokes around inside the shadows of its components and sets styles needs to be aware of what scope the stylesheet is in and what scope the name-defining thing it's trying to reference is in.)
Ojan privately points out that the integer is very likely over-engineering, and virtually every case will just want the nearest-local or the global version of a name. You still need the ability for a value to refer to an arbitrary scope for inheritance to work properly, but you don't actually need to be able to specify that reference - in other words, we don't really need syntax for such a thing.
This simplifies it considerably. We can just define that the bare keyword always refers to the globally-defined name, and have `scoped(foo)` refer to the local name (name tbd, of course). Internally, the value will still be stored as a (name, scope) tuple, but in some cases, from some stylesheets, you won't be able to write down a value that actually refers to the specified name. The TypedOM will be completely correct, however - it can still have a `CSSScopedKeywordValue` with a `.scope` property that refers to a particular TreeScope, and you can construct that explicitly if you need to refer to a particular scope's name. You just won't be able to, in some cases, reproduce the effects of a style via an explicit property set in a stylesheet or the string-OM.
Alternate argument in favor of the integer: it means the value is locally interpretable at all times. If we make the bare keyword (like `font-family: foo;`) always refer to the global scope, and require using `scoped()` to refer to a local one, then we can just have a special inheritance rule, where when a `scoped()` value inherits past a shadow boundary, its integer automatically increments. This maintains the "`el.style.foo = getComputedStyle(el).foo` is a no-op" invariant, and means we don't necessarily have to track a JS object in the internal value, just an integer.

Common usage will still be limited to just `foo` and `scoped(foo)`, and we can even warn authors not to use `scoped(foo, N)` as it's fragile, but it would be supported for back-end reasons.
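A minimal sketch of that increment-on-inherit rule: the computed value carries only a name and an integer hop count, and crossing a shadow boundary bumps the count so the value keeps naming the same defining scope. The names here (`scoped`, `inheritAcrossShadowBoundary`) are made up for illustration:

```javascript
// A computed scoped value is just (name, hops) - no object reference
// to a TreeScope needs to be tracked internally.
function scoped(name, hops = 0) {
  return { name, hops };
}

// The special inheritance rule: when a scoped() value inherits past a
// shadow boundary, its integer automatically increments, so it still
// points at the same defining TreeScope from the new, deeper vantage.
function inheritAcrossShadowBoundary(value) {
  return scoped(value.name, value.hops + 1);
}

// `scoped(foo)` written in the document scope...
let v = scoped("foo", 0);
// ...becomes scoped(foo, 1) once inherited into a shadow tree,
v = inheritAcrossShadowBoundary(v);
// ...and scoped(foo, 2) one shadow level further down.
const w = inheritAcrossShadowBoundary(v);
```

Because the value is a plain (name, integer) pair, serializing it and setting it back in the same scope trivially round-trips, which is the no-op invariant described above.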
I would argue that the expected behaviour is for the bare reference (`font-family: foo`) to always refer to the local scope by default. My argument is that CSS behaviour should be the same in a scoped environment and a global environment; it would be weird if defining and using a font took a different syntax depending on the scope it is defined in. Under the proposal by @tabatkins, the following code would have a different meaning depending on whether it was written in the global scope or inside a ShadowRoot:

```css
@font-face { font-family: foo; ... }
p { font-family: foo; }
```

The expectation from a developer standpoint is that when using the normal syntax, styles within a ShadowRoot behave like they are in their own global scope, and do not interact with outside styles. There are some cases where alternative syntax is needed to resolve scoping issues (like the `::slotted()` syntax), but the default syntax should always have the expected behaviour of treating the local scope as if it were the global one. (I.e. `p { ... }` should apply to `<p>` elements in the ShadowRoot, not to elements slotted through `<slot>`.)
I am suggesting this behaviour:

* `font-family: foo` always refers to the local scope.
* `font-family: scoped(foo, <integer> | global)` works as proposed above.

The difference from @tabatkins' suggested behaviour is that the bare reference `font-family: foo` is a shorthand for `font-family: scoped(foo, 0)` (instead of defaulting to global).
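The difference between the two defaults can be sketched as two lookup functions over the same pair of scopes. This is an illustrative model only, not browser behavior; all names are invented:

```javascript
// Two nested scopes, each defining a font named "foo".
function makeScopes() {
  const root = { parent: null, fonts: new Map([["foo", "document-foo"]]) };
  const shadow = { parent: root, fonts: new Map([["foo", "shadow-foo"]]) };
  return { root, shadow };
}

// Bare keyword = scoped(foo, 0): look up in the reference's own scope.
function lookupLocal(name, scope) {
  return scope.fonts.get(name);
}

// Bare keyword = global: walk to the root TreeScope before looking up.
function lookupGlobal(name, scope) {
  while (scope.parent) scope = scope.parent;
  return scope.fonts.get(name);
}

const { shadow } = makeScopes();
lookupLocal("foo", shadow);  // -> "shadow-foo"   (local-by-default proposal)
lookupGlobal("foo", shadow); // -> "document-foo" (global-by-default proposal)
```

Same declaration text, two different resolutions - which is exactly the divergence the two proposals are arguing over.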
I chose the current behavior for bare keywords to match what I thought I remembered browser behavior being, or at least Chrome's. Per #715, tho, it looks like Safari does the opposite, and treats `@keyframes` as local and references to it as referring to the local definition?

If we have to swap the bare keyword to be the local version, that's fine.

(I do think, tho, that the "bare keyword means global" behavior is slightly better aesthetically, as it means that the common case from today doesn't have to invoke the magical "rewrite yourself to refer to the parent scope" behavior; you only get that if you explicitly use `scoped()`.)
I think we are actually seeing a different issue here. All browsers with Web Component support use local definitions by default (including Chrome), at least for `@keyframes` definitions (I will check `@font-face` behaviour and report back), but there is an inconsistency in how slotted elements are treated.

Chrome uses the local definition in the scope where the DOM node the animation targets is defined, as per this example:
```html
<my-component>
  <div id="one"></div>
  <::shadow>
    <style>
    @keyframes some-animation { ... }
    div {
      animation-name: some-animation;
    }
    ::slotted(div) {
      animation-name: some-animation;
    }
    </style>
    <div id="two"></div>
    <slot></slot>
  </>
</>
```
In this case, `div#one` is attached in the light DOM (outside the `my-component` shadow tree), so Chrome will look for a definition in that scope, fail to find one, and do nothing. `div#two` is attached inside the shadow tree of `my-component`, so Chrome will look there for `some-animation` and apply the animation correctly.

Safari will apply the animation correctly in both cases, because it uses the scope of the CSS reference (the `animation-name: some-animation` declaration).

I feel like the correct scope to use should be the scope where the CSS reference is made, and not the scope where the affected node is attached to the DOM.

A more complete reproduction with comments can be found here: https://codepen.io/ruphin/pen/zPQvXw

Both browsers agree on the scoping rules of `@keyframes` definitions except for the case of slotted elements styled from within the shadow root. Chrome renders a red and two green squares, and Safari renders a blue and two green squares.
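The Chrome/Safari divergence boils down to which scope seeds the `@keyframes` lookup for a slotted element. A small sketch of the two strategies, with invented names (this models the observed behavior, not either engine's actual code):

```javascript
// Walk a scope and its ancestors looking for a keyframes definition.
function findKeyframes(name, scope) {
  for (let s = scope; s; s = s.parent) {
    if (s.keyframes.has(name)) return s.keyframes.get(name);
  }
  return undefined;
}

const documentScope = { parent: null, keyframes: new Map() };
const shadowScope = {
  parent: documentScope,
  keyframes: new Map([["some-animation", "defined"]]),
};

// div#one is slotted: it is attached in the document scope, but matched
// by a ::slotted(div) rule whose stylesheet lives in the shadow scope.
const nodeScope = documentScope; // where the element is attached (Chrome's key)
const ruleScope = shadowScope;   // where the reference is made (Safari's key)

findKeyframes("some-animation", nodeScope); // -> undefined (Chrome: no animation)
findKeyframes("some-animation", ruleScope); // -> "defined" (Safari: animates)
```

Starting from the rule's scope means the author of the shadow stylesheet gets the definition they can actually see, which is the behavior argued for above.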
@ruphin: Are anonymous tags (`</>`) legal in HTML? They look like self-closing tags without a name.
It's a pseudo-code example like in the original post. A working example with valid syntax can be found in the codepen.
@ruphin Ah, cool, thanks for the compat research. I agree then that we should match Safari's behavior and let the bare keyword refer to the local definition. I'll update the OP.
Current Chrome behavior is somewhat broken, leaking `@keyframes` names from inside the shadow root, fwiw. The following test-case fails on Chrome, for example, showing red:
```html
<!doctype html>
<style>
  #host {
    width: 100px;
    height: 100px;
    background: green;
    animation: myanim 10s infinite;
  }
</style>
<div id="host"></div>
<script>
  host.attachShadow({ mode: "open" }).innerHTML = `
    <style>
      @keyframes myanim {
        from { background: red; }
        to { background: red; }
      }
    </style>
  `;
</script>
```
What WebKit does makes sense (it keeps track of the scope of the rule that ended up in the declaration). But that feels somewhat like a layering violation, having to propagate the cascade order down so much.

Hmm, the problem with what WebKit does is that it doesn't work if you explicitly inherit the name from a scope you don't have access to, because suddenly it's from a scope you don't have access to. Or worse, it's from a different scope.

I'm having a hard time deciding what to implement in Firefox here. :(
Ugh, apparently Blink's behavior is pretty intentional, judging by their document lookup.
Blink doesn't make animations in `::slotted` selectors work (`:host` works basically by chance), but for now I'm interested in not making Firefox's implementation prone to compat problems, so given it's simpler, I'll do what Blink does.
I consider the Blink implementation to be broken, and I would urge you to reconsider your position and follow the Safari implementation instead.
From a developer point of view, the Blink implementation breaks the Web Components contract of code encapsulation. With their implementation, it is impossible to style a slotted component without breaking the encapsulation of the component and injecting style into the global document.
With that I mean that in specific cases, as a Web Component author, with the Blink implementation, you must literally inject a
The reason it's surprising is: how is a component author, writing a reusable component meant to be included on a number of webpages written by other people, supposed to know that the `@font-face` name they chose collides with the `@font-face` that one of those other-website people chose?

And on the other side, how is someone writing a website supposed to know that the font name they're using happens to also be used by one of the components on their page, written by someone else?

The collision here is fully accidental on both people's parts, and can't be avoided except by naming your fonts with high-entropy random strings. That's clearly silly. ^_^

They didn't change the definition of font `foo`, tho. They added a brand new font `foo` within their shadow tree, and then inherited a completely unrelated font `foo` from their ancestor tree. The fact that the two have the same name is completely unintentional.