Audit group: Crawling and indexing
Description: Serves ES5-compatible code to older user agents
Failure description: Does not serve ES5-compatible code to older user agents
Help text: Search engine crawlers may not be using the latest browser versions, so falling back to ES5 ensures the page can render and be indexed properly. Learn more.
Success conditions:
Additional context:
Googlebot uses a web rendering service (WRS) that is based on Chrome 41 (M41). Generally, WRS supports the same web platform features and capabilities as the Chrome version it uses; for a full list, refer to chromestatus.com, or use the compare function on caniuse.com.
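For example (a minimal sketch, not part of the audit spec; loadImage is a hypothetical helper): IntersectionObserver shipped well after Chrome 41, so a page that wants to stay renderable and indexable in WRS should guard it and provide a fallback:

```js
// Hypothetical page code: guard an API that Chrome 41-era WRS lacks.
var imgs = document.querySelectorAll('img[data-src]');

if ('IntersectionObserver' in window) {
  // Modern engines: lazy-load images as they scroll into view.
  var observer = new IntersectionObserver(function (entries) {
    entries.forEach(function (entry) {
      if (entry.isIntersecting) loadImage(entry.target); // loadImage: assumed helper
    });
  });
  for (var i = 0; i < imgs.length; i++) observer.observe(imgs[i]);
} else {
  // Older engines (including Chrome 41): load everything eagerly so content still renders.
  for (var j = 0; j < imgs.length; j++) loadImage(imgs[j]);
}
```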
We may be able to remove the "in production" part from the audit description if we decide that the audit should also run in development environments (localhost) in which the code may not be transpiled. I think that may be a good idea so that developers are not surprised by this audit when code hits production.
Yeaaaa...let's massage the title a bit :) "Avoid ES6" is a strong statement.
This is really about feature detection and making sure that your JS codebase runs across as many targets as possible. Maybe we could structure the title/description around that goal?
Yeah, +1 to @ebidel's suggestion. Shipping ES6 to the latest browsers (which is how Lighthouse loads your page) can be a huge performance win: it avoids wasted (and sometimes wildly incorrect) polyfills and allows simpler minification. Per Phil Walton's great article, avoiding transpilation dropped his bundle size from 175k to 80k.
Framing this as feature detection to serve an ES5 fallback for older, unsupported browsers seems much better 👍
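As a concrete illustration (a sketch only, with hypothetical bundle URLs; this is the runtime flavor of the module/nomodule check, not something decided in this thread): the loader itself is ES5, detects ES module support as a proxy for ES2015+ syntax support, and picks the matching bundle.

```js
// ES5 loader: feature-detect module support and serve the matching bundle.
// Bundle URLs are hypothetical.
var script = document.createElement('script');
if ('noModule' in script) {
  // The engine understands ES modules, so it also understands ES2015+ syntax.
  script.src = '/js/app.es2015.js';
} else {
  // Older engine: fall back to the transpiled ES5 bundle.
  script.src = '/js/app.es5.js';
}
document.head.appendChild(script);
```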
Proposal, something like:
Description: Serves ES5-compatible code to older user agents
Failure description: Does not serve ES5-compatible code to older user agents
Framing this as feature detection to serve an ES5 fallback for older, unsupported browsers
Just to make sure we're all on the same page here: this is an SEO audit that will show up in the SEO section of the report. We want it because we know that Googlebot doesn't currently support ES6. Do we want to make this audit more general and talk about older browsers?
Thanks for the suggestions. Updated the descriptions.
Regarding how to "emulate Chrome 41":
There are some weaknesses. UA spoofing only defeats UA sniffing, and hijacking JS APIs (e.g. CustomElementRegistry) can be defeated by new window objects.
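To make that last point concrete (a hypothetical sketch, not anything Lighthouse does today):

```js
// Hypothetical emulation: hide an API that Chrome 41 lacks so pages take their fallback path.
Object.defineProperty(window, 'customElements', {value: undefined});

// ...but page code can recover the real API from a fresh browsing context:
var iframe = document.createElement('iframe');
document.documentElement.appendChild(iframe);
var realCustomElements = iframe.contentWindow.customElements; // still defined
```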
Alternatively, perhaps we don't bother with any kind of emulation and simply detect whether ES6 was used and warn the user that they need to make sure they've taken the additional step to fall back when unsupported.
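For example, something like this (a rough sketch; acorn is a real parser, but the scriptContents shape and the surrounding gatherer are assumed):

```js
const acorn = require('acorn');

// scriptContents: assumed array of {url, content} collected during the normal pass.
function findNonEs5Scripts(scriptContents) {
  return scriptContents.filter(script => {
    try {
      acorn.parse(script.content, {ecmaVersion: 5});
      return false; // parses cleanly as ES5
    } catch (err) {
      return true; // ES2015+ syntax (or a genuine syntax error): worth a warning
    }
  });
}
```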
WDYT?
Sounds like the goal here is making sure you are not serving ES6-ish code to Googlebot.
To do this truly right, we need to reload the page with a Googlebot UA and look at the scripts. This requires another pass. :(
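(If we ever did do the extra pass, the mechanics would look roughly like this; Network.setUserAgentOverride is a real DevTools Protocol method, but the driver wiring shown here is assumed, not actual Lighthouse code.)

```js
const GOOGLEBOT_UA =
  'Mozilla/5.0 (compatible; Googlebot/2.1; +http://www.google.com/bot.html)';

// Sketch of a second navigation with a spoofed UA (driver API assumed).
async function gatherScriptsAsGooglebot(driver, url) {
  await driver.sendCommand('Network.setUserAgentOverride', {userAgent: GOOGLEBOT_UA});
  await driver.gotoURL(url); // the extra pass
  // ...then re-collect script bodies and diff them against the first pass.
}
```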
An alternative we discussed is to look at the scripts served to our emulated browser. And if we detect ES6-ish features that Chrome 41 didn't support, then we supply a warning saying, "We hope you're not serving this same JS to older UAs".
@rviscomi @kdzwinel WDYT?
A potential additional (performance-y) audit is: you are not serving transpiled/polyfilled ES5 code to modern Chrome. But that's not an SEO thing and we can pursue it separately.
> To do this truly right, we need to reload the page with a Googlebot UA and look at the scripts. This requires another pass. :(
A page would never know to serve different content to a Googlebot UA. Even if it did, isn't that cloaking? 😛
> An alternative we discussed is to look at the scripts served to our emulated browser. And if we detect ES6-ish features that Chrome 41 didn't support, then we supply a warning saying, "We hope you're not serving this same JS to older UAs".
What if the script itself uses feature detection, e.g. `if (window.bleedingEdgeNativeFeature) {...} else { fallback() }`? If our audit simply looks for bleedingEdgeNativeFeature, it'd fail the page. Do we have APIs sophisticated enough to know when features are actually used?
> A page would never know to serve different content to a Googlebot UA. Even if it did, isn't that cloaking?
I'm not sure I follow this logic. It's a best practice to detect that the user agent doesn't support the code you're about to use and serve a compatible version instead. Sure, you could serve a completely different site and it'd be cloaking, but the mechanism itself is common, no?
Sorry, I parsed "UA" to mean "UA string". If we're just talking about the browsing context, then yes, feature detection is totally normal.
Deprioritizing
Closing, since at least for Google it shouldn't be needed anymore: https://webmasters.googleblog.com/2019/05/the-new-evergreen-googlebot.html