Lighthouse: Third-party Google Assets don't follow their own audit rules, causing report errors

Created on 28 Sep 2018 · 8 comments · Source: GoogleChrome/lighthouse

When I run lighthouse reports, my only errors and warnings come from assets delivered by the creator of lighthouse: Google.

Specifically, this warning/error shows up.

Uses inefficient cache policy on static assets

All the items in this report come from

  • securepubads.g.doubleclick.net (Google property)
  • googletagmanager.com (Google property)
  • pagead2.googlesyndication.com (Google property)
  • www.google-analytics.com (Google property)
  • tpc.googlesyndication.com (Google property)

Why does Google serve assets that fail its own recommendations as measured by Lighthouse? If the recommendation is that static assets be served with proper caching headers, why doesn't Google follow through? Some of these assets have a TTL of 0 or 15 minutes!

If Google isn't going to follow its own recommendations, then it seems Lighthouse should exclude Google's domains from this audit, since they create a bunch of false positives that I can do nothing about.
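For assets we do control, the fix the audit asks for is a long max-age (plus immutable for content-hashed files). A minimal sketch with Express, assuming the files under /static have fingerprinted names so they can safely be cached for a year:

```js
// Minimal Express sketch: long-lived caching for fingerprinted static assets.
// The /static path and the one-year lifetime are assumptions for illustration.
const express = require('express');
const app = express();

app.use('/static', express.static('public', {
  maxAge: '1y',     // Cache-Control: max-age=31536000
  immutable: true,  // adds "immutable" so the browser skips revalidation
}));

app.listen(3000);
```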

P1.5 bug

All 8 comments

Thanks for filing @apotek! Our thoughts on how to approach this 3rd party problem are mostly outlined in #4516. Basically: we want you to be able to filter these out too :)

Yes, allowing us to filter them out would be helpful. I'm wondering, though, if there is a way to open an issue with Google regarding how they cache their JS. I've never understood why some of their libraries are set to a TTL of 0 or 15 minutes; I doubt they are releasing code updates every fifteen minutes :). Since Lighthouse is a Google project, it seems the team could nudge the rest of Google to observe its own rules and recommendations.

Thank you for your response, @patrickhulce. I suppose filtering will not change the score, which will still be affected by Google's caching errors, but at least we could reduce the noise during our weekly testing.
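In the meantime, the report JSON can be post-processed by hand. A rough sketch, assuming the flagged requests are exposed under audits['uses-long-cache-ttl'].details.items with a url field (field names may vary across Lighthouse versions), and with example.com standing in for your own origins:

```js
// Rough post-processing sketch: keep only first-party cache-policy offenders.
// The report path and the first-party hostnames are placeholders.
const report = require('./lighthouse-report.json');

const firstPartyHosts = new Set(['example.com', 'www.example.com']);

const items = report.audits['uses-long-cache-ttl'].details.items || [];
const actionable = items.filter(
  (item) => firstPartyHosts.has(new URL(item.url).hostname)
);

console.log(`Cache-policy findings we can actually fix: ${actionable.length}`);
actionable.forEach((item) => console.log(item.url));
```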

Hi @apotek!

I can't speak on behalf of most of the other products you listed, but I have used Google Tag Manager (GTM) quite a bit. GTM enables webmasters to inject their own scripts into the page based on a number of different triggers. GTM users value being able to update these scripts independently of the release cycles of their main production code. Because of this use case, assets from GTM can't be given a long TTL: if they had one, webmasters would have to wait longer for their changes to take effect. Although the warning message LH provides labels these as static assets, they're really quite dynamic. I'd imagine the same is true of the other products you listed.

Anyhow, #6351 was opened a couple days ago, so you'll be able to filter out 3P assets soon.
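For what it's worth, the headers these scripts come back with are easy to check directly. A quick sketch (Node 18+ for the global fetch; the GTM container ID is a placeholder):

```js
// Print the Cache-Control header each third-party script is served with.
// The GTM container ID below is a placeholder, not a real container.
const urls = [
  'https://www.google-analytics.com/analytics.js',
  'https://www.googletagmanager.com/gtm.js?id=GTM-XXXXXX',
];

(async () => {
  for (const url of urls) {
    const res = await fetch(url);
    console.log(res.status, res.headers.get('cache-control'), '-', url);
  }
})();
```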

@Hoten great insight here! I'm curious if you might be able to shed light on why an asset that is so dynamic in nature shouldn't have a no-cache/must-revalidate cache policy?

Such a resource would still benefit from 304 responses, and the need for fresh assets would be clearly communicated. @paulirish and I had a hard time coming up with a rationale for doing anything in between immutable and must-revalidate 😄

I could only guess, but my first assumption is that GTM tries to be as lightweight as possible, so they really don't want no-cache or must-revalidate because of the extra time spent fetching/revalidating.

This SO post corroborates the discussion so far.

@patrickhulce wdyt about suppressing this warning if the Cache-Control is private? Private's only use case AFAIK is to serve user-specific content, which implies the asset is not static.

@Hoten good idea!

While it's certainly possible for user-specific content to be long-lived, private is a good enough indicator for us not to complain. Any usage of no-cache, must-revalidate, or private sounds like a pretty reasonable signal that the resource isn't meant to be cached forever.

Let's add private to our list.
And land the 3P filter.
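To make that concrete, an illustrative sketch (not Lighthouse's actual source) of the directive check being proposed: any Cache-Control that opts into no-cache, must-revalidate, or private is treated as a deliberate choice and left out of the audit.

```js
// Illustrative only: skip the long-cache audit for responses that explicitly
// opt into revalidation or user-specific caching.
function shouldSkipLongCacheAudit(cacheControlHeader = '') {
  const directives = cacheControlHeader
    .toLowerCase()
    .split(',')
    .map((d) => d.trim());
  return ['no-cache', 'must-revalidate', 'private']
    .some((directive) => directives.includes(directive));
}

console.log(shouldSkipLongCacheAudit('private, max-age=900')); // true
console.log(shouldSkipLongCacheAudit('public, max-age=60'));   // false
```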
