With something like file-loader it's possible to generate hashed file names to help with cache invalidation when assets change, e.g. forcing users to download new assets (CSS, images, etc.).
In light of #638, what are some approaches to solving this problem in Next.js?
I don't fully understand your question. Can you elaborate?
So I need a strategy for managing invalidation of cached assets, and I want to avoid doing this manually by, for example, adding a query param onto the end of any referenced image in my components:
/static/image.jpg?v=1
Instead, I have previously been using webpack's file-loader so I can load the image into my component and have file-loader generate a new hashed filename every time it's changed, via the client-side JS bundle:
/static/0dcbbaa701328a3c262cfd45869e351f.jpg
Perhaps I need to implement something like this with a custom server? https://www.npmjs.com/package/webpack-assets-manifest
I also regard asset hashing as an essential step of a modern build process.
Instead of just taking advantage of conditional requests (304) with ETag and Last-Modified, Cache-Control: max-age makes the browser respond from cache directly, which is a huge perf boost. However, we need to hash our static assets so we can invalidate the cache when we update our assets.
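Concretely, the two caching policies can coexist: content-hashed assets are safe to cache "forever", while HTML must revalidate so users pick up new builds (which reference new hashed URLs). A hypothetical helper for picking the header; the path convention here is an assumption for illustration, not Next.js behaviour:

```javascript
// Choose a Cache-Control header by asset type (illustrative convention):
// files whose name is a 32-char hex content hash can be cached immutably,
// while everything else (e.g. HTML) must revalidate on every request.
function cacheControlFor(pathname) {
  // e.g. /static/0dcbbaa701328a3c262cfd45869e351f.jpg
  const hashed = /\/[0-9a-f]{32}\.[a-z0-9]+$/i.test(pathname);
  return hashed
    ? "public, max-age=31536000, immutable"
    : "max-age=0, no-cache";
}
```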
Take a look at your rivals at https://nuxtjs.org/guide/assets, which have had this kind of functionality built in.
IMO, whether or not to use webpack is just an implementation detail. Developers can use gulp-rev to do it themselves, but it's way better to have this feature built in.
Got to say Next.js has handled JS/CSS hashing well, and it solves 99% of cases, while assets such as images/fonts can still cause some UI problems.
Might be related: #672
See https://github.com/zeit/next.js/pull/745. Feel free to re-open if this doesn't solve your issue.
@Huxpro @davidrossjones I too consider hashing, and never serving the wrong version of a file to a user of the website, to be of paramount importance. I do the following with Next.js:
In my `.babelrc`:

```json
[
  "babel-plugin-file-loader",
  {
    "extensions": ["otf", "png", "jpg", "svg"],
    "publicPath": "/static",
    "outputPath": "/static",
    "name": "[path][name].[ext]/[hash].[ext]",
    "context": ""
  }
],
```
Now all of my static assets go into the static folder and have a hash. It's not exactly pretty, but it works.
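For reference, with that plugin in place a component just imports the asset and receives the hashed public path (the file names below are illustrative, not from a real project):

```jsx
// Hypothetical component — babel-plugin-file-loader rewrites the import
// into the emitted public path, per the "name" template above,
// e.g. "/static/assets/logo.png/<hash>.png".
import logo from '../assets/logo.png'

export default () => <img src={logo} alt="logo" />
```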
Then I upload the static folder to S3 and set the Cache-Control headers to immutable. There are no conflicts, obviously, because of the hash, and users of previous versions of the site can continue to get the proper assets. Then I have a CloudFront distro that compresses the files and terminates SSL connections, and a Route 53 DNS record in front of that.
Similarly, I upload the HTML files (from out/builds) to S3 with cache-control max-age=0,no-cache. To deploy a version of the site, I run

```sh
aws s3 sync \
  s3://$S3_BUCKET/$S3_BUCKET_FOLDER/builds/$GITHASH \
  s3://$S3_BUCKET/$S3_BUCKET_FOLDER/current \
  --delete \
  --cache-control max-age=0,no-cache \
  --acl public-read
```

and use a separate behavior in CloudFront to serve /current from the root.
That's super abbreviated, but feel free to ping me for more details, and I'll put up a boilerplate or something. :)