I noticed the files in the public/
output folder aren't gzipped. They should be by default, shouldn't they?
Or is there a deployment trick I'm missing?
Most web servers gzip files on the fly for you. This could certainly be done post-build if you'd like.
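To illustrate what "on the fly" means here, a rough Node sketch of my own (not anyone's actual setup; it assumes a public/ folder of static files and skips Accept-Encoding negotiation and path sanitising):

const http = require('http');
const fs = require('fs');
const path = require('path');
const zlib = require('zlib');

const publicPath = path.join(__dirname, 'public');

http.createServer((req, res) => {
  // Map '/' to index.html; a real server also sets content types,
  // sanitises paths, and checks the client's Accept-Encoding header.
  const file = path.join(publicPath, req.url === '/' ? 'index.html' : req.url);
  if (!fs.existsSync(file)) {
    res.writeHead(404);
    res.end();
    return;
  }
  // The compression work happens here, on every single request.
  res.writeHead(200, { 'Content-Encoding': 'gzip' });
  fs.createReadStream(file).pipe(zlib.createGzip()).pipe(res);
}).listen(8000);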
I think it's silly to put this (dynamic) task on the server when the output should be statically gzipped to begin with.
I'm not familiar with the JS toolchain. How does one do a gzip post-build?
Gatsby has a hook for postBuild that allows you to do pretty much anything with the files after they're built. Here's a quick example that will gzip all html, css, and js files to their .gz counterparts postBuild.
// in gatsby-node.js
import fs from 'fs' // native
import zlib from 'zlib' // native
import path from 'path' // native
import glob from 'glob' // https://www.npmjs.com/package/glob

export function postBuild(pages, callback) {
  const publicPath = path.join(__dirname, 'public')
  const gzippable = glob.sync(`${publicPath}/**/?(*.html|*.js|*.css)`)

  gzippable.forEach(file => {
    const content = fs.readFileSync(file)
    const zipped = zlib.gzipSync(content)
    fs.writeFileSync(`${file}.gz`, zipped)
  })

  callback()
}
That said, I would still leave gzipping to the webserver / CDN.
Why would you leave gzipping to the server? To increase latency? For it to use something inferior to gzip --best? Seriously, I would love to know! :laughing:
Btw, you shouldn't add the .gz suffix. The file should just be compressed in place, and typically --content-encoding gzip needs to be added for dumb servers if they can't detect that it's gzipped.
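To make that concrete, a hedged sketch of my own (Express is my choice here, not from the thread): if everything under public/ was compressed in place at build time, the server has to be told to send the header, e.g.:

const express = require('express');
const path = require('path');

const app = express();

// Assumption: every file under public/ was gzipped in place at build
// time (compressed bytes, no .gz suffix), so announce that on each response.
app.use(express.static(path.join(__dirname, 'public'), {
  setHeaders: (res) => res.set('Content-Encoding', 'gzip'),
}));

app.listen(8000);

The catch with in-place compression is that a client which doesn't accept gzip receives unusable bytes, which is one reason the .gz-sibling approach above is more common.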
We only have nginx running in front of a bunch of Rails APIs in Docker containers at this point, and I'm just guessing that nginx is faster at gzipping than any ruby runtime.
For our static assets, we just check the "Compress Objects Automatically" checkbox on CloudFront and let it do its thing. I figure Amazon has got the fastest way possible figured out by now.
Surely the "fastest way" is to do it in advance as a build step? It's pointless to put this task on the server unless you're lazy.
I'm perfectly happy to be lazy left and right as long as someone else picks up the slack :-)
I've never seen another system like Gatsby that defaults to gzipping output. It seems better to just let users choose. Once we have a plugin system, you could create a plugin that gzips everything by default and share it with the community. In the meantime, @benstepp's code example is a great solution. Closing this.
Here is @benstepp's code ported to the new Gatsby hooks, in case someone needs it:
// in gatsby-node.js
const fs = require('fs');
const path = require('path');
const zlib = require('zlib');
const iltorb = require('iltorb');
const glob = require('glob');

exports.onPostBuild = () =>
  new Promise((resolve, reject) => {
    try {
      const publicPath = path.join(__dirname, 'public');
      const gzippable = glob.sync(`${publicPath}/**/?(*.html|*.js|*.css|*.svg)`);

      gzippable.forEach((file) => {
        const content = fs.readFileSync(file);

        // Write a .gz sibling for gzip...
        const zipped = zlib.gzipSync(content);
        fs.writeFileSync(`${file}.gz`, zipped);

        // ...and a .br sibling for brotli.
        const brotlied = iltorb.compressSync(content);
        fs.writeFileSync(`${file}.br`, brotlied);
      });

      resolve();
    } catch (e) {
      reject(new Error('onPostBuild: Could not compress the files'));
    }
  });
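Side note: on Node 10.16+ / 11.7+, brotli support is built into zlib, so the iltorb dependency can be dropped. A drop-in replacement for the two iltorb lines inside the forEach (a sketch reusing the content and file variables from the snippet above):

// Requires Node with built-in brotli (v10.16+ / v11.7+); no iltorb needed.
const brotlied = zlib.brotliCompressSync(content, {
  params: {
    // Max quality is fine for a one-off build step; speed doesn't matter here.
    [zlib.constants.BROTLI_PARAM_QUALITY]: zlib.constants.BROTLI_MAX_QUALITY,
  },
});
fs.writeFileSync(`${file}.br`, brotlied);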
Small fix for anyone who copied an earlier version of the snippet above: change
const fs = require('fs');
to
const fs = require('fs'); const path = require('path');
since path is used below but wasn't imported.
@benstepp @abumalick Just curious, why rely on this custom code when you could add plugins to the webpack config to handle it? Is there something about the way Gatsby builds that makes webpack unreliable for compressing files?
There is more than one webpack configuration in Gatsby, so it is difficult to imagine that a simple webpack plugin would do that job correctly. I would be interested in any feedback if you can make it work.
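For what it's worth, a hedged sketch of what the webpack route might look like on Gatsby v2+ (untested; wiring compression-webpack-plugin into the onCreateWebpackConfig API, with the stage check being my assumption about which build to target):

// in gatsby-node.js
const CompressionPlugin = require('compression-webpack-plugin');

exports.onCreateWebpackConfig = ({ stage, actions }) => {
  // Gatsby runs webpack several times (develop, build-javascript,
  // build-html, ...); only compress the production client bundle.
  if (stage === 'build-javascript') {
    actions.setWebpackConfig({
      plugins: [
        new CompressionPlugin({
          test: /\.(js|css|svg)$/,
        }),
      ],
    });
  }
};

One caveat that supports the concern above: webpack only compresses assets it emits, and Gatsby writes the HTML pages outside webpack, so the .html files would still be left uncompressed.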
Just as a note: we were having problems with the onPostBuild snippet above (https://github.com/gatsbyjs/gatsby/issues/387#issuecomment-436672441) on the latest versions of Gatsby, where, for some reason, 404.html was created as a folder, so glob matched the directory and fs.readFileSync() choked on it. We fixed it by passing { nodir: true } to glob.sync().
Just in case anyone else has this issue:
const options = { nodir: true };
const gzippable = glob.sync(`${publicPath}/**/?(*.html|*.js|*.css|*.svg)`, options);
Hello @KyleAMathews, is this still the preferable way to compress Gatsby's built files? Or is there a newer method? It takes around 3 minutes to compress my app files (approx. 3k pages).
ok, just realised we don't need it anymore as CloudFront is compressing it for us. 👍
It's better to rely on your CDN to do the compression. Lots of CDNs support brotli, which makes things even smaller than gzip, so it's probably not worth gzipping before uploading.