I tried to find a way to compress files and version them, but couldn't find one.
Basically, I want to do something like this:
```js
mix.compress(['path/to/file1.css', 'path/to/file2.js'], 'destination/path/');
```

Result:

```
'destination/path/file1.css.gz'
'destination/path/file2.js.gz'
```
Could someone please point me in the right direction? Thanks!
PS Thank you so much for your awesome work @JeffreyWay
@mrajabtech
Most modern web servers have the ability to serve gzipped content on the fly.
Apache has mod_gzip and mod_deflate, and nginx can do it as well.
We should not serve *.gz content unless the browser requests it, and web servers can take care of that automatically.
Is there any other good reason that Laravel Mix should generate *.gz files?
Thanks for getting back, @ankurk91.
It is something I'd like to have built into this awesome package. Also, let's say all users of my web application use modern browsers and they all accept gzipped content. In this case, I do not want to use up Apache's resources by enabling mod_gzip or mod_deflate.
Yeah, I think for now it's probably best to let your server do that for you.
Nginx has gzip_static, which automatically serves the .gz version of a file if available, saving CPU resources.
@ankurk91 The reason is simple: if you have 1000 requests to a *.js file with gzip enabled, it will be compressed by nginx 1000 times, producing exactly the same output each time. If we have a *.js.gz file and gzip_static enabled, nginx will just send that file to the user. That not only saves CPU (so you can rent less powerful servers and pay less), but also significantly reduces TTFB (time to first byte), because the user doesn't wait for the compression to complete.
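For reference, here is a minimal sketch of the nginx side; the document root, port, and extension list are assumptions, and `gzip_static` requires nginx to be built with the ngx_http_gzip_static_module:

```nginx
server {
    listen 80;
    root /var/www/public;  # assumed document root

    location ~* \.(js|css|svg)$ {
        # if app.js.gz exists next to app.js, send it as-is instead of
        # compressing app.js on every request
        gzip_static on;
        # fall back to on-the-fly compression for files without a .gz sibling
        gzip on;
    }
}
```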
I used this:
```
npm install --save compression-webpack-plugin
```
`webpack.mix.js`:

```js
const CompressionPlugin = require('compression-webpack-plugin');

// ... (other requires for webpack, LiveReloadPlugin, etc. omitted)
mix.webpackConfig({
    plugins: [
        new LiveReloadPlugin(),
        new webpack.ProvidePlugin({
            jQuery: 'jquery',
        }),
        new CompressionPlugin({
            asset: "[path].gz[query]", // renamed to `filename` in compression-webpack-plugin 2+
            algorithm: "gzip",
            test: /\.js$|\.css$|\.html$|\.svg$/,
            threshold: 10240, // only compress assets larger than 10 KiB
            minRatio: 0.8
        })
    ],
});
```
```js
mix.js('resources/assets/js/app.js', 'public/assets/js')
    .sass('resources/assets/sass/app.scss', 'public/assets/styles')
    .copyDirectory('resources/assets/img', 'public/assets/img')
    .copyDirectory('resources/assets/fonts', 'public/assets/fonts');
```
and it works well for .js & .css files, but it ignores, for example, the .svg files copied from _resources_ to the _public_ folder. Any idea how to gzip them?
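One possible workaround, since compression-webpack-plugin appears to only process assets emitted by webpack and `copyDirectory()` seems to bypass that pipeline, is to gzip the copied files in a small post-build script with Node's built-in zlib. A minimal sketch, with the script name and paths assumed:

```js
// gzip-assets.js (hypothetical): run after the build, e.g. `node gzip-assets.js`
const fs = require('fs');
const path = require('path');
const zlib = require('zlib');

// Recursively gzip every .svg file in a directory, writing a .gz sibling next to it.
function gzipDirectory(dir) {
    for (const entry of fs.readdirSync(dir)) {
        const file = path.join(dir, entry);
        if (fs.statSync(file).isDirectory()) {
            gzipDirectory(file);
        } else if (/\.svg$/.test(file)) {
            const gzipped = zlib.gzipSync(fs.readFileSync(file), { level: 9 });
            fs.writeFileSync(file + '.gz', gzipped);
        }
    }
}

gzipDirectory('public/assets/img'); // path taken from the mix config above
```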
You should only serve .gz files if the client sends an `Accept-Encoding: gzip` header.
A more advanced project setup will have a dedicated CDN to handle that.
> You should only serve .gz files if the client sends an `Accept-Encoding: gzip` header.

That's exactly how it works.

> A more advanced project setup will have a dedicated CDN to handle that.

If we have a small project, a CDN is excessive. E.g., we have dockerized versions of most of our projects without a CDN and they work perfectly. But if we can significantly optimize performance in a few lines, just by saying _we just minified our JS, now let's gzip it_, why not?
My JS skills are novice at best, but since I also posted the issue referred to above, I can try to work on a PR for this if it would be acceptable to merge.
@JeffreyWay?