I don't want to use a library which gzips on the fly, because of the overhead.
The website has some dynamic components which are implemented in node.js.
I have some static js and css files as well as their gzipped counterparts. I want to serve the gzipped version only to browsers which support it.
I considered using the static middleware in express to serve the static files, along with some URL rewriting middleware to conditionally serve the gzipped files. However, I cannot find any conditional rewrite module.
I cannot believe that no one has done this, or that it needs so many workarounds. What am I missing?
On a different note, is serving static files via node.js too expensive? On the other hand, using Apache for static files and running node.js behind it seems bad as well. What is the least stupid configuration for AWS EC2 hosting?
This code works with Express 4. Make sure you set the right Content-Type header (see a mime-types reference) for the compressed file, or it will not work in the browser.
If you want the best performance for your clients, just use a CDN. It will take care of gzipping and a lot of other stuff for you. If you need help, you can use the express-cdn module.
If you don't like CDNs for some reason, your best bet is nginx. I see it tagged in your question, but you didn't mention anything about it. nginx is way faster than node.js at serving static files. For nginx, check its gzip_static module.
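A minimal sketch of the nginx side, assuming nginx was built with ngx_http_gzip_static_module (the `/static/` location and root path are illustrative):

```nginx
location /static/ {
    root /var/www;      # illustrative path; foo.js.gz must sit next to foo.js
    gzip_static on;     # serve foo.js.gz when the client sends Accept-Encoding: gzip
}
```

With `gzip_static on`, nginx handles the Accept-Encoding check and the Content-Encoding header itself; no rewriting is needed.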
If you still want to use node.js, then connect-gzip-static is your best bet. It works much like nginx's gzip_static module: if there is a .gz counterpart of the requested file, it serves that; otherwise it falls back to the normal static middleware. Don't forget to compile the files beforehand. If you are using gulp, you can use gulp-gzip; if not, just use the gzip command.
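For the "just use the gzip command" route, one way to precompress everything while keeping the originals (the `public` directory is an assumption; `-k` requires gzip >= 1.6):

```shell
# Write a .gz counterpart next to every .js and .css file under public/
find public -type f \( -name '*.js' -o -name '*.css' \) \
  -exec gzip -9 -k {} \;   # -9 = max compression, -k = keep the original
```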
There's also gzippo; it gzips on the fly, but it caches the result in memory, so each file is only gzipped once.