PHP: loading gzipped javascript files

Is it a good idea to combine a set of 20-30 .js files into one large file, compress that file with gzip, save it as something like somebigjsfile.js.gz, and then load it like this: <script type="text/javascript" src="somebigjsfile.js.gz"></script>?

The file would be regenerated whenever at least one of the source .js files has been modified (checked with PHP's filemtime).
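A minimal sketch of that regeneration check, assuming the sources live in a `js/` directory and the bundle name from the question (both are assumptions, not from the original post):

```php
<?php
// Rebuild the gzipped bundle only when a source file is newer than it.
$sources = glob('js/*.js');          // the 20-30 individual .js files (assumed path)
$bundle  = 'somebigjsfile.js.gz';

$stale = !file_exists($bundle);
if (!$stale) {
    $bundleTime = filemtime($bundle);
    foreach ($sources as $src) {
        if (filemtime($src) > $bundleTime) {
            $stale = true;           // at least one source changed since last build
            break;
        }
    }
}

if ($stale) {
    $combined = '';
    foreach ($sources as $src) {
        $combined .= file_get_contents($src) . ";\n";
    }
    file_put_contents($bundle, gzencode($combined, 9));
}
```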

Also, if relevant, this is for a public application.

+4
3 answers

I assume you are trying to save server overhead by not gzipping the JavaScript bundle on every request? If that is the intent, this is not the right way to achieve it. You need to indicate in the response headers that the file is being transferred with gzip compression, like so:

 HTTP/1.1 200 OK
 Date: Thu, 04 Dec 2003 16:15:12 GMT
 Server: Apache/2.0
 Vary: Accept-Encoding
 Content-Encoding: gzip
 Cache-Control: max-age=300
 Expires: Thu, 04 Dec 2003 16:20:12 GMT
 X-Guru: basic-knowledge=0, general-knowledge=0.2, complete-omnipotence=0.99
 Content-Length: 1533
 Content-Type: text/html; charset=ISO-8859-1

Note the Content-Encoding: gzip header.
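In PHP, serving a pre-compressed file with those headers could look roughly like this sketch (the file name and the fallback path are my assumptions, not from the answer):

```php
<?php
// Serve a pre-gzipped bundle with correct headers, falling back for
// clients that do not advertise gzip support.
$gzFile = 'somebigjsfile.js.gz';   // hypothetical name from the question

header('Content-Type: application/javascript');
header('Vary: Accept-Encoding');

if (strpos($_SERVER['HTTP_ACCEPT_ENCODING'] ?? '', 'gzip') !== false) {
    header('Content-Encoding: gzip');
    header('Content-Length: ' . filesize($gzFile));
    readfile($gzFile);             // send the compressed bytes as-is
} else {
    // The zlib stream wrapper transparently decompresses the .gz file.
    readfile('compress.zlib://' . $gzFile);
}
```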

In any case, concatenating and compressing your JavaScript is a good idea if you do it right. I would also recommend running some form of JS minification before compression, as this will improve your post-compression size.

+5

I don't think it's such a good idea if you expect returning visitors, and I assume you do if you are developing a web application. Every time one of your .js files changes, you force your users to download the entire huge file again.

When users first enter your application, they are often quite forgiving if the load time is a little longer. If you show them a good introduction, they will read and/or watch something while scripts and other assets load. But if they have to wait for the huge script to download every time they return, because it's all or nothing, that will be perceived as bad UX.

If you don't need all 20-30 files right away when the user first loads your application, use a script loader to fetch them in the background.

If you do need all 20-30 files up front, try grouping them into 10 or so bundles, aggregating the files that are most likely to be updated together.

As for gzip compression, your web server should handle this.
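For example, under Apache the "let the web server handle it" approach could be a mod_deflate rule like the following sketch (assuming mod_deflate is enabled; this is illustrative, not from the answer):

```apache
# Compress JavaScript responses on the fly at the server level.
<IfModule mod_deflate.c>
    AddOutputFilterByType DEFLATE application/javascript text/javascript
</IfModule>
```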

+1

Yes, that's a good idea.

What I do is use a custom PHP function like load_script($filename), which adds the file to a "download queue". At the end of the page I call another function, generate_js_script(), which walks through the queued scripts, generates one large .js file (I suppose I could run a minifier at this point), and writes it to disk under a unique file name. The function then emits a <script> tag so the browser knows to load this combined file. This way I don't have to worry about custom header() calls and the other oddities that have caused problems in the past with some browsers and their caching behavior.
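A minimal sketch of that queue approach; the two function names come from the answer, but the internals, the hashing scheme, and the cache/ path are my assumptions:

```php
<?php
// Queue scripts during page rendering, then emit one combined file at the end.
$GLOBALS['js_queue'] = [];

function load_script($filename) {
    $GLOBALS['js_queue'][] = $filename;   // defer: just remember the file
}

function generate_js_script() {
    $files = $GLOBALS['js_queue'];
    // Hashing names + mtimes yields a unique, cache-friendly file name and
    // avoids regenerating the bundle when nothing has changed.
    $key = md5(implode('|', array_map(function ($f) {
        return $f . ':' . filemtime($f);
    }, $files)));
    $bundle = "cache/combined-$key.js";   // hypothetical output location

    if (!file_exists($bundle)) {
        $out = '';
        foreach ($files as $f) {
            $out .= file_get_contents($f) . ";\n";  // minify here if desired
        }
        file_put_contents($bundle, $out);
    }
    echo '<script type="text/javascript" src="' . $bundle . '"></script>';
}
```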

That way, I can just use an .htaccess directive to tell the server to serve all JS files compressed (if the browser supports it), so I don't have to do it in PHP. The trick with this kind of caching is not to regenerate too often; I use a hash of the source files to take care of that.
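The .htaccess part could be sketched like this, assuming Apache with mod_rewrite and mod_headers enabled (the answer doesn't show its actual rules, so these are illustrative): serve foo.js.gz in place of foo.js when the client accepts gzip and a pre-compressed copy exists.

```apache
# Rewrite requests for .js files to a pre-compressed .js.gz, if available.
RewriteEngine On
RewriteCond %{HTTP:Accept-Encoding} gzip
RewriteCond %{REQUEST_FILENAME}.gz -f
RewriteRule ^(.*)\.js$ $1.js.gz [L]

# Make sure the pre-compressed files are sent with the right headers.
<FilesMatch "\.js\.gz$">
    ForceType application/javascript
    Header set Content-Encoding gzip
    Header set Vary Accept-Encoding
</FilesMatch>
```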

+1

Source: https://habr.com/ru/post/1344890/

