Manual alternative to mod_deflate

Say I don't have mod_deflate compiled into Apache, and I don't want to recompile now. What are the disadvantages of a manual approach, for example something like this:

 AddEncoding x-gzip .gz
 RewriteCond %{HTTP_ACCEPT_ENCODING} gzip
 RewriteRule ^/css/styles.css$ /css/styles.css.gz

(Note: I know that the specifics of this RewriteCond would need to be tweaked slightly.)
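For reference, a tightened-up version of those rules might look something like this (a sketch only, assuming mod_rewrite and mod_headers are available; the css/ path is just the example above):

 AddEncoding x-gzip .gz

 RewriteEngine On
 # Only rewrite when the client advertises gzip support...
 RewriteCond %{HTTP:Accept-Encoding} gzip
 # ...and a precompressed copy actually exists on disk
 RewriteCond %{REQUEST_FILENAME}.gz -f
 RewriteRule ^(css/.+\.css)$ $1.gz [L]

 # Keep the Content-Type as CSS, and let caches know the
 # response varies by Accept-Encoding
 <FilesMatch "\.css\.gz$">
     ForceType text/css
     Header append Vary Accept-Encoding
 </FilesMatch>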

+4

3 answers

Another alternative would be to redirect everything to a PHP script that gzips and caches the files on the fly. On each request it compares timestamps with the cached version and returns the cached copy if it is newer than the original file. With PHP you can also set the HTTP response headers, so the result is handled properly, as if it had been gzipped by Apache itself.

Something like this might do the job for you:

.htaccess

 RewriteEngine On
 RewriteRule ^(css/styles.css)$ cache.php?file=$1 [L]

cache.php:

 <?php
 // Convert the request to a local file path and reject anything
 // suspicious (this check may need to be tweaked for your layout)
 $file = isset($_GET['file']) ? ltrim($_GET['file'], '/') : '';
 if ($file === '' or strpos($file, '..') !== false or !is_file($file)) {
     header('HTTP/1.1 404 Not Found');
     exit;
 }

 // Return cached or raw file (autodetect)
 cache($file);

 function cache($file)
 {
     // Regenerate the cache if the source file is newer
     if (!is_file($file.'.gz') or filemtime($file.'.gz') < filemtime($file)) {
         write_cache($file);
     }

     // You may also want to send the correct Content-Type here,
     // e.g. header('Content-Type: text/css'); for stylesheets

     // If the client supports GZIP, send the compressed data
     if (!empty($_SERVER['HTTP_ACCEPT_ENCODING'])
         and strpos($_SERVER['HTTP_ACCEPT_ENCODING'], 'gzip') !== false) {
         header('Content-Encoding: gzip');
         readfile($file.'.gz');
     } else {
         // Fall back to the uncompressed file
         readfile($file);
     }
     exit;
 }

 // Save a GZIPed version of the file
 function write_cache($file)
 {
     copy($file, 'compress.zlib://'.$file.'.gz');
 }

For the cached versions to be created, Apache needs write permission in the directory. You can modify the script a bit to store the cached files elsewhere.
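If you do want them elsewhere, a minimal sketch of that change (the cache/ directory and the cache_path() helper are hypothetical, not part of the script above; cache() and write_cache() would then use it instead of $file.'.gz'):

 <?php
 // Hypothetical helper: keep the compressed copies in a separate,
 // Apache-writable cache/ directory instead of next to the sources
 function cache_path($file)
 {
     return 'cache/'.md5($file).'.gz';
 }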

This has not been thoroughly tested and may need some tweaking for your needs, but the idea is all there and should be enough to get you started.

+3

There is no big difference in performance between the manual and automatic approaches. I ran several Apache benchmarks with both automatic and manual compression, and the timings were within 4% of each other.

The obvious downside is that you will have to compress the CSS files manually before deployment. Another thing to be very sure about is that the configuration is correct: I could not get wget to decode the CSS automatically when I tried the manual approach, and ab also reported the compressed size of the data instead of the uncompressed one, unlike with automatic compression.
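If you want to check the headers by hand, something along these lines should work (hostname and path are placeholders; note that neither curl nor wget sends an Accept-Encoding header unless you ask):

 # Request gzip explicitly and inspect only the response headers
 curl -sI -H 'Accept-Encoding: gzip' http://localhost/css/styles.css

 # wget needs the header too; it saves the raw (still compressed) bytes
 wget -S --header='Accept-Encoding: gzip' -O styles.css.gz http://localhost/css/styles.css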

+1

You can also use mod_ext_filter and pipe the output through gzip. In fact, this is one example:

  # mod_ext_filter directive to define the external filter
  ExtFilterDefine gzip mode=output cmd=/bin/gzip

  <Location /gzipped>
      # core directive to cause the gzip filter to be
      # run on output
      SetOutputFilter gzip

      # mod_headers directive to add the
      # "Content-Encoding: gzip" header field
      Header set Content-Encoding gzip
  </Location>

The advantage of this is that it is really easy... The disadvantage is that there is an additional fork() and exec() for each request, which will obviously have some impact on performance.
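If that cost matters, ExtFilterDefine's intype= parameter lets you narrow where the filter runs. A sketch that only compresses CSS responses might look like this (the gzipcss name and /css location are placeholders; like the example above, it sets Content-Encoding unconditionally, without checking the client's Accept-Encoding):

  # Run the external gzip filter only on text/css responses,
  # so the extra fork()/exec() is not paid on every request
  ExtFilterDefine gzipcss mode=output cmd=/bin/gzip intype=text/css

  <Location /css>
      SetOutputFilter gzipcss
      Header set Content-Encoding gzip
      Header append Vary Accept-Encoding
  </Location>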

+1

Source: https://habr.com/ru/post/1303481/