Another alternative would be to redirect everything to a PHP script that gzips and caches the files on the fly. For each request, it compares the timestamp of the cached version against the original file and regenerates the cache whenever the source is newer. With PHP you can also set the HTTP response headers, so the result is handled properly by the client, as if it had been gzipped by Apache itself.
Something like this might do the job for you:
.htaccess:
RewriteEngine On
RewriteRule ^(css/styles.css)$ cache.php?file=$1 [L]
cache.php:
&lt;?php
// Convert the request path to a local file path (may need to be tweaked).
// Basic sanity check to block path traversal (../) attacks.
$file = $_GET['file'];
if (strpos($file, '..') !== false || $file[0] === '/') {
    header('HTTP/1.1 400 Bad Request');
    exit;
}

cache($file);

// Return cached or raw file (autodetect)
function cache($file)
{
    // Regenerate the cache if the source file is newer
    if (!is_file($file.'.gz') or filemtime($file.'.gz') < filemtime($file)) {
        write_cache($file);
    }

    // If the client supports GZIP, send the compressed data.
    // NB: you may also want to send a matching Content-Type header,
    // since PHP defaults to text/html.
    if (!empty($_SERVER['HTTP_ACCEPT_ENCODING']) and strpos($_SERVER['HTTP_ACCEPT_ENCODING'], 'gzip') !== false) {
        header('Content-Encoding: gzip');
        readfile($file.'.gz');
    } else {
        // Fall back to the plain file
        readfile($file);
    }
    exit;
}

// Save a GZIPed version of the file
function write_cache($file)
{
    copy($file, 'compress.zlib://'.$file.'.gz');
}
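If you would rather pre-generate the compressed copies (e.g. from cron) instead of letting PHP build them on first request, the same "regenerate only when stale" rule can be sketched in shell. The file path and contents below are just placeholders for illustration:

```shell
# Sketch: pre-generate the .gz cache from the command line, using the
# same staleness check as the PHP script. Path is an example only.
src="css/styles.css"
mkdir -p "$(dirname "$src")"
printf 'body { color: red; }\n' > "$src"    # stand-in source file

# Regenerate the compressed copy only if it is missing or older
if [ ! -f "$src.gz" ] || [ "$src" -nt "$src.gz" ]; then
    gzip -9 -c "$src" > "$src.gz"
fi

gzip -t "$src.gz" && echo "cache OK"
```

Note that `compress.zlib://` in the PHP script writes the same gzip format that `gzip` produces, so either side can refresh the cache.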
To create the cached versions, the Apache user needs write permission in the directory. You can modify the script a bit to store the cached files elsewhere.
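One way to set that up is a dedicated cache directory outside the document root that the server user can write to. A minimal sketch, assuming the Debian/Ubuntu convention of a `www-data` user (adjust for your distribution):

```shell
# Sketch: give the web server a writable cache directory instead of
# letting it write next to the source files.
mkdir -p cache
# chown www-data:www-data cache   # requires root; run once at deploy time
chmod 775 cache                   # owner and group may write
[ -w cache ] && echo "cache dir writable"
```

You would then change `write_cache()` to build its `.gz` path inside that directory rather than alongside the original file.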
This has not been thoroughly tested and may need to be changed a little for your needs, but the idea is all there and should be enough to get you started.