There is no way to compress objects in place; the data has to be downloaded and re-uploaded. However, gsutil can do this for you, and if you run it from a Google Compute Engine (GCE) virtual machine, you will only pay for the operations, not for bandwidth.
As for setting the Content-Encoding header with setmeta: your interpretation of what happened is correct. You set metadata on the object declaring that it contains gzip data, but the content is not a valid gzip stream, so when you download it with Accept-Encoding: gzip, the GCS service tries to decompress the stream and fails.
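You can reproduce that failure locally. A minimal sketch (the file name app.js is hypothetical): plain bytes fail gzip decompression exactly as the GCS frontend's decompression fails on a mislabeled object, while a genuinely compressed copy decompresses fine.

```shell
#!/bin/sh
set -e
# Plain text, NOT gzip-compressed -- like an object wrongly marked
# Content-Encoding: gzip via setmeta.
echo 'console.log("hi");' > app.js

# Attempting to decompress it fails, as GCS's transcoding does on download.
if gzip -dc app.js >/dev/null 2>&1; then
  echo "valid gzip"
else
  echo "not a gzip stream"
fi

# Actually compressing the bytes first makes the stream valid.
gzip -c app.js > app.js.gz
if gzip -t app.js.gz 2>/dev/null; then
  echo "valid gzip stream"
fi

rm -f app.js app.js.gz
```

To recover the broken objects without re-uploading, you can also clear the bogus header with gsutil setmeta -h "Content-Encoding:" (an empty value removes the header), so the object is served as the plain bytes it actually contains.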
I would suggest downloading the bucket to a local disk on the GCE virtual machine:
gsutil cp -r gs://bucket /path/to/local/disk
Then copy it back, using the -z option to specify which file extensions should be gzip-compressed on upload:
gsutil cp -z js,css,html -r /path/to/local/disk gs://bucket
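To make the effect of -z concrete, here is a rough local sketch of the per-file decision (file names style.css and logo.png are made up for illustration): files whose extension matches the list are gzip-compressed and get Content-Encoding: gzip set, everything else is uploaded verbatim.

```shell
#!/bin/sh
set -e
mkdir -p demo && cd demo
echo 'body { margin: 0 }' > style.css   # matches the -z extension list
printf 'PNGDATA' > logo.png             # does not match; already compressed format

for f in *; do
  case "${f##*.}" in
    js|css|html)
      # What gsutil does for matching extensions: compress before upload
      # and record Content-Encoding: gzip on the object.
      gzip -c "$f" > "$f.gz"
      echo "$f -> gzip-compressed, Content-Encoding: gzip" ;;
    *)
      echo "$f -> uploaded as-is" ;;
  esac
done

cd .. && rm -rf demo
```

After the upload you can confirm the header with gsutil stat gs://bucket/path/file.js, which prints the object's metadata including Content-Encoding.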