Gzip: stdout: file too large when running the backup script

I created a scheduled backup script that backs up only specific files and folders.

tar -zcf $DIRECTORY/var.www.tar.gz /var/www
tar -zcf $DIRECTORY/development.tar.gz /development
tar -zcf $DIRECTORY/home.tar.gz /home
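
For reference, a minimal sketch of what the complete script might look like, assuming $DIRECTORY is the backup destination (the variable name comes from the snippet above; the destination path and the existence check are my additions):

#!/bin/sh
# Hypothetical backup destination; adjust to the real mount point.
DIRECTORY=/mnt/backup

# Stop early if the destination is missing so tar does not write to a bad path.
[ -d "$DIRECTORY" ] || { echo "backup directory missing" >&2; exit 1; }

tar -zcf "$DIRECTORY/var.www.tar.gz" /var/www
tar -zcf "$DIRECTORY/development.tar.gz" /development
tar -zcf "$DIRECTORY/home.tar.gz" /home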

The script runs for about 30 minutes and then gives me the following error:

gzip: stdout: File too large

Are there other ways I can back up my files with a shell script, or another way to solve this problem? I am grateful for any help.

+3
3 answers

"File too large" is an error message from your libc: the output has exceeded the file size limit of your file system.

So this is not a gzip problem.

Options: use a different file system or use split:

tar czf - www | split -b 1073741824 - www-backup.tar.

creates a backup split into 1 GiB (1073741824-byte) pieces named www-backup.tar.aa, www-backup.tar.ab, and so on.

To restore:

cat www-backup.tar.* | gunzip -c | tar xvf -
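
Applied to the paths from the question, the same idea might look like this (a sketch only; the 1 GiB chunk size is taken from the answer above, and $DIRECTORY as the destination is an assumption):

tar czf - /var/www | split -b 1073741824 - "$DIRECTORY/var.www-backup.tar.gz."
tar czf - /home | split -b 1073741824 - "$DIRECTORY/home-backup.tar.gz."

# Restore one of the archives later:
cat "$DIRECTORY"/var.www-backup.tar.gz.* | gunzip -c | tar xvf -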
+10

What file system is the backup being written to?

On FAT32, for example, the maximum file size is about 4 GB, so the archive cannot grow beyond that.

That would also explain why the error appears only after about 30 minutes: that is roughly how long it takes the archive to reach the limit.
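
On Linux, a quick way to check which file system the backup target uses (assuming the archives are written to $DIRECTORY from the question):

df -T "$DIRECTORY"   # prints the file system type (ext4, vfat, ...) of the backup destination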

+2

Use another compression utility, for example compress or bzip2.
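
For example, tar can invoke bzip2 directly with -j; note this only helps if the smaller archive stays under the file size limit (a sketch using one of the paths from the question):

tar -cjf "$DIRECTORY/var.www.tar.bz2" /var/www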

0

Source: https://habr.com/ru/post/1742485/

