Copy an Hg repo with all its large files

We have a big old repository that uses the largefiles extension. I want to replicate the repository to a backup server with a cron script that runs 'hg pull'. However, this command does not fetch the large files.

I currently have 2 GB of history, but I am missing 6 GB of large files. How can I get hg to download these missing files?
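For context, the backup job looks roughly like this (the repository path and schedule are made up for illustration):

    # hypothetical crontab entry: mirror the repo every night at 03:00
    0 3 * * * cd /backup/bigrepo && hg pull -q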

1 answer

By default, largefiles are only downloaded for the revision you update to.

'hg help largefiles' says:

 When you pull a changeset that affects largefiles from a remote repository, the largefiles for the changeset will by default not be pulled down. However, when you update to such a revision, any largefiles needed by that revision are downloaded and cached (if they have never been downloaded before). One way to pull largefiles when pulling is thus to use --update, which will update your working copy to the latest pulled revision (and thereby downloading any new largefiles). If you want to pull largefiles you don't need for update yet, then you can use pull with the "--lfrev" option or the "hg lfpull" command. 
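In other words, if the backup only needs the largefiles for the revision it updates to, either of the forms described in the help text above would do (the 'pulled()' revset is provided by the largefiles extension for exactly this use):

    hg pull --update              # pull, then update; fetches largefiles needed by the new revision
    hg pull --lfrev "pulled()"    # pull and also download largefiles for all pulled changesets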

For this purpose you should use 'hg lfpull --rev "all()"'.
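Put together, a cron-friendly backup script might look like this sketch (the repository path is hypothetical):

    #!/bin/sh
    # mirror the repository, then fetch every largefile for every revision
    cd /backup/bigrepo || exit 1
    hg pull -q
    hg lfpull --rev "all()"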


Source: https://habr.com/ru/post/974319/
