How can I import a 72GB dump file into git?

I migrated an old CVS repository to git with cvs2git (part of cvs2svn). The resulting dump file is 72 GB, and my attempts to import it with git fast-import always fail with an out-of-memory error:

fatal: Out of memory, malloc failed (tried to allocate 6196691 bytes)
fast-import: dumping crash report to fast_import_crash_13097
error: git-fast-import died of signal 11

For reference, my system has 32 GB of RAM and 50 GB of swap. I am running the import on Red Hat 5.3 with git 1.8.3.4 (gcc 4.4, Python 2.6.8, cvs2svn 2.4.0). I also tried adjusting the stack size and file descriptor limits, but the out-of-memory error persists.
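
For context, the pipeline is essentially the standard cvs2git one (the paths below are placeholders, not my actual ones):

    cvs2git --blobfile=/tmp/git-blob.dat --dumpfile=/tmp/git-dump.dat \
            --username=cvs2git /path/to/cvs/repo
    git init repo && cd repo
    cat /tmp/git-blob.dat /tmp/git-dump.dat | git fast-import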

Does anyone have any ideas?

2 answers

The idea is as follows: first split the huge CVS repository into several smaller (sub-)repositories, then import each CVS (sub-)repository into its own git repository; a sketch follows below. Since git is distributed rather than centralized, you want each git repository to stay at a reasonable size anyway.
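
A minimal sketch of that approach, assuming the CVS repository is organized into top-level modules (the module names and paths below are made up):

    # hypothetical module names; substitute your repository's real layout
    for module in moduleA moduleB moduleC; do
        # one small dump per module instead of a single 72 GB file
        cvs2git --blobfile=/tmp/$module-blob.dat --dumpfile=/tmp/$module-dump.dat \
                --username=cvs2git /path/to/cvsroot/$module
        # import each module into its own git repository
        git init $module
        cat /tmp/$module-blob.dat /tmp/$module-dump.dat \
            | (cd $module && git fast-import)
    done

Each pass feeds git fast-import only a fraction of the original 72 GB, which should keep its memory use manageable.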


I also ran into the same problem, and it is now resolved. Download the latest version of cvs2svn, which has a fix that significantly reduces the dump size by trimming the metadata it writes. You need cvs2git version 2.5 or later.

(You can view the change made at https://github.com/mhagger/cvs2svn/commit/fd177d0151f00b028b4a0df21e0c8b7096f4246b )
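
If you are unsure which version you have, a quick check (and, if needed, a way to get the latest code from the repository linked above):

    cvs2git --version
    git clone https://github.com/mhagger/cvs2svn.git

Then re-generate the dump with the new version before running git fast-import again.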


Source: https://habr.com/ru/post/1496410/
