Editing a 10 GB file using limited main memory in C/C++

I need to sort a 10 GB file containing a list of numbers as quickly as possible, using only 100 MB of memory. My plan is to break the file into pieces, sort each piece, and then merge them.
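A minimal sketch of the in-memory phase of that plan, assuming each chunk of up to MAX numbers is sorted with qsort before being written to a temporary run file (`cmp_int` and `sort_chunk` are hypothetical helper names, not from the original code):

```c
#include <stdlib.h>

/* Comparator for qsort: ascending ints.
   (x > y) - (x < y) avoids the overflow that x - y can cause. */
static int cmp_int(const void *a, const void *b)
{
    int x = *(const int *)a, y = *(const int *)b;
    return (x > y) - (x < y);
}

/* Sort one in-memory chunk; each sorted chunk would then be
   written out as a run file for the later merge phase. */
static void sort_chunk(int *v, size_t n)
{
    qsort(v, n, sizeof v[0], cmp_int);
}
```

With 100 MB of memory, each chunk can hold roughly 25 million 4-byte ints, so a 10 GB file of text numbers yields on the order of a few hundred runs to merge.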

I am currently using C file pointers, as they are faster than C++ streams (at least on my system).

I tested with a 1 GB file and my code works fine, but with the 10 GB file I get a segmentation fault as soon as I call fscanf after opening it.

FILE *fin;
FILE *fout;
fin = fopen( filename, "r" );
while( 1 ) {
    // throws the error here
    for( i = 0; i < MAX && ( fscanf( fin, "%d", &temp ) != EOF ); i++ ) {
        v[i] = temp;
    }

What should I use instead?

And do you have any suggestions on the best way to do this?

1 answer

What you are describing is an external merge sort; Google that term for the standard algorithm.

On Unix, you can also simply use the sort utility, which already implements an external sort for you (e.g. `sort -n bigfile -o sorted`).

BTW, the segfault is most likely because fopen fails on files larger than 2 GB when the program is built without large-file support, leaving fin NULL; check the return value of fopen before calling fscanf.
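The merge phase of an external merge sort can be sketched like this; `merge_runs` is a hypothetical helper that k-way merges sorted run files stored one number per line. It linearly scans the current head of each run, which is fine for a handful of runs; a binary heap is the usual choice when there are many:

```c
#include <stdio.h>

#define MAX_RUNS 64

/* Merge k sorted runs (one int per line) into `out`.
   head[i] holds the front value of run i; alive[i] is 1
   while run i still has unread data. */
static void merge_runs(FILE **runs, int k, FILE *out)
{
    int head[MAX_RUNS];
    int alive[MAX_RUNS];

    if (k > MAX_RUNS)
        return;                       /* sketch only handles up to MAX_RUNS */

    for (int i = 0; i < k; i++)
        alive[i] = (fscanf(runs[i], "%d", &head[i]) == 1);

    for (;;) {
        int best = -1;
        for (int i = 0; i < k; i++)   /* pick the smallest live head */
            if (alive[i] && (best < 0 || head[i] < head[best]))
                best = i;
        if (best < 0)
            break;                    /* all runs exhausted */
        fprintf(out, "%d\n", head[best]);
        alive[best] = (fscanf(runs[best], "%d", &head[best]) == 1);
    }
}
```

Note that fscanf's return value is compared against 1 (the number of items matched), not EOF; that also guards against non-numeric garbage in a run file.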


Source: https://habr.com/ru/post/1788823/
