I need to sort a 10 GB file containing a list of numbers as quickly as possible, using only 100 MB of memory. My plan is to break the file into pieces, sort each piece, and then merge them.
I am currently using C FILE pointers, as they are faster than C++ streams (at least on my system).
My code works fine on a 1 GB test file, but it causes a segmentation fault on the first fscanf after opening the 10 GB file.
FILE *fin;
FILE *fout;
int i, temp;

fin = fopen( filename, "r" );
if ( fin == NULL ) {    /* fopen can fail on huge files, e.g. in a 32-bit build */
    perror( filename );
    return 1;
}
while( 1 ) {
    /* fscanf returns the number of items converted; testing for == 1
       also stops on a parse failure, not only on EOF */
    for( i = 0; i < MAX && fscanf( fin, "%d", &temp ) == 1; i++ ) {
        v[i] = temp;
    }
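The per-chunk step I have in mind (fill a buffer, sort it, write it out as one sorted run) could look roughly like this; `write_sorted_chunk`, `cmp_int`, and the value of `MAX` are illustrative names of my own, not part of the code above:

```c
#include <stdio.h>
#include <stdlib.h>

#define MAX 1000  /* chunk capacity; in practice sized to fit the 100 MB budget */

static int cmp_int(const void *a, const void *b) {
    int x = *(const int *)a, y = *(const int *)b;
    return (x > y) - (x < y);   /* avoids overflow that plain x - y can cause */
}

/* Read up to MAX ints from fin, sort them in memory, and write one
   sorted chunk to fout. Returns the number of ints written (0 at EOF). */
int write_sorted_chunk(FILE *fin, FILE *fout) {
    static int v[MAX];
    int n = 0, temp;
    while (n < MAX && fscanf(fin, "%d", &temp) == 1)
        v[n++] = temp;
    qsort(v, n, sizeof v[0], cmp_int);
    for (int i = 0; i < n; i++)
        fprintf(fout, "%d\n", v[i]);
    return n;
}
```

Calling this in a loop, each time opening a fresh temporary file for the chunk, until it returns 0 would produce the set of sorted runs to merge afterwards.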
What should I use instead? And do you have any suggestions on the best way to do this?
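For the "combine" step, what I am describing is a k-way merge of the sorted chunk files. A minimal sketch of that step, where `merge_chunks` and its signature are my own illustration rather than existing code:

```c
#include <stdio.h>
#include <stdlib.h>

/* Merge nchunks sorted text files (one int per entry) into fout.
   Uses a linear scan to find the smallest head; for many chunks a
   min-heap over the heads would be the better choice. */
void merge_chunks(FILE **chunk, int nchunks, FILE *fout) {
    int *head  = malloc(nchunks * sizeof *head);   /* current front value of each chunk */
    int *alive = malloc(nchunks * sizeof *alive);  /* 1 while the chunk still has data */
    for (int i = 0; i < nchunks; i++)
        alive[i] = (fscanf(chunk[i], "%d", &head[i]) == 1);

    for (;;) {
        int min = -1;
        for (int i = 0; i < nchunks; i++)
            if (alive[i] && (min < 0 || head[i] < head[min]))
                min = i;
        if (min < 0) break;                        /* every chunk is exhausted */
        fprintf(fout, "%d\n", head[min]);
        alive[min] = (fscanf(chunk[min], "%d", &head[min]) == 1);
    }
    free(head);
    free(alive);
}
```

Because only one value per chunk is held in memory at a time, the merge stays well inside the 100 MB budget regardless of the total file size.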