I would break the result set into smaller pieces using the LIMIT clause (MySQL; I don't know whether other database servers support it). Something like this pseudocode:
long recsToGet = 50000;
long got = recsToGet;
long offset = 0;

while ( got == recsToGet ) {
    got = getNextBatchFromDb( offset );  // fetch the next chunk, return how many rows it held
    writeBatchToCsv();                   // append that chunk to the CSV file
    offset += recsToGet;
}
And I would use LIMIT and OFFSET in the SQL query inside the getNextBatchFromDb() function, as follows:
select * from yourtable LIMIT 50000 OFFSET 100000
where OFFSET is the position to start reading from, and LIMIT is the number of rows to read.
This way you can read your large dataset in small chunks and append to the CSV after each batch. You know that all records have been read when getNextBatchFromDb() returns fewer rows than recsToGet.
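To make the idea concrete, here is a minimal Java/JDBC sketch of the same loop. The connection URL, credentials, the table yourtable (with an id column used for ORDER BY so the OFFSET windows stay stable), and the output file export.csv are all assumptions, and the CSV writing is naive (no quoting or escaping):

import java.io.FileWriter;
import java.io.IOException;
import java.io.PrintWriter;
import java.sql.Connection;
import java.sql.DriverManager;
import java.sql.PreparedStatement;
import java.sql.ResultSet;
import java.sql.ResultSetMetaData;
import java.sql.SQLException;

public class ChunkedCsvExport {

    private static final int BATCH_SIZE = 50000;

    public static void main(String[] args) throws SQLException, IOException {
        // Hypothetical connection details -- adjust URL, user, password, and table name.
        String url = "jdbc:mysql://localhost:3306/mydb";
        try (Connection conn = DriverManager.getConnection(url, "user", "password");
             PrintWriter csv = new PrintWriter(new FileWriter("export.csv"))) {

            // ORDER BY keeps the OFFSET windows consistent between queries.
            String sql = "SELECT * FROM yourtable ORDER BY id LIMIT ? OFFSET ?";
            long offset = 0;
            int got = BATCH_SIZE;

            // Keep fetching until a batch comes back smaller than the requested size.
            while (got == BATCH_SIZE) {
                got = 0;
                try (PreparedStatement ps = conn.prepareStatement(sql)) {
                    ps.setInt(1, BATCH_SIZE);
                    ps.setLong(2, offset);
                    try (ResultSet rs = ps.executeQuery()) {
                        ResultSetMetaData meta = rs.getMetaData();
                        int cols = meta.getColumnCount();
                        while (rs.next()) {
                            StringBuilder line = new StringBuilder();
                            for (int c = 1; c <= cols; c++) {
                                if (c > 1) line.append(',');
                                line.append(rs.getString(c)); // naive CSV: no quoting/escaping
                            }
                            csv.println(line);
                            got++;
                        }
                    }
                }
                offset += BATCH_SIZE;
            }
        }
    }
}

Note that if rows are inserted or deleted while the export runs, OFFSET-based paging can skip or duplicate records; for a strictly growing table, paging by the last seen id (WHERE id > ?) avoids that.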