I am trying to export an entire SQLite3 database table to CSV using knex.js. Since the table can contain up to 300,000 rows, I use streams to avoid memory problems. But when I look at my application's memory usage, it climbs to over 800 MB, or I hit an "out of memory" error.
How can I handle a large query result with knex.js on an SQLite3 database?
Below is the sample code:
```js
var fs = require('fs');
var stringify = require('csv-stringify');

knex.select().from(table).stream(function (stream) {
  var stringifier = stringify(opts);
  var fileStream = fs.createWriteStream(file);
  var i = 0;
  stringifier.on('readable', function () {
    var row;
    while ((row = stringifier.read())) {
      fileStream.write(row);
      console.log("row " + i++);
    }
  });
  // feed query rows into the CSV stringifier
  stream.pipe(stringifier);
});
```
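As a side note, the same pipeline can be wired more simply by piping the knex stream through the stringifier, letting Node's backpressure handle buffering. This is only a sketch, assuming `csv-stringify` and the same `table`, `file`, and `opts` as above, and as the EDIT below shows, it does not by itself fix the memory problem on sqlite3:

```js
var fs = require('fs');
var stringify = require('csv-stringify');

var fileStream = fs.createWriteStream(file);

knex.select().from(table).stream(function (stream) {
  // Rows flow query -> CSV stringifier -> file; each .pipe() call
  // propagates backpressure so no stage buffers unboundedly.
  stream.pipe(stringify(opts)).pipe(fileStream);
});
```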
EDIT
It seems that the knex.js streams for the sqlite3 dialect are "fake" streams. Below is the source code of knex's stream function for sqlite3:
```js
Runner_SQLite3.prototype._stream = Promise.method(function(sql, stream, options) {
  var runner = this;
  return new Promise(function(resolver, rejecter) {
    stream.on('error', rejecter);
    stream.on('end', resolver);
    return runner.query(sql).map(function(row) {
      stream.write(row);
    }).catch(function(err) {
      stream.emit('error', err);
    }).then(function() {
      stream.end();
    });
  });
});
```
We can see that it waits for the whole query to complete, materializes the full result array in memory, and only then writes the rows to the stream one by one, so streaming buys nothing for sqlite3.
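Until the sqlite3 dialect streams rows as they arrive, one way to keep memory bounded is to page through the table with knex's `limit`/`offset` so only one chunk of rows is in memory at a time. This is a minimal workaround sketch, not knex's own solution; `CHUNK_SIZE` is a hypothetical tuning parameter, and `table`, `file`, and `opts` are assumed to be defined as above:

```js
var fs = require('fs');
var stringify = require('csv-stringify');

var CHUNK_SIZE = 1000; // hypothetical page size; tune to taste

var stringifier = stringify(opts); // assumes `opts` maps row objects to CSV columns
var fileStream = fs.createWriteStream(file);
stringifier.pipe(fileStream);

function fetchChunk(offset) {
  // NOTE: in practice, add .orderBy() on a key column so pages are stable.
  return knex.select().from(table)
    .limit(CHUNK_SIZE)
    .offset(offset)
    .then(function (rows) {
      rows.forEach(function (row) {
        stringifier.write(row);
      });
      if (rows.length === CHUNK_SIZE) {
        // There may be more rows; fetch the next page.
        return fetchChunk(offset + CHUNK_SIZE);
      }
      stringifier.end(); // flushes the remaining CSV output to the file
    });
}

fetchChunk(0);
```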
VERSION:
Thanks for your help.