I want to process a large number of records (> 400k) in a batch and insert them into a database.
I know how to iterate over an array with for () or underscore.each(), and I also know how to insert a record asynchronously into various (No)SQL databases. That is not the problem - the problem is that I cannot figure out a way to do both at the same time.
The specific database does not matter here; the principle applies to any (No)SQL database with an asynchronous interface.
I am looking for a template to solve the following problem:
Loop Method:
var results = []; // assume this already holds the 400k rows
_.each(results, function(row) {
    var newObj = prepareMyData(row);
    db.InsertQuery(newObj, function(err, response) {
        if (!err) console.log('YAY, inserted successfully');
    });
});
This approach is clearly flawed: it floods the database with insert requests without waiting for any of them to complete. With MySQL adapters that use a connection pool, you exhaust the pool's connections pretty quickly and the script fails.
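To make the failure mode concrete, here is a minimal self-contained simulation (the pool size, row count, and insertQuery stub are all hypothetical, not from any real adapter): the loop fires every request before a single callback has had a chance to run, so everything beyond the pool size is rejected immediately.

```javascript
// Hypothetical simulation of a connection pool being flooded.
var POOL_SIZE = 10;       // pretend the adapter pools 10 connections
var TOTAL_ROWS = 400000;  // roughly the batch size from the question

var inFlight = 0;
var rejected = 0;

// Stand-in for db.InsertQuery: fails synchronously when the pool is full,
// otherwise "completes" on the next event-loop turn.
function insertQuery(cb) {
    if (inFlight >= POOL_SIZE) {
        rejected++;
        return cb(new Error('pool exhausted'));
    }
    inFlight++;
    setImmediate(function() {
        inFlight--;
        cb(null);
    });
}

// The flawed loop: fire every insert without waiting for completion.
// No callback can run until the loop itself has finished, so only the
// first POOL_SIZE requests ever get a connection.
for (var i = 0; i < TOTAL_ROWS; i++) {
    insertQuery(function() {});
}
console.log('rejected:', rejected); // rejected: 399990
```

The key point is that the event loop cannot deliver any completion callback while the synchronous for-loop is still running, so "waiting" has to be expressed through the callbacks themselves.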
Recursion Method:
var results = []; // assume this already holds the 400k rows
var index = 0;
var myRecursion = function()
{
    var row = results[index];
    var data = prepareMyData(row);
    db.InsertQuery(data, function(err, response)
    {
        if (!err)
        {
            console.log('YAY, inserted successfully!');
            index++;
            // only start the next insert once this one has finished
            if (index < results.length) myRecursion();
        }
    });
}
if (results.length > 0) myRecursion();
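For reference, the same one-at-a-time pattern can be sketched in a self-contained, runnable form. The insertQuery stub below stands in for db.InsertQuery, and the sample rows are invented for illustration; prepareMyData is omitted since it is not the point here.

```javascript
// Self-contained sketch of the sequential pattern above, with a stubbed
// asynchronous insert so it runs without a database.
var results = [{ id: 1 }, { id: 2 }, { id: 3 }]; // hypothetical rows
var inserted = [];

// Stand-in for db.InsertQuery: succeeds on the next event-loop turn.
function insertQuery(obj, cb) {
    setImmediate(function() {
        inserted.push(obj);
        cb(null, 'ok');
    });
}

function insertAll(rows, done) {
    var index = 0;
    (function next() {
        if (index >= rows.length) return done(null); // all rows handled
        insertQuery(rows[index], function(err) {
            if (err) return done(err);  // stop on the first failure
            index++;
            next();                     // recurse only after the insert finished
        });
    })();
}

insertAll(results, function(err) {
    console.log(err ? 'failed' : 'inserted ' + inserted.length + ' rows');
});
```

Because each recursive call happens inside an asynchronous callback, the call stack unwinds between inserts, so this does not overflow the stack even for very large arrays - its real drawback is that only one insert is ever in flight at a time.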
This works in the sense that every record is inserted one after the other, but running 400k inserts strictly one at a time is very slow.
Coming from PHP, I am used to writing this kind of loop synchronously; the asynchronous nodejs way of doing it is new to me.
What pattern would you suggest for this?