I wrote a little script using the async package to insert many JSON files into a MongoDB cluster. This is my first time using this module (and I'm still learning Node.js), so I do not know if I am doing this correctly.
- The code is the last part of a waterfall (1): the previous functions end with an object that has db, coll and files properties. The files array contains hundreds of file paths, and a function is applied to each element of the array, which is itself another waterfall (2).
- Waterfall (2) consists of three steps: read, parse, insert. When this inner waterfall ends (3), I call complete to finish processing one element of the array, passing along an error (if any).
So far so good, right?
I do not understand what is going on inside the async.eachLimit callback (4). From the documentation:
A callback that is called after all the iterator functions have completed or an error has occurred.
That is, when all iterator functions have finished, calling next() (5) completes the script. But according to the documentation, the same callback (4) is also called as soon as an error occurs. So my script stops when a single file fails.
How can I avoid this?
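What I am considering: instead of passing each file's error up to complete (which makes eachLimit abort the whole batch), log or collect the error and call complete() with no argument, so the remaining files are still processed. A minimal sketch of that idea without the async library (processAll and the file names are hypothetical stand-ins for the real read/parse/insert waterfall):

```javascript
// Sketch: record per-item errors instead of propagating them,
// so one bad file does not abort the batch.
function processAll(files, processFile, done) {
  var errors = [];
  var pending = files.length;
  files.forEach(function (file) {
    processFile(file, function (err) {
      if (err) errors.push(err);          // record the error, don't abort
      if (--pending === 0) done(errors);  // report all failures at the end
    });
  });
}

// Example run: one file "fails", the other two still get processed.
processAll(['a.json', 'bad.json', 'c.json'], function (file, cb) {
  cb(file === 'bad.json' ? new Error('cannot parse ' + file) : null);
}, function (errors) {
  console.log('finished with ' + errors.length + ' failure(s)');
});
```

Is this the right approach with async, or is there a built-in way to keep iterating after an error?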
var async = require('async');
var fs = require('fs');

async.waterfall([ // 1
    // ...
    function (obj, next) {
        async.eachLimit(obj.files, 1000, function (file, complete) {
            async.waterfall([ // 2
                function (next) {
                    // Read the file from disk
                    fs.readFile(file, {}, function (err, data) {
                        next(err, data);
                    });
                },
                function (data, next) {
                    // Parse (assuming all well formed)
                    next(null, JSON.parse(data));
                },
                function (doc, next) {
                    // Insert into the collection
                    obj.coll.insert(doc, {w: 1}, function (err, doc) {
                        next(err);
                    });
                }
            ], function (err, result) { // 3
                complete(err);
            });
        }, function (err) { // 4
            if (err) console.error(err);
            next(null, obj); // 5
        });
    }
], function (err, obj) { // Waterfall end
    if (err) console.error(err);
    obj.db.close(); // Always close the connection
});
— gremo