Node.js/Express: iterate directory files with fs into an array or object

I'm trying to use the Node.js fs module in my Express application to iterate over a directory, store each file name in an array, and pass that array to my Express view so I can loop over the list there, but I'm struggling to do this. When I do console.log inside the files.forEach loop, it prints each file name fine, but as soon as I try something like:

    var myfiles = [];
    var fs = require('fs');

    fs.readdir('./myfiles/', function (err, files) {
      if (err) throw err;
      files.forEach(function (file) {
        myfiles.push(file);
      });
    });

    console.log(myfiles);

it does not work; it just logs an empty array. I don't know exactly what is happening. I think it is connected with callback functions, but if someone could walk me through what I am doing wrong, why it doesn't work, and how to make it work, it would be much appreciated.

+6
5 answers

The myfiles array is empty because the callback has not been called yet by the time console.log() runs.

You need to do something like:

    var fs = require('fs');

    fs.readdir('./myfiles/', function (err, files) {
      if (err) throw err;
      files.forEach(function (file) {
        // do something with each file HERE!
      });
    });
    // because trying to do something with files here won't work because
    // the callback hasn't fired yet.

Remember that in node (almost) everything happens asynchronously, in the sense that if you do not do your processing inside your callbacks, you cannot be sure the asynchronous functions have completed yet.
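For instance, here is a tiny sketch (not from the original answer) that makes the ordering visible; the point is the order in which the lines are printed:

    var fs = require('fs');

    console.log('before readdir');

    fs.readdir('.', function (err, files) {
      if (err) throw err;
      console.log('inside callback:', files.length, 'entries');
    });

    console.log('after readdir'); // prints before the callback's line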

One way to solve this problem is to use an EventEmitter:

    var fs = require('fs'),
        EventEmitter = require('events').EventEmitter,
        filesEE = new EventEmitter(),
        myfiles = [];

    // this event will be called when all files have been added to myfiles
    filesEE.on('files_ready', function () {
      console.dir(myfiles);
    });

    // read all files from the current directory
    fs.readdir('.', function (err, files) {
      if (err) throw err;
      files.forEach(function (file) {
        myfiles.push(file);
      });
      filesEE.emit('files_ready'); // trigger files_ready event
    });
+30

fs.readdir is asynchronous (like many operations in node.js). This means that the console.log line will run before readdir has even called the function passed to it.

You need to either:

Put the console.log line inside the callback function passed to readdir, i.e.:

    fs.readdir('./myfiles/', function (err, files) {
      if (err) throw err;
      files.forEach(function (file) {
        myfiles.push(file);
      });
      console.log(myfiles);
    });

Or just do whatever processing you need with each file directly inside the forEach callback.
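For example, here is a minimal sketch of that second option wired into the Express use case from the question. The route path, the port, and the JSON response are assumptions for illustration only; in a real app you would likely res.render your view instead:

    var fs = require('fs');
    var express = require('express');
    var app = express();

    app.get('/', function (req, res) {
      fs.readdir('./myfiles/', function (err, files) {
        if (err) return res.status(500).send(err.message);
        files.forEach(function (file) {
          console.log(file); // per-file processing goes here
        });
        // respond only inside the callback, once the file list exists
        res.json({ myfiles: files });
      });
    });

    app.listen(3000);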

+5

As already mentioned, you are using an async method, so you have a non-deterministic execution order.

However, there is an easy way around this. Just use the Sync version of the method:

    var myfiles = [];
    var fs = require('fs');

    var arrayOfFiles = fs.readdirSync('./myfiles/');

    // Yes, the following is not super-smart, but you might want to
    // process the files. This is how:
    arrayOfFiles.forEach(function (file) {
      myfiles.push(file);
    });

    console.log(myfiles);

This should work the way you want. However, the synchronous variants block the event loop and are generally considered bad practice, so you should only use them when blocking is acceptable (for example, in a one-off script or during startup).

Read more here: fs.readdirSync
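If blocking is a concern but you still want a flat, sequential style, a possible alternative (assuming a Node.js version of 10 or later, which is newer than most of this thread) is the promise-based API:

    // non-blocking sketch using the promise-based API
    // (fs.promises is available since Node.js 10)
    const fsp = require('fs').promises;

    async function listFiles(dir) {
      const files = await fsp.readdir(dir); // already an array of names
      console.log(files);
      return files;
    }

    listFiles('./myfiles/').catch(function (err) {
      console.error(err);
    });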

+3

"I think it is connected with callback functions"

That's right.

fs.readdir makes an asynchronous request to the file system for this information and calls your callback later with the results.

So function (err, files) { ... } does not start immediately, but console.log(myfiles) does.

At some later point in time, myfiles will contain the required information.

You should note, by the way, that files is already an array, so it makes no sense to manually push each element into some other empty array. If the idea is to collect results from multiple calls, use .concat; if you just want the data from a single call, you can assign myfiles = files directly.
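For instance, a minimal sketch of both variants (using the directory path from the question):

    var fs = require('fs');
    var myfiles = [];

    fs.readdir('./myfiles/', function (err, files) {
      if (err) throw err;

      // single call: just take the array as-is
      myfiles = files;

      // collecting across several calls: append instead
      // myfiles = myfiles.concat(files);

      console.log(myfiles);
    });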

In general, you should really read up on Continuation Passing Style.

+2

I ran into the same problem and, based on the answers provided in this post, solved it with Promises, which seem perfect in this situation:

    router.get('/', (req, res) => {
      var viewBag = {}; // it's just a little habit of mine from .NET MVC ;)

      var readFiles = new Promise((resolve, reject) => {
        fs.readdir('./myfiles/', (err, files) => {
          if (err) {
            reject(err);
          } else {
            resolve(files);
          }
        });
      });

      // showcase, just in case you need to run more async operations
      // before the route responds
      var anotherPromise = new Promise((resolve, reject) => {
        doAsyncStuff((err, anotherResult) => {
          if (err) {
            reject(err);
          } else {
            resolve(anotherResult);
          }
        });
      });

      Promise.all([readFiles, anotherPromise]).then((values) => {
        viewBag.files = values[0];
        viewBag.otherStuff = values[1];
        console.log(viewBag.files); // logs e.g. [ 'file.txt' ]
        res.render('your_view', viewBag);
      }).catch((errors) => {
        // you can use the 'errors' property to render errors in the view
        // or implement a different error handling scheme
        res.render('your_view', { errors: errors });
      });
    });

Note: you don't need to push the found files into a new array, because you already get an array from the fs.readdir() callback. According to the node docs:

The callback gets two arguments (err, files), where files is an array of the names of the files in the directory excluding "." and "..".

I believe this is a very elegant and convenient solution and, most importantly, it does not require installing and requiring any extra modules in your script.

0
