Node.js: "process out of memory" error when using JSON.stringify

With Node, I am trying to collect user data from an LDAP server and then write this data to a JSON file. For this, I use the following code:

fs.writeFile('data.json', JSON.stringify(data, null, 4)); 

The problem is that the JSON.stringify method throws the following error:

 FATAL ERROR: JS Allocation failed - process out of memory 

I know that the problem is related to JSON.stringify, because if I use console.log instead of fs.writeFile, I get the same error.

I am trying to write a lot of data (over 500 entries in an LDAP database). Does anyone know how I can make this work? Here is the complete code:

    var ldap = require('ldapjs');
    var util = require('util');
    var fs = require('fs');

    var client = ldap.createClient({
        url: '************'
    });

    client.bind('CN=**********,OU=Users,OU=LBi UK,OU=UK,DC=********,DC=local', '*********', function(err) {
        if (err) {
            console.log(err.name);
        }
    });

    // taken from http://ldapjs.org/client.html
    client.search('OU=Users,OU=******,OU=UK,DC=******,DC=local', {
        scope: 'sub',
        filter: 'objectClass=organizationalPerson', // filter by organizational person
        attributes: ['givenName', 'dn', 'sn', 'title', 'department', 'thumbnailPhoto', 'manager']
    }, function(err, res) {
        if (err) {
            console.log(err.name);
        }

        var limit = 1;
        var data = {"directory": []};

        res.on('searchEntry', function(entry) {
            var obj = {};
            entry.attributes.forEach(function(attribute) {
                var value;
                if (attribute.type === 'thumbnailPhoto') {
                    value = attribute.buffers[0];
                } else {
                    value = attribute.vals[0];
                }
                obj[attribute.type] = value;
            });
            data.directory.push(obj);
        });

        res.on('error', function(err) {
            console.log('error: ' + err.message);
        });

        res.on('end', function(result) {
            fs.writeFile('data.json', JSON.stringify(data, null, 4));
        });
    });
2 answers

As @freakish noted, the problem was that my data was too big.

The reason the data was so large is the number of thumbnail images that were returned as raw Buffer objects. In the end, all I needed to do was encode each image buffer as a base64 string, and then the data size became much more manageable.
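For illustration, the change amounts to encoding the photo buffer as base64 text before pushing it into the data object (the attributeValue helper and the fakePhoto object below are my own names for the sketch, not part of the original code; it assumes attribute.buffers[0] is a Node.js Buffer, as ldapjs provides):

```javascript
// Sketch of the fix: encode the binary photo as a base64 string so
// JSON.stringify serializes compact text instead of a huge byte array.
function attributeValue(attribute) {
    if (attribute.type === 'thumbnailPhoto') {
        return attribute.buffers[0].toString('base64');
    }
    return attribute.vals[0];
}

// Demo with a fake attribute (a real one would come from a searchEntry):
var fakePhoto = {
    type: 'thumbnailPhoto',
    buffers: [Buffer.from([0xff, 0xd8, 0xff])] // first bytes of a JPEG
};
console.log(attributeValue(fakePhoto)); // "/9j/"
```

In the searchEntry handler, obj[attribute.type] = attributeValue(attribute) then replaces the original if/else block.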


Something in your data is probably recursive.

Make sure your data object does not contain any circular references (for example, a stored reference to this or to a parent object), since JSON.stringify cannot serialize those.
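A minimal sketch of what goes wrong with a circular structure (the entry object here is hypothetical; note that in current Node versions JSON.stringify raises a TypeError for this case rather than an out-of-memory error):

```javascript
// JSON.stringify cannot serialize an object that references itself.
var entry = { name: 'someone' };
entry.self = entry; // circular reference

var failed = false;
try {
    JSON.stringify(entry);
} catch (err) {
    failed = true; // TypeError: Converting circular structure to JSON
}
console.log(failed); // true
```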



