I was wondering whether it is possible to measure the time it takes to complete an HTTP request using node.js. By slightly modifying the example from the documentation (here), you can easily write the following code.
var http = require('http');

var stamp1 = new Date();
var stamp2, stamp3, stamp4;

var options = {
  hostname: 'www.google.com',
  port: 80,
  path: '/upload',
  method: 'POST'
};

var req = http.request(options, function (res) {
  stamp3 = new Date();
  console.log('STATUS: ' + res.statusCode);
  console.log('HEADERS: ' + JSON.stringify(res.headers));
  res.setEncoding('utf8');
  res.on('data', function (chunk) {
    console.log('BODY: ' + chunk);
  });
  res.on('end', function () {
    stamp4 = new Date();
    console.log('Stamp 3: ' + stamp3);
    console.log('Stamp 4: ' + stamp4);
  });
});

req.on('error', function (e) {
  console.log('problem with request: ' + e.message);
});

// Write the request body (placeholder payload) and finish the request,
// as in the documentation example; stamp2 marks the moment the last
// chunk was handed off.
req.write('data\n');
req.end();
stamp2 = new Date();
Now let me come to my point. Inside the response callback you can easily measure the time needed to receive the response, since stamp3 is set at the beginning and stamp4 at the end. So, at least in principle, for relatively large amounts of data these two timestamps will differ.
However, the question I have is whether stamps 1 and 2 really measure what happens while the request is being prepared and sent. In other words, is req.write(...) a synchronous operation? Based on the principles of node.js, I would expect req.write(...) to be asynchronous: you can hand it an arbitrarily large document, and then receive a callback on successful completion, letting you know the request has been sent.
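For what it's worth, here is an untested sketch of how one might probe this, assuming a Node version where req.write() accepts a completion callback (per the http.ClientRequest documentation, the callback fires once the chunk has been flushed, and the request emits 'finish' when the whole body has been handed to the OS). The host and payload below are just placeholders:

var http = require('http');

var options = {
  hostname: 'www.google.com',
  port: 80,
  path: '/upload',
  method: 'POST'
};

var req = http.request(options, function (res) {
  res.resume(); // drain the response; only request-side timing matters here
});
req.on('error', function (e) {
  console.log('problem with request: ' + e.message);
});

var beforeWrite = new Date();

// write() returns false if the chunk was buffered in user memory rather
// than flushed; the callback fires once the data has actually been flushed.
var flushed = req.write('some large payload', 'utf8', function () {
  console.log('chunk flushed after ' + (new Date() - beforeWrite) + ' ms');
});
console.log('write() returned ' + flushed + ' (false means it was buffered)');

// 'finish' fires when the entire request body has been handed to the OS.
req.on('finish', function () {
  console.log('request fully sent after ' + (new Date() - beforeWrite) + ' ms');
});

req.end();

If write() returns false, the chunk was merely buffered, so a timestamp taken right after the call (like stamp2 above) would measure buffering rather than actual transmission.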
Comments?