I performed the same addition in PHP and NodeJS. PHP calculated the result correctly up to 1 billion iterations, but NodeJS was correct only up to 100 million iterations.
This is the PHP code:
<?php
$start_time = time();
$j = 0;
for ($i = 0; $i <= 1000000000; $i++) {
    $j += $i;
}
$end_time = time();
echo "Time Taken: " . ($end_time - $start_time);
echo "\n";
echo "i: " . $i;
echo "\n";
echo "j: " . $j;
echo "\n";
?>
It returned the following output:
Time Taken: 15
i: 1000000001
j: 500000000500000000
This is the NodeJS code:
var epoch = require('epoch.js');
var start_time = epoch().format('ss');
var i;
var j = 0;
for (i = 0; i <= 1000000000; i++) {
    j += i;
}
var end_time = epoch().format('ss');
var time_taken = end_time - start_time;
console.log("Time Taken: " + time_taken + " \ni: " + i + "\nj: " + j);
It returned the following output:
Time Taken: 1
i: 1000000001
j: 500000000067109000
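For reference, the closed-form sum 1000000000 * 1000000001 / 2 = 500000000500000000 matches the PHP result exactly, while the NodeJS result is short by 432891000.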
Why is the variable j incorrect in NodeJS?
Edit:
As pointed out by @SalmanA (and confirmed by me), I would also like to ask: why does the maximum integer limit stay the same for JavaScript across different processors, but change for PHP?
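To show what I mean by the integer limit on the JavaScript side, here is a quick sketch I can run in Node (Number.MAX_SAFE_INTEGER is the built-in constant for the largest integer a Number holds exactly); on the PHP side the corresponding constant is PHP_INT_MAX, which depends on the build:

// Largest integer a JavaScript Number can represent exactly
console.log(Number.MAX_SAFE_INTEGER); // 9007199254740991, i.e. 2^53 - 1
// Beyond that, distinct integers collapse onto the same value
console.log(9007199254740992 === 9007199254740993); // true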