Why do these random numbers slowly gravitate toward a negative value?

The program should do the following:

  1. Initialize a grid of values, "ActualGridValues" (X rows by Y columns), with every cell equal to 0.
  2. Create another grid, "RandomGridValues" (X rows by Y columns), and set each cell to the average of N random numbers drawn from the range -Z to Z.
  3. Add each number from RandomGridValues to the corresponding cell of ActualGridValues.
  4. For each cell in ActualGridValues, also add the values of its neighboring cells.
  5. Repeat steps 2-4 indefinitely.
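The steps above can be sketched roughly like this. This is my own reconstruction, not the original program: the variable names (GridX, GridY, RandNums, RandMax), the example values, and the clamped handling of edge-cell neighbors are all assumptions.

```javascript
// Assumed parameters (not from the original post).
var GridX = 10, GridY = 10, RandNums = 5, RandMax = 1;

// Step 1: grid of zeros.
var ActualGridValues = [];
for (var i = 0; i < GridX; i++) {
  ActualGridValues.push(new Array(GridY).fill(0));
}

function stepOnce() {
  // Step 2: each cell is the average of RandNums draws from [-RandMax, RandMax).
  var RandomGridValues = [];
  for (var x = 0; x < GridX; x++) {
    RandomGridValues.push([]);
    for (var y = 0; y < GridY; y++) {
      var sum = 0;
      for (var n = 0; n < RandNums; n++) {
        sum += Math.random() * RandMax * 2 - RandMax;
      }
      RandomGridValues[x].push(sum / RandNums);
    }
  }
  // Step 3: add the noise grid to the actual grid.
  for (var x = 0; x < GridX; x++) {
    for (var y = 0; y < GridY; y++) {
      ActualGridValues[x][y] += RandomGridValues[x][y];
    }
  }
  // Step 4: add the values of neighboring cells. Reading from a snapshot
  // keeps the result independent of cell update order (an assumption on
  // my part; the original may update in place).
  var snapshot = ActualGridValues.map(function (row) { return row.slice(); });
  for (var x = 0; x < GridX; x++) {
    for (var y = 0; y < GridY; y++) {
      var neighborSum = 0;
      if (x > 0) neighborSum += snapshot[x - 1][y];
      if (x < GridX - 1) neighborSum += snapshot[x + 1][y];
      if (y > 0) neighborSum += snapshot[x][y - 1];
      if (y < GridY - 1) neighborSum += snapshot[x][y + 1];
      ActualGridValues[x][y] += neighborSum;
    }
  }
}
```

Step 5 is then just calling stepOnce() in a loop.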

As a result, random structures emerge in the grid. Groups of high positive values pull neighboring cells upward, while groups of large negative values pull them downward.

The total of all values across all cells MUST be zero. Even though there are localized groups skewed high and groups skewed low, with a sufficiently large sample size the expected value of every cell is zero.

The problem is that after, say, 1000 iterations, the values consistently drift toward a negative result. There are still structures and localized highs and lows, but the overall values always skew negative. (That is, over time the entire grid fills with nothing but negative numbers.)

Every time the simulation runs, the values drift negative. Any thoughts on why?

Edit: I have narrowed the problem down to the function below. The average of all numbers in RandomGridValue almost always turns out to be negative.

// For every cell, this function generates a series of random numbers
// between -RandMax and +RandMax, takes their average, and assigns that
// average to the corresponding cell of the RandomGridValue array.
function AddChaos() {
  for (var l = 0; l < GridX; l++) {
    for (var m = 0; m < GridY; m++) {
      var RandomGrid = 0;
      for (var n = 0; n < RandNums; n++) {
        RandomGrid = RandomGrid + Math.random() * RandMax * 2 - RandMax;
      }
      RandomGridValue[l][m] = RandomGrid / RandNums;
    }
  }
}
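One way to test whether this expression is itself biased is to average a large number of draws in isolation and check that the sample mean sits near zero. This is a sanity check I am adding, not part of the original program; RandMax = 1 is an assumed example value.

```javascript
// Sanity check: the per-draw expression from AddChaos should have mean ~0.
var RandMax = 1; // assumed example value
var draws = 1000000;
var total = 0;
for (var i = 0; i < draws; i++) {
  total += Math.random() * RandMax * 2 - RandMax;
}
var mean = total / draws;
// For a uniform draw on [-1, 1) the standard error of the mean over a
// million draws is about 0.0006, so |mean| should be tiny.
console.log(mean);
```

If this mean is consistently far below zero across runs, the bias is in the random draws; if it hovers around zero, the drift must come from a later step (for example, the neighbor-summing pass).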
1 answer

Thanks to the floating-point standard that JavaScript implements, performing arithmetic with decimal values is error-prone, to say the least...

One workaround is to convert the decimal numbers to integers by multiplying them by 100, doing the math, and then dividing the result by 100.

This only works if you need no more than two decimal places of precision. If you need higher accuracy, I would recommend a language other than JavaScript for this part.
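A minimal sketch of that scaling workaround, using the classic 0.1 + 0.2 case (my own example values, not from the answer): do the arithmetic in integer hundredths, then convert back at the end.

```javascript
// Work in integer hundredths to avoid binary-fraction rounding,
// then scale back down once at the end.
var scaled = Math.round(0.1 * 100) + Math.round(0.2 * 100); // 10 + 20 = 30
var result = scaled / 100;
console.log(result); // 0.3, whereas 0.1 + 0.2 gives 0.30000000000000004
```

The Math.round calls matter: 0.1 * 100 already evaluates to 10.000000000000002, so scaling without rounding would just move the error around.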


Source: https://habr.com/ru/post/1468742/

