Imagine a two-dimensional grid, for example 1000x1000 cells in size, that is used as a level map in a game. This map is populated with game objects dynamically at runtime. Now I need to calculate the probability of placing a new object at a given position x/y in this grid.
I already have an int array that holds the number of game objects in the vicinity of the cell at x/y. The index into this array represents the distance from the given cell, and each value is the number of game objects in the grid at that distance. For example, the array might look like this:
0, 0, 1, 2, 0, 3, 1, 0, 4, 0, 1
This would mean that there are 0 objects in the grid cell at x/y itself, 0 objects in the directly neighboring cells, 1 object in a cell at a distance of two cells, 2 objects in cells at a distance of three cells, and so on. The following figure shows this example:

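For reference, such a distance histogram can be built from the grid's occupied cells. Here is a minimal sketch in Python; the choice of Chebyshev distance (square "rings" of cells around x/y) is my own assumption, since the question does not say which distance metric is used:

```python
def counts_by_distance(occupied, x, y, max_dist):
    """Count objects in each distance ring around (x, y).

    occupied: set of (x, y) tuples of cells that contain an object.
    Returns a list where index d holds the number of objects at distance d.
    Uses Chebyshev distance, so each "ring" is a square border of cells.
    """
    counts = [0] * (max_dist + 1)
    for (ox, oy) in occupied:
        d = max(abs(ox - x), abs(oy - y))  # Chebyshev ring index (assumption)
        if d <= max_dist:
            counts[d] += 1
    return counts
```

With Manhattan or Euclidean distance, only the expression for `d` would change; the rest of the idea stays the same.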
Now the task is to calculate, based on the values in this array, how likely it is that the new object can be placed at x/y. The algorithm should work something like this:

- if at least one object is already closer than min, the probability should be 0.0
- else, if no object is within distance max, the probability should be 1.0
- otherwise, the probability depends on how many objects are near x/y and how close they are.
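What I have in mind so far is something like the following Python sketch. The linear falloff between min and max and the `1 / (1 + penalty)` squashing are just my own guesses at the "otherwise" case, not a fixed requirement (and it assumes min_dist < max_dist):

```python
def placement_probability(counts, min_dist, max_dist):
    """Probability of placing a new object, given counts[d] = objects at distance d."""
    # Rule 1: any object closer than min_dist forbids placement.
    if any(counts[d] > 0 for d in range(min(min_dist, len(counts)))):
        return 0.0
    # Rule 2: no object within max_dist -> placement is certain.
    if all(c == 0 for c in counts[:max_dist + 1]):
        return 1.0
    # Rule 3 (assumption): weight each object by how close it is --
    # an object at min_dist counts fully, one at max_dist barely at all.
    penalty = 0.0
    for d in range(min_dist, min(max_dist + 1, len(counts))):
        weight = (max_dist - d) / (max_dist - min_dist)  # 1.0 at min_dist, 0.0 at max_dist
        penalty += counts[d] * weight
    return 1.0 / (1.0 + penalty)  # squash the penalty into (0.0, 1.0]
```

With the example array above and min = 2, max = 8, this returns roughly 0.18.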
In other words, the more objects there are near x/y, and the closer they are, the lower the probability should be.

How can such a probability be calculated from this array?