This was one of the questions I was asked during my campus placements.
There are n servers, each assigned some non-negative integer load. We have to find the minimum time required to balance them so that each server ends up with a load of exactly 1.
Each server can share load only with its immediate neighbors (left and right); judging from the example below, a server can move one unit of load to each neighbor per time step.
The total load on all servers is n.
Ex. n = 10
Initially: 0,0,10,0,0,0,0,0,0,0
Time 1: 0,1,8,1,0,0,0,0,0,0
Time 2: 1,1,6,1,1,0,0,0,0,0
.
.
Time 7: 1,1,1,1,1,1,1,1,1,1
The answer is 7.
The initial load can be distributed across the servers in any way; it need not be concentrated on a single server as in this example.
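To make the example concrete, here is a small simulation sketch. The rules it assumes (one unit per neighbor per step, all moves simultaneous) are my reading of the example, and the greedy strategy (route each unit across the edges its prefix imbalance says it must cross) is my own assumption, not a known-optimal algorithm. It does reproduce the timeline above:

```python
def min_balance_time(loads):
    """Simulate a greedy balancing strategy on a line of servers.

    Assumed rules (inferred from the example): in each time step a server
    may move one unit of load to each immediate neighbor, and all moves
    in a step happen simultaneously based on start-of-step loads.
    """
    n = len(loads)
    loads = list(loads)
    # Net load that must cross the edge between servers i and i+1
    # (positive = rightward): prefix_sum(i) - (i + 1) units, since the
    # first i+1 servers must end up holding exactly i+1 units.
    flow, prefix = [], 0
    for i in range(n - 1):
        prefix += loads[i]
        flow.append(prefix - (i + 1))
    t = 0
    while any(flow):          # some edge still owes a transfer
        t += 1
        sent = [0] * n        # units leaving each server this step
        moves = []
        for i, f in enumerate(flow):
            src = i if f > 0 else i + 1
            # A server may only send units it held at the start of the step.
            if f != 0 and loads[src] - sent[src] >= 1:
                sent[src] += 1
                moves.append((src, i + 1 if f > 0 else i))
                flow[i] -= 1 if f > 0 else -1
        for src, dst in moves:
            loads[src] -= 1
            loads[dst] += 1
    return t

print(min_balance_time([0, 0, 10, 0, 0, 0, 0, 0, 0, 0]))  # prints 7
```

Note that the per-edge flow magnitudes also give an immediate lower bound: each edge carries at most one unit per step, so the answer is at least the largest |flow| value (7 for this example, across the edge to the right of the loaded server).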
How do I solve it? I could not come up with any approach.
Thanks in advance