Algorithmic timing

We want to create a scoring algorithm that awards the most points for the fastest times and fewer points for slower times. One caveat is that there is no fixed range: the time can vary from 100 milliseconds to 10 minutes or more, while the points range from 0 to 50.

Thanks for any help.

+4

4 answers

You can simply use an inverse (reciprocal) mapping with the following equation:

points = 50 * 100/time_in_ms

This will give you:

  • time_in_ms = 100 ms ⇒ 50 points
  • ...
  • time_in_ms = 10 min ⇒ ~0.0083 points
  • ...
  • time_in_ms = +∞ ⇒ 0 points

You can easily adjust the above equation if time and point ranges change.
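A minimal Python sketch of this mapping; the clamp at the maximum for times under 100 ms is an addition, not part of the original formula:

```python
def score(time_in_ms: float, max_points: float = 50.0, base_ms: float = 100.0) -> float:
    """Inverse mapping: full points at base_ms, decaying toward 0 as time grows."""
    raw = max_points * base_ms / time_in_ms
    # Clamp so times faster than base_ms cannot exceed the maximum.
    return min(raw, max_points)

print(score(100))      # 50.0
print(score(600_000))  # 10 minutes -> ~0.0083
```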

+3

A classic trick is the function f(x) = Ax/(A + x): it is 0 at x = 0 and approaches A as x grows, so it never exceeds A. Since you want a score that starts high and decreases with time, use the complement: score = A - Ax/(A + x).

For example, with A = 10,

f(0) = 10 - 10*0/(10 + 0) = 10, 
f(1) = 10 - 10/11 = 9 1/11, 
f(2) = 10 - 20/12 = 8 1/3, 
f(100) = 10 - 1000/110 = 10/11 

and so on.
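This saturating curve can be sketched in Python, using exact fractions to reproduce the example values with A = 10 (the function name and the use of `fractions` are illustrative choices):

```python
from fractions import Fraction

def f(x, A=10):
    """Score that starts at A when x = 0 and decays toward 0 as x grows."""
    x, A = Fraction(x), Fraction(A)
    return A - A * x / (A + x)

print(f(0))    # 10
print(f(1))    # 100/11, i.e. 9 1/11
print(f(2))    # 25/3, i.e. 8 1/3
print(f(100))  # 10/11
```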

+2

It depends on how quickly you want the score to fall off:

  • If you want the score to drop sharply as time grows, divide the maximum by the time:

    points = [max_number_of_points]/[time]
    

    Be careful with the units of time here: it must never be zero, and if time can be less than one unit the result will exceed the maximum, so floor or clamp it.

  • If you want a gentler, more predictable falloff, map the time onto the point range linearly instead: full points at the fastest expected time, scaling down to zero at some cap, such as 100 seconds. Beyond the cap, just award zero.

0

First, decide what behavior you actually want: should the score fall linearly with time, or along a curve that rewards fast times more heavily?

In any case you need a max time, beyond which the score is simply zero; otherwise the open-ended time range gives the mapping no defined endpoint.

The simplest choice is linear interpolation between the endpoints.

For example:

On a linear scale, a result halfway through the allowed time earns half the points, e.g. 25 out of 50.

So if a run takes 100 ms and the maximum allowed is 500 ms, the score is 50 * (500 - 100) / 500 = 40.
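The linear formula 50 * (500 - 100) / 500 can be sketched in Python as follows (the parameter names and the clamp past the cap are assumptions):

```python
def score(time_ms: float, max_time_ms: float = 500.0, max_points: float = 50.0) -> float:
    """Linear decay: max_points at t = 0, zero at max_time_ms and beyond."""
    remaining = max(max_time_ms - time_ms, 0.0)  # clamp times past the cap to zero
    return max_points * remaining / max_time_ms

print(score(100))  # 50 * (500 - 100) / 500 = 40.0
print(score(500))  # 0.0
```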

0

Source: https://habr.com/ru/post/1526868/

