Triggering an event with a certain probability using C#

I am trying to simulate a realistic keypress event. For this I use the SendInput() method, but for a more realistic result I need to specify the delay between the keydown and keyup events. The numbers below show the elapsed time in milliseconds between real DOWN and UP events (these are actual measurements):

96 95 112 111 119 104 143 96 95 104 120 112 111 88 104 119 111 103 95 104 95 127 112 143 144 142 143 128 144 112 111 112 120 128 111 135 118 147 96 135 103 64 64 87 79 112 88 111 111 112 111 104 87 95

We can summarize these measurements:

delay 64 - 88 ms → 20% of the time

delay 89 - 135 ms → 60% of the time

delay 136 - 150 ms → 20% of the time

How can I trigger an event in accordance with the probabilities above? Here is the code I'm using right now:

private void button2_Click(object sender, EventArgs e)
{
    textBox2.Focus();
    Random r = new Random();
    int rez = r.Next(0, 5); // 0,1,2,3,4 - five numbers in total
    if (rez == 0) // 20% (1/5)
    {
        textBox2.AppendText(" " + rez + " " + r.Next(64, 88) + Environment.NewLine);
        // do stuff
    }
    else if (rez == 4) // 20% (1/5)
    {
        textBox2.AppendText(" " + rez + " " + r.Next(136, 150) + Environment.NewLine);
        // do stuff
    }
    else // rez is 1, 2 or 3 (3/5) -> 60%
    {
        textBox2.AppendText(" " + rez + " " + r.Next(89, 135) + Environment.NewLine);
        // do stuff
    }
}

There is a huge problem with this code: theoretically, after millions of iterations, the resulting graph will look something like this:

(comparison image)

How do I deal with this problem?

EDIT: The solution was to use a normal distribution, as people suggested.

A Java implementation of such code is documented here:

http://docs.oracle.com/javase/1.4.2/docs/api/java/util/Random.html#nextGaussian%28%29

and here is a C# implementation:

How to create a normally distributed random from an integer range?

Although I would suggest slightly lowering the "deviation" value.

Here is an interesting MSDN article:

http://blogs.msdn.com/b/ericlippert/archive/2012/02/21/generating-random-non-uniform-data-in-c.aspx

Thank you all for your help!

+6

3 answers

It looks like you need to generate a normal distribution. The built-in .NET Random class generates a uniform distribution.

Gaussian (normally distributed) random numbers can be generated from the built-in Random class using the Box-Muller transform.

You should get a nice probability curve like this:

(normal distribution image)

(taken from http://en.wikipedia.org/wiki/Normal_distribution)

To convert a normally distributed random number to an integer range, the Box-Muller transform can again help. See this previous question and answer for a description of the process and links to mathematical proofs.
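As a minimal sketch (not code from the answer itself), the Box-Muller transform could look like this in C#; the mean of 110 ms and deviation of 20 ms are my own rough estimates from the question's sample:

```csharp
using System;

class GaussianDelay
{
    static readonly Random Rng = new Random();

    // Box-Muller transform: turns two independent uniform samples
    // into one sample from the standard normal distribution.
    public static double NextStandardNormal()
    {
        double u1 = 1.0 - Rng.NextDouble(); // in (0, 1], avoids Math.Log(0)
        double u2 = Rng.NextDouble();
        return Math.Sqrt(-2.0 * Math.Log(u1)) * Math.Sin(2.0 * Math.PI * u2);
    }

    // Scale and shift to an integer millisecond delay.
    public static int NextDelayMs(double mean, double sigma)
    {
        return (int)Math.Round(mean + sigma * NextStandardNormal());
    }

    static void Main()
    {
        for (int i = 0; i < 10; i++)
            Console.WriteLine(NextDelayMs(110.0, 20.0)); // assumed mean/sigma
    }
}
```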

+1

This is the right idea; I just think you need to use doubles instead of ints so that you can split the probability space between 0 and 1. This gives you finer granularity, as shown below:

  • Normalize the real values by dividing all values by the largest value.
  • Divide the values into buckets - the more buckets, the closer the graph will be to the continuous case.
  • The larger the bucket, the greater the likelihood of the event. So divide the interval [0,1] according to the number of elements in each bucket: if you have 20 real values and a bucket contains 5 of them, that bucket takes a quarter of the interval.
  • On each trial, generate a random number between 0 and 1 using Random.NextDouble(); whichever bucket the random number hits, raise the event with that parameter. For the numbers you gave, here are the values for 5 buckets:

(image: bucket values table)

This is more pseudocode than sample code, but hopefully it gives the right idea.
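A runnable sketch of the bucket idea; since the answer's five-bucket table is in the image, this sketch uses the three buckets stated in the question (20% / 60% / 20%) instead:

```csharp
using System;

class BucketSampler
{
    // Cumulative probabilities for the three buckets in the question:
    // 20% for 64-88 ms, 60% for 89-135 ms, 20% for 136-150 ms.
    static readonly (int Low, int HighExclusive, double Cum)[] Buckets =
    {
        (64, 89, 0.20),
        (89, 136, 0.80),
        (136, 151, 1.00),
    };

    public static int NextDelayMs(Random rng)
    {
        double p = rng.NextDouble(); // uniform in [0, 1)
        foreach (var b in Buckets)
        {
            if (p < b.Cum)
                return rng.Next(b.Low, b.HighExclusive); // uniform inside bucket
        }
        return Buckets[Buckets.Length - 1].HighExclusive - 1; // unreachable
    }

    static void Main()
    {
        var rng = new Random();
        for (int i = 0; i < 10; i++)
            Console.WriteLine(NextDelayMs(rng));
    }
}
```

More buckets make the histogram smoother, at the cost of a longer cumulative table.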

+2

One possible approach would be to model the delays as an exponential distribution. The exponential distribution models the time between events that occur continuously and independently at a constant average rate, which sounds like a fair assumption given your problem.

You can estimate the lambda parameter from the mean of your observed delays, and simulate the distribution using this approach, i.e.

delay = -Math.Log(random.NextDouble()) / lambda
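A minimal runnable sketch of this, using a few of the observed values to estimate lambda (1 - NextDouble() is used here to guard against Math.Log(0), a detail not spelled out in the formula above):

```csharp
using System;
using System.Linq;

class ExponentialDelay
{
    // A few of the observed delays from the question.
    static readonly double[] Sample = { 96, 95, 112, 111, 119, 104, 143, 96 };

    // Lambda is the reciprocal of the sample mean.
    public static readonly double Lambda = 1.0 / Sample.Average();

    // Inverse-CDF sampling of the exponential distribution.
    public static double NextDelayMs(Random rng)
    {
        return -Math.Log(1.0 - rng.NextDouble()) / Lambda;
    }

    static void Main()
    {
        var rng = new Random();
        for (int i = 0; i < 10; i++)
            Console.WriteLine(NextDelayMs(rng));
    }
}
```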

However, looking at your sample, the data looks too "concentrated" around the mean to be purely exponential, so simulating this way will produce delays with the right mean, but too dispersed to fit your sample.

One way to address this is to model the process as a shifted exponential: essentially, the process is shifted by a value representing the minimum the delay can take, instead of 0 for the plain exponential. In code, taking the shift to be the minimum observed value in your sample, it might look like this:

var sample = new List<double>() { 96, 95, 112, 111, 119, 104, 143, 96, 95, 104, 120, 112 };
var min = sample.Min();
sample = sample.Select(it => it - min).ToList();
var lambda = 1d / sample.Average();
var random = new Random();
var result = new List<double>();
for (var i = 0; i < 100; i++)
{
    var simulated = min - Math.Log(random.NextDouble()) / lambda;
    result.Add(simulated);
    Console.WriteLine(simulated);
}

A trivial alternative, which essentially resembles Aidan's approach, is to resample: pick random elements from your original sample, and the result will have exactly the required distribution:

var sample = new List<double>() { 96, 95, 112, 111, 119, 104, 143, 96, 95, 104, 120, 112 };
var random = new Random();
var size = sample.Count;
for (var i = 0; i < 100; i++)
{
    Console.WriteLine(sample[random.Next(0, size)]);
}
+1

Source: https://habr.com/ru/post/914508/

