Random time delay

I am trying to send signals between processes, and I need a random time delay between 0.01 and 0.1 seconds in my signal-generation cycle. This is what I am doing, but the delay is not between 0.01 and 0.1 seconds; it comes out at around 1 second. Not sure what I'm missing here.

 sleepTime = 100000L + (long)((1e6 - 1e5) * rand() / (RAND_MAX + 1.0));
 usleep(sleepTime);
4 answers

If you have C++11:

 #include <thread>
 #include <random>
 #include <chrono>

 int main() {
     std::mt19937_64 eng{std::random_device{}()};  // or seed however you want
     std::uniform_int_distribution<> dist{10, 100};
     std::this_thread::sleep_for(std::chrono::milliseconds{dist(eng)});
 }

Perhaps this is not what your professor is looking for. :-)


All your constants are 10 times too big! Try

 sleepTime = 10000L + (long)((1e5 - 1e4) * rand() / (RAND_MAX + 1.0));

Three things:

1) 100000L is 100 ms, not 10 ms

2) usleep is only guaranteed to sleep for *at least* the time given by its argument; there is no guarantee of how much longer it will actually sleep. Read the man page: all of the sleep functions make this statement.

3) How do you measure how long it takes? Do you use a millisecond resolution timer?


The simplest thing is to generate a random number between 0 and 1, and then use a lerp function to map that number into the range 0.01 to 0.1.

 // returns random between 0 and 1
 float randFloat() { return rand() / (RAND_MAX + 1.0f); }

 // you are "val" % of the way from min to max
 float lerp( float val, float min, float max ) { return min + val * (max - min); }

 // e.g. sleep between 0.01 s and 0.1 s:
 // usleep( (useconds_t)( lerp( randFloat(), 0.01f, 0.1f ) * 1e6f ) );

Source: https://habr.com/ru/post/898176/
