Logistic distribution sample generation

I am working on some statistical code and exploring various ways to create samples from random distributions, starting from a random number generator that produces uniformly distributed floating point values from 0 to 1.

I know that it is possible to create approximate samples from a normal distribution by summing a sufficiently large number of independent, identically distributed uniform random variables (per the central limit theorem).
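For concreteness, here is a minimal sketch of that sum-of-uniforms idea, assuming the usual "sum twelve uniforms and subtract 6" construction (the helper name is mine):

import random

def approx_normal_sample(n_terms=12):
    # Sum n_terms i.i.d. Uniform(0, 1) variates; each has mean 0.5 and
    # variance 1/12, so with n_terms = 12 the centered sum has mean 0
    # and variance 1, i.e. it is approximately standard normal.
    return sum(random.random() for _ in range(n_terms)) - n_terms * 0.5

samples = [approx_normal_sample() for _ in range(1000)]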

Is it possible to do something similar to create samples from a logistic distribution? I suppose the samples being added together would need to be weighted or combined somehow, to keep the result from converging to a normal distribution.

PS: I also know that there may be more efficient ways to generate random samples; I am asking because I'm more interested in understanding how such a generator would work than in efficiency.

+3
3 answers

The inverse CDF of the logistic distribution is not difficult to find, so you can use inverse transform sampling. The basic algorithm:

for each desired logistic variate x:
  generate a random variate y ~ Uniform(0, 1)
  x := F^-1(y)

where F^-1 is the inverse CDF of the logistic (or whatever desired) distribution. For the logistic distribution with location loc and scale scale, F^-1(y) = loc + scale * ln(y / (1 - y)). Most programming languages will let you generate a uniform variate between 0 and 1 through some rand function.

Here is an example in Python that generates 1000 samples and plots a histogram:

from random import random
import math
import pylab

loc, scale = 0, 1

randvars = []
for i in range(1000):
    x = random()                              # uniform variate on [0, 1)
    y = loc + scale * math.log(x / (1 - x))   # inverse logistic CDF
    randvars.append(y)

pylab.hist(randvars)
pylab.show()
+10

More generally, this works for any distribution whose cdf you can invert.

The cdf maps values of the random variable into [0, 1], and the inverse cdf maps [0, 1] back to values of the random variable. For continuous, strictly increasing cdfs this mapping is one-to-one, so the inverse cdf is well defined.

So: draw a uniform random value on [0, 1] and apply the inverse cdf to it. The result is a sample from the distribution with that cdf.

All you need is a way to compute (or closely approximate) the inverse cdf.
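A minimal sketch of that idea in Python, with a hypothetical inverse_transform_sample helper that accepts any inverse cdf as a callable:

import math
import random

def inverse_transform_sample(inverse_cdf, n):
    # Feed Uniform(0, 1) draws through the inverse cdf; the results are
    # samples from the distribution that has that cdf.
    return [inverse_cdf(random.random()) for _ in range(n)]

# Example: standard logistic, whose inverse cdf is log(y / (1 - y)).
logistic_samples = inverse_transform_sample(lambda y: math.log(y / (1 - y)), 1000)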

+6

The logistic distribution is the difference of two Gumbel distributions, whose variates are the negative logarithm of exponential variates; equivalently, a logistic variate is log(u / (1.0 - u)), where u is a uniform variate.
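A minimal sketch of that construction, assuming standard Gumbel variates generated as -log(-log(u)) with u uniform on (0, 1) (the function name is mine):

import math
import random

def logistic_from_gumbels(loc=0.0, scale=1.0):
    # -log(u) is an exponential variate, so -log(-log(u)) is a Gumbel
    # variate; the difference of two independent Gumbel variates is
    # logistic. (Ignores the measure-zero case where random() returns 0.)
    g1 = -math.log(-math.log(random.random()))
    g2 = -math.log(-math.log(random.random()))
    return loc + scale * (g1 - g2)

samples = [logistic_from_gumbels() for _ in range(1000)]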

+1

Source: https://habr.com/ru/post/1770024/

