At what point does importing become the right decision?

This weekend I was working on a project and needed the binomial distribution to check the probability of an event (the probability that x of y characters will be alphanumeric, given random bytes). My first decision was to write the test myself, since it is quite simple.

def factorial(n):
    # Base case: 0! == 1
    if n == 0:
        return 1
    else:
        return n * factorial(n - 1)

def binomial_prob(n, k, p):
    # Binomial coefficient C(n, k); integer division keeps it exact
    bin_coeff = factorial(n) // (factorial(k) * factorial(n - k))
    return bin_coeff * pow(p, k) * pow(1 - p, n - k)
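
For illustration, a call for the scenario above might look like this; the p = 62/256 value is an assumption (62 alphanumeric characters out of 256 possible byte values), not a figure from my actual project:

# Probability that exactly 10 of 16 random bytes map to alphanumeric characters,
# assuming each byte is alphanumeric with probability p = 62/256.
p_alnum = 62 / 256
print(binomial_prob(16, 10, p_alnum))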

And I used it. However, SciPy includes a binom_test method that covers this. But for distribution this is likely to increase the size significantly (both SciPy and NumPy are required), and all for a relatively simple test. A supporting question is how smart py2exe is: does it include only the modules I actually use from SciPy and NumPy, or the whole libraries? I would expect only the modules I import, but then the next question is how many modules scipy.stats itself depends on. But I got distracted ...

So my question is: when should I use code that is already written, at the cost of including much more than I need, and when should I just write my own implementation?
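For comparison, a rough sketch of the SciPy route; scipy.stats.binom.pmf gives the same per-outcome probability as the hand-written function above (binom_test itself answers a slightly different question, returning a p-value for a hypothesis test):

from scipy.stats import binom

# Same quantity as binomial_prob(16, 10, p) above, at the cost of importing SciPy (and NumPy).
p_alnum = 62 / 256
print(binom.pmf(10, 16, p_alnum))  # note the (k, n, p) argument order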

(I tagged this as python, but I suspect this is a more general question.)

Source: https://habr.com/ru/post/1734879/

