Python performance when passing big data as function parameter

Suppose I have two Python functions, Function 1 and Function 2 .

Function 1 will call Function 2, and the argument passed will be big data (for example, a dictionary with 100 thousand elements).

I am wondering whether there is a performance difference between (a) calling Function 2 from Function 1, which means passing the large data as a parameter, and (b) inlining Function 2's code inside Function 1, which means no parameter needs to be passed.

Thanks.

PS: I think the key question is how does Python pass a parameter, by value or by reference (pointer)?

Edit: This seems to be a commonly confused issue. How do I pass a variable by reference? has a good answer.

+6
2 answers

Python passes object references by value. The terminology is confusing and contested, but in practice there should be no real performance difference between your two options.

Check out these answers for all the details you've ever wanted (hopefully).
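To make this concrete, here is a small sketch (the dict is a stand-in for the 100k-element data in the question; the function names are illustrative) showing that the callee receives a reference to the very same object, not a copy:

```python
# Demonstrates Python's "pass object reference by value" behavior.
# No copy of the dictionary is made when it is passed as an argument.

def function2(data):
    # 'data' is a new local name, but it is bound to the same dict
    # object the caller holds, so passing it is O(1).
    return id(data)

big_data = {i: i * 2 for i in range(100_000)}

# The id seen inside the callee matches the caller's object.
print(function2(big_data) == id(big_data))  # True
```

Because only the reference is copied, mutations made through `data` inside the function would also be visible to the caller.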

+8

The terminology of how Python passes arguments is a contentious debate that I don't want to get into. But what is actually pushed on the stack is a reference, so the extra cost of either of your options is negligible.
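A rough timing sketch (names and sizes here are illustrative, and exact numbers will vary by machine) supports this: the call overhead does not grow with the size of the dictionary, since only a reference is passed.

```python
# Compare the cost of calling a function with a tiny dict vs. a large one.
# Only a reference is passed, so both calls cost essentially the same.
import timeit

def function2(data):
    pass  # body deliberately empty: we measure only call overhead

small = {i: i for i in range(10)}
big = {i: i for i in range(100_000)}

t_small = timeit.timeit(lambda: function2(small), number=100_000)
t_big = timeit.timeit(lambda: function2(big), number=100_000)

# On CPython the two timings are essentially identical.
print(f"small dict: {t_small:.4f}s, big dict: {t_big:.4f}s")
```

If passing the dict copied its 100,000 elements, the second timing would be orders of magnitude larger; it is not.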

+1

Source: https://habr.com/ru/post/944666/
