Why does Parallel Python work the way it does?

In Parallel Python, why is it necessary to list, in the job-submission call, all the modules, dependent functions, variables and namespaces that the submitted function will need? How much of this is about preserving module-level globals (if that is in fact what is happening)?

The submit function:

submit(self, func, args=(), depfuncs=(), modules=(), callback=None, callbackargs=(), group='default', globals=None)
    Submits function to the execution queue

    func - function to be executed
    args - tuple with arguments of the 'func'
    depfuncs - tuple with functions which might be called from 'func'
    modules - tuple with module names to import
    callback - callback function which will be called with argument 
        list equal to callbackargs+(result,) 
        as soon as calculation is done
    callbackargs - additional arguments for callback function
    group - job group, is used when wait(group) is called to wait for
        jobs in a given group to finish
    globals - dictionary from which all modules, functions and classes
        will be imported, for instance: globals=globals()
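For concreteness, a minimal usage sketch of how these parameters fit together, assuming a local pp.Server(); the names area and circle_constant are illustrative, not taken from the question:

    import math
    import pp

    def circle_constant():
        return math.pi

    def area(r):
        # Executed in a worker with a fresh namespace, so the helper it
        # calls and the module it uses must be declared in submit().
        return circle_constant() * r ** 2

    job_server = pp.Server()                    # spawn local worker processes
    job = job_server.submit(area, (2.0,),
                            depfuncs=(circle_constant,),   # helpers 'area' calls
                            modules=("math",))             # imports the workers need
    print(job())                                # block until done; prints ~12.566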
1 answer

The reason is that pp cannot simply share your namespace with the workers: each job runs in a separate Python interpreter, possibly on a different machine across the network. What pp actually ships is the source code of the function (and of everything you list in depfuncs), which is recompiled and executed in a clean namespace on the worker side. Nothing from the submitting module carries over by itself, not even from __future__ imports, so pp has to be told explicitly which modules to import and which functions, classes and globals to recreate before the job can run.

In other words, pp does not preserve module-level globals for you; it only rebuilds what you declare in the submit() call, and that explicit declaration is precisely what lets pp distribute work beyond a single process.
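As the docstring's globals=globals() hint suggests, that bookkeeping can be delegated to pp by handing it the submitting module's namespace. A sketch with the same illustrative names as above:

    import math
    import pp

    def circle_constant():
        return math.pi

    def area(r):
        return circle_constant() * r ** 2

    job_server = pp.Server()
    # Instead of listing depfuncs/modules by hand, pass the whole namespace;
    # pp imports the modules and recreates the functions and classes it
    # finds there (per the submit() docstring above).
    job = job_server.submit(area, (2.0,), globals=globals())
    print(job())                                # ~12.566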


Source: https://habr.com/ru/post/1772577/

