Sharing a semaphore with a function using python multiprocessing

I would like to synchronize access to a shared resource among several workers in a multiprocessing.Pool by passing a semaphore to the worker function. Here is some pseudocode.

def do_work(payload, semaphore):
    with semaphore:
        access_the_shared_resource(payload)

The function do_work is defined in a library, so I cannot define the semaphore in a scope that the function could inherit it from. I also cannot pass the semaphore using functools.partial, because multiprocessing tries to pickle the semaphore, which is not allowed. What does seem to work is using a multiprocessing.Manager to create a proxy for the Semaphore:

manager = multiprocessing.Manager()
semaphore = manager.Semaphore()

with multiprocessing.Pool() as pool:
    results = pool.map(functools.partial(do_work, semaphore=semaphore), payloads)

Is this the best approach, or am I missing an obvious solution?


You can't pass an ordinary multiprocessing.Semaphore through pool.map, but you can hand it to each worker process once at startup via the initializer and initargs arguments of multiprocessing.Pool:

import multiprocessing

semaphore = None

def do_work(payload):
    # The module-level global is set in each worker process by init().
    with semaphore:
        return payload

def init(sem):
    global semaphore
    semaphore = sem

if __name__ == "__main__":
    payloads = range(10)  # whatever work items you have
    sem = multiprocessing.Semaphore()
    with multiprocessing.Pool(initializer=init, initargs=(sem,)) as p:
        results = p.map(do_work, payloads)

Passed this way, the semaphore is inherited by the worker processes when they are created, and it is a real multiprocessing.Semaphore() rather than a proxy.

That makes it cheaper than manager.Semaphore(), where every acquire/release is a round-trip over IPC to the separate Python manager process.


Source: https://habr.com/ru/post/1658352/

