Multiprocessing and Sockets in Python

I am trying to make multiprocessing and socket programming work together, but I'm stuck at this point. The problem is that I get this error:

    File "multiprocesssockserv.py", line 11, in worker
        clientsocket = socket.fromfd(clientfileno, socket.AF_INET, socket.SOCK_STREAM)
    error: [Errno 9] Bad file descriptor

The full code is as follows:

    import multiprocessing as mp
    import logging
    import socket

    logger = mp.log_to_stderr(logging.WARN)

    def worker(queue):
        while True:
            clientfileno = queue.get()
            print clientfileno
            clientsocket = socket.fromfd(clientfileno, socket.AF_INET, socket.SOCK_STREAM)
            clientsocket.recv(1024)
            clientsocket.send("Hello World")
            clientsocket.close()

    if __name__ == '__main__':
        num_workers = 5
        socket_queue = mp.Queue()
        workers = [mp.Process(target=worker, args=(socket_queue,)) for i in range(num_workers)]
        for p in workers:
            p.daemon = True
            p.start()

        serversocket = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
        serversocket.bind(('', 9090))
        serversocket.listen(5)
        while True:
            client, address = serversocket.accept()
            socket_queue.put(client.fileno())

Edit: I use socket.fromfd because I cannot put sockets on a queue :) I need a way to access the same sockets from different processes. This is the core of my problem.

3 answers

After working on this for a while, I decided to approach this problem from a different angle, and the following method works for me.

    import multiprocessing as mp
    import logging
    import socket
    import time

    logger = mp.log_to_stderr(logging.DEBUG)

    def worker(serversocket):
        while True:
            client, address = serversocket.accept()
            logger.debug("{u} connected".format(u=address))
            client.send("OK")
            client.close()

    if __name__ == '__main__':
        num_workers = 5

        serversocket = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
        serversocket.bind(('', 9090))
        serversocket.listen(5)

        workers = [mp.Process(target=worker, args=(serversocket,)) for i in range(num_workers)]
        for p in workers:
            p.daemon = True
            p.start()

        while True:
            try:
                time.sleep(10)
            except:
                break
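For reference, the same pre-fork pattern can be sketched on Python 3. This is a minimal sketch, not the answer's original code: it assumes a Unix "fork" start method so the workers inherit the listening socket, binds to port 0 so the OS picks a free port, and the `start_server` helper name is my own invention.

```python
import multiprocessing as mp
import socket

def worker(server_sock):
    # Every worker blocks in accept() on the same listening socket;
    # the kernel hands each incoming connection to exactly one of them.
    while True:
        client, address = server_sock.accept()
        client.sendall(b"OK")
        client.close()

def start_server(num_workers=2):
    server = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
    server.setsockopt(socket.SOL_SOCKET, socket.SO_REUSEADDR, 1)
    server.bind(("127.0.0.1", 0))   # port 0: let the OS choose a free port
    server.listen(5)
    ctx = mp.get_context("fork")    # fork so children inherit the socket (Unix only)
    workers = []
    for _ in range(num_workers):
        p = ctx.Process(target=worker, args=(server,), daemon=True)
        p.start()
        workers.append(p)
    return server, workers

if __name__ == "__main__":
    server, workers = start_server()
    port = server.getsockname()[1]
    with socket.create_connection(("127.0.0.1", port)) as conn:
        print(conn.recv(16))
    for p in workers:
        p.terminate()
    server.close()
```

Since all workers share one listening descriptor, no hand-off of accepted connections is needed at all, which is why this sidesteps the `fromfd` problem entirely.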

I am not an expert, so I can't give a real explanation, but if you want to use queues, you need to reduce the socket's handle to something picklable and then rebuild it:

In the main process:

    client, address = serversocket.accept()
    client_handle = multiprocessing.reduction.reduce_handle(client.fileno())
    socket_queue.put(client_handle)

And in your worker:

    client_handle = queue.get()
    file_descriptor = multiprocessing.reduction.rebuild_handle(client_handle)
    clientsocket = socket.fromfd(file_descriptor, socket.AF_INET, socket.SOCK_STREAM)

And add the import:

    import multiprocessing.reduction

This will work with your original code. However, I am having problems closing sockets in the worker processes after they have been created, as I described.
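A note for later readers: `reduce_handle` and `rebuild_handle` existed only in Python 2's `multiprocessing.reduction`. On Python 3 (Unix), the closest equivalents are `send_handle` and `recv_handle`, which pass the descriptor over a `Pipe` rather than a `Queue`. Below is a minimal sketch of the same idea under those assumptions; the `demo` and `worker` names are hypothetical, and the port is chosen by the OS.

```python
import multiprocessing as mp
import os
import socket
from multiprocessing import reduction

def worker(conn):
    # Receive a duplicated file descriptor from the parent process.
    fd = reduction.recv_handle(conn)
    client = socket.fromfd(fd, socket.AF_INET, socket.SOCK_STREAM)
    os.close(fd)  # fromfd() dup()s the descriptor, so drop the extra one
    client.sendall(b"Hello World")
    client.close()

def demo():
    ctx = mp.get_context("fork")
    parent_conn, child_conn = ctx.Pipe()  # duplex Pipe is an AF_UNIX socketpair
    p = ctx.Process(target=worker, args=(child_conn,), daemon=True)
    p.start()

    server = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
    server.setsockopt(socket.SOL_SOCKET, socket.SO_REUSEADDR, 1)
    server.bind(("127.0.0.1", 0))
    server.listen(1)
    port = server.getsockname()[1]

    peer = socket.create_connection(("127.0.0.1", port))
    client, _ = server.accept()
    # Hand the accepted connection's descriptor to the worker, then close
    # the parent's copy so the worker's close() really ends the connection.
    reduction.send_handle(parent_conn, client.fileno(), p.pid)
    client.close()

    data = b""
    while True:
        chunk = peer.recv(64)
        if not chunk:
            break
        data += chunk
    peer.close()
    server.close()
    p.join()
    return data

if __name__ == "__main__":
    print(demo())
```

Closing the parent's copy of the accepted socket right after the hand-off is also what avoids the lingering-socket problem mentioned above: only the worker then holds the connection open.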


Here is working code for what was mentioned above: https://gist.github.com/sunilmallya/4662837 — a multiprocessing pre-forking server where the parent process passes the accepted client connections to the workers.

