Python multiple threads / multiple processes for reading serial ports

I am trying to write a Python class that reads two serial ports (/dev/ttyS1 and /dev/ttyS2) in parallel. Both ports run at 19200 baud and are constantly active. I am using pySerial for this.

Both reads must be continuous and simultaneous, and I am wondering whether I should use the threading library or the multiprocessing library. My only concern is the Global Interpreter Lock (GIL), which might prevent true concurrency even for I/O-heavy work. If the GIL does not affect me, I would prefer the threading module. If it does, I will need to cross-compile the multiprocessing libraries, because this is an embedded system.

So my design is roughly: thread1 (or process1) reads ttyS1, performs some string operations on each line, and writes the result to a buffer; thread2 (or process2) does the same for ttyS2 into a second buffer. Other pieces of code then consume these buffers.
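As a sketch of that design (not the actual code), each port can get its own reader thread that pushes processed lines into a thread-safe `queue.Queue`, and consumers pull from the queues. In-memory `io.BytesIO` objects stand in for the real `serial.Serial('/dev/ttyS1', 19200)` ports so the sketch runs anywhere; `.upper()` is a placeholder for the real string operations:

```python
import io
import queue
import threading

def reader(port, buf):
    """Read lines from `port`, process them, and push results into `buf`."""
    while True:
        raw = port.readline()          # blocks until a full line is available
        if not raw:                    # EOF -- only happens with the stub below
            break
        line = raw.decode('ascii', errors='replace').strip()
        buf.put(line.upper())          # stand-in for the real string operations

# Real code would use: port1 = serial.Serial('/dev/ttyS1', 19200)
port1 = io.BytesIO(b'alpha\nbravo\n')
port2 = io.BytesIO(b'charlie\n')

buf1, buf2 = queue.Queue(), queue.Queue()
t1 = threading.Thread(target=reader, args=(port1, buf1), daemon=True)
t2 = threading.Thread(target=reader, args=(port2, buf2), daemon=True)
t1.start(); t2.start()
t1.join(); t2.join()
```

`queue.Queue` handles the locking for you, so the consuming code can simply call `buf1.get()` without any extra synchronization.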

Also, does multiprocessing in Python require multiple cores/CPUs?

Thank you for reading!

2 answers

The GIL is released during blocking read operations, so it should not have a big impact on you. Cross-compiling multiprocessing sounds like overkill, or at least premature optimization. Keep your code modular so you can switch later.
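One way to keep it modular, as this answer suggests: `threading.Thread` and `multiprocessing.Process` deliberately share the same constructor/`start()`/`join()` API, so the worker function and the orchestration code can stay unchanged while you swap the backend. A minimal illustration (`squares` is a placeholder for the real per-port work):

```python
import multiprocessing
import queue
import threading

def squares(nums, out):
    # Placeholder worker: in the real program this would read a serial port.
    for n in nums:
        out.put(n * n)

def run(worker_cls, queue_cls):
    """Run `squares` under either backend; only the two classes differ."""
    out = queue_cls()
    w = worker_cls(target=squares, args=([1, 2, 3], out))
    w.start()
    results = [out.get() for _ in range(3)]  # drain before join (mp pitfall)
    w.join()
    return results

if __name__ == '__main__':
    print(run(threading.Thread, queue.Queue))                    # threads
    print(run(multiprocessing.Process, multiprocessing.Queue))   # processes
```

Note the queue is drained before `join()`: with multiprocessing, joining a process that still has undelivered items on a `multiprocessing.Queue` can deadlock.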

I am sure that threading performance will depend on your OS. Your mileage will vary, especially on an embedded system.

If you have a spare hour, there is a talk on the GIL by David Beazley (PDF slides here). For high-performance threading you want to see it: it has the gory details on how threads, the GIL, and the OS can interact to kill performance.


I am not an expert on this, but I still think the extra subtleties of threading are not worth the effort when I can parallelize with processes instead.

A third module that you did not mention among the alternatives is subprocess.

EDIT (at the OP's request): You can do the parallel processing by creating a separate script for each serial interface. Here is a quick demo, assuming both files are in the same directory.

The com.py file - the serial script - is just a mock-up, but the idea is that the script runs autonomously and uses only stdin and stdout to communicate with the master program.

    import sys

    counter = 0
    while True:  # The program never ends; it will be killed when the master exits.
        counter += 1
        sys.stdin.readline()
        sys.stdout.write('Serial from com1 is %d\n' % counter)
        sys.stdout.flush()

The master.py file - the main program:

    from subprocess import Popen, PIPE
    from time import sleep

    # universal_newlines=True makes the pipes text-mode (str in, str out).
    p = Popen(['python', './com.py'], stdin=PIPE, stdout=PIPE, stderr=PIPE,
              universal_newlines=True)
    print("serial communication started.")  # com.py is running, but we moved on!
    for i in range(3):
        p.stdin.write('<command-here>\n')
        p.stdin.flush()  # make sure the command actually reaches the child
        print("command sent.")
        print("received : %s" % p.stdout.readline().strip())
        sleep(1)

Finally, this is a dump of the expected result:

    mac@jabbar:~/Desktop$ ./master.py
    serial communication started.
    command sent.
    received : Serial from com1 is 1
    command sent.
    received : Serial from com1 is 2
    command sent.
    received : Serial from com1 is 3
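For the OP's two-port case, the same pattern scales to one worker process per port. In the sketch below an inline echo child (spawned with `python -c`) stands in for com.py, so the demo runs without any serial hardware; a real setup would launch the two serial scripts instead. Note that reading both replies sequentially blocks on the slower child, so a real program would drain each worker's stdout from its own thread:

```python
import sys
from subprocess import Popen, PIPE

# Inline stand-in for com.py: echoes back every command it receives.
CHILD = ("import sys\n"
         "for line in sys.stdin:\n"
         "    sys.stdout.write('echo ' + line)\n"
         "    sys.stdout.flush()\n")

def start_worker():
    # Real code would be e.g. Popen(['python', './com.py'], ...) per port.
    return Popen([sys.executable, '-c', CHILD],
                 stdin=PIPE, stdout=PIPE, universal_newlines=True)

workers = [start_worker(), start_worker()]
for i, p in enumerate(workers):
    p.stdin.write('ping %d\n' % i)
    p.stdin.flush()
replies = [p.stdout.readline().strip() for p in workers]
for p in workers:       # closing stdin ends the child's loop; then reap it
    p.stdin.close()
    p.wait()
```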

HTH!


Source: https://habr.com/ru/post/902318/

