Only one process prints on Unix, multiprocessing Python

I have a script that loads a file, which takes some time because there is a fair amount of data to read, and I want to show some loading indication so the user can tell the process is still running. It seemed like a good opportunity to learn the multiprocessing module, so I wrote this example to test it:

    import time, multiprocessing

    def progress():
        # Print an animated "Loading..." indicator until terminated
        delay = 0.5
        while True:
            print "Loading.",
            time.sleep(delay)
            print "\b.",
            time.sleep(delay)
            print "\b.",
            time.sleep(delay)
            print "\r          \r",
        return

    def loader(filename, con):
        # Dummy loader: simulate a slow read, then send the result back
        time.sleep(5)
        con.send(filename)
        con.close()
        return

    if __name__ == "__main__":
        parent_con, child_con = multiprocessing.Pipe()
        filename = "main.key"
        p1 = multiprocessing.Process(target=progress)
        p2 = multiprocessing.Process(target=loader, args=(filename, child_con))
        p1.start()
        p2.start()
        data = parent_con.recv()
        p1.terminate()
        print "\n", data

It works as I expect when I run it in the Windows cmd: it prints "Loading." and then appends dots one at a time until the loader finishes. But on Unix, where I actually need it, I get no output at all from the progress function (process p1).

1 answer

As Mark and Duckaw pointed out, the problem is buffering. Here are some possible solutions:

  • Using python -u to run the script

    python -u disables buffering of stdout and stderr. This is the easiest solution, if it is acceptable to you; an invocation example follows this list.

  • Using sys.stdout.flush()

    sys.stdout.flush() forces anything sitting in the stdout buffer to be written out immediately. Applied to the progress() function from the question (which already imports time; note the added import sys), it looks like this:

    import sys

    def progress():
        delay = 0.5
        while True:
            print("Loading."),
            sys.stdout.flush()
            time.sleep(delay)
            print("\b."),
            sys.stdout.flush()
            time.sleep(delay)
            print("\b."),
            sys.stdout.flush()
            time.sleep(delay)
            print("\r          \r"),
            sys.stdout.flush()
        return
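For the first option, the fix is just a flag on the interpreter. Assuming the script above is saved as loading_test.py (a file name made up for this example), it would be run as:

    python -u loading_test.py

Setting the PYTHONUNBUFFERED environment variable to a non-empty value has the same effect as -u.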
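A side note on the second option: under Python 3, print() is a function that accepts a flush=True argument, so the explicit sys.stdout.flush() calls can be folded into the print calls themselves. A minimal sketch of the same spinner loop, assuming a straight port to Python 3:

    # Python 3 sketch: end="" suppresses the newline, flush=True bypasses buffering
    import time

    def progress():
        delay = 0.5
        while True:
            print("Loading.", end="", flush=True)
            time.sleep(delay)
            print(".", end="", flush=True)
            time.sleep(delay)
            print(".", end="", flush=True)
            time.sleep(delay)
            print("\r          \r", end="", flush=True)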

Source: https://habr.com/ru/post/987251/
