Asynchronous file writing and reading

I have two processes.

  • One process redirects the output of some unix command to a server-side file. Data is always appended to the file. eg.

    find / > tmp.txt
    
  • Another process opens and reads the same file, collects each line, and sends the entire line to the client.

Both of these happen simultaneously. I am using Python.

Any suggestions on a possible way to implement this scenario? Please explain with a code example.

Thanks in advance.

Tazim.

1 answer

If you want to have the output of a Unix command in a file and display it at the same time, you can use `tee` to write it to the file while also reading it from stdout, for example:

>>> import subprocess
>>> command_line = '/bin/find / | tee tmp.txt'
>>> p = subprocess.Popen(command_line, shell=True, stdout=subprocess.PIPE)

(Note that a pipeline like this needs `shell=True`; splitting it with `shlex.split()` and passing the list to `Popen` would treat `|tee` as an ordinary argument rather than a pipe.)

Then call `communicate()` on the `Popen` object to get its stdout. Keep in mind that `communicate()` blocks until the process exits; to stream output as it arrives, read from `p.stdout` line by line instead.
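A minimal sketch of the streaming variant. The `send_to_client` function is a hypothetical placeholder for whatever actually delivers a line to the client, and a small `printf` pipeline stands in for the `find / | tee tmp.txt` command from the question so the demo is deterministic:

```python
import subprocess

lines_sent = []

def send_to_client(line):
    # Hypothetical placeholder: in the real setup this would push
    # the line over the socket/connection to the client.
    lines_sent.append(line)

# tee writes the command's output to tmp.txt while also passing it
# through, so we can read it from the pipe as it is produced.
# In the question the command would be 'find / | tee tmp.txt'.
p = subprocess.Popen(
    "printf 'alpha\\nbeta\\n' | tee tmp.txt",
    shell=True,
    stdout=subprocess.PIPE,
    text=True,
)

# Read line by line as output arrives, instead of blocking on
# communicate() until the whole command has finished.
for line in p.stdout:
    send_to_client(line.rstrip('\n'))

p.wait()
print(lines_sent)
```

This way the file `tmp.txt` ends up with the full output on disk, while each line is forwarded to the client as soon as the command emits it.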


Source: https://habr.com/ru/post/1750426/
