Strange execution patterns with a subprocess.

I have a Python script that calls a JAR. After the JAR call, two shell scripts are called. I originally did this:

    proc = subprocess.Popen(jar_command, stdout=subprocess.PIPE, stderr=subprocess.PIPE)
    proc.wait()
    output, errors = proc.communicate()
    proc = subprocess.Popen(prune_command, shell=True)
    proc.wait()
    proc = subprocess.call(push_command, shell=True)

I need to wait for the first two processes to complete, hence Popen() with wait(); the last one can run in the background, so I use call() . I pass shell=True because I want the shell scripts to have access to environment variables.

The above works; however, I do not get any log output from the JAR process. So I tried calling it this way:

 proc = subprocess.call(jar_command) 

The logs now appear as expected, but the subsequent shell scripts are never executed. Initially I thought the scripts were running and their output simply wasn't being collected on stdout , but it turns out they don't run at all, i.e. no extra files are pruned and nothing is pushed to the database.

Why are subsequent shell scripts ignored?

2 answers

If you are certain your shell scripts are not running at all, while everything works with the first snippet, then the java command must be blocking, or it is not completing correctly when run via the call() function.

You can verify this by adding a dummy-file creation to your bash scripts. Put it on the first line of each script, so if the script runs at all, you will get the dummy file. If it is never created, the scripts were not executed, most likely because of something in the java invocation.

I would try a couple of things:

First, I would go back to Popen instead of call . And instead of wait() , use communicate() :

Interact with process: send data to stdin. Read data from stdout and stderr, until end-of-file is reached. Wait for the process to terminate. communicate() returns a tuple (stdoutdata, stderrdata) .

    proc = subprocess.Popen(jar_command, stdout=subprocess.PIPE, stderr=subprocess.PIPE)
    proc.communicate()

Be sure to check both streams (stdout and stderr). You may be missing an error raised by the java process.
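As a sketch of that check (the child process below is a stand-in I made up for the real jar_command , so the example is runnable):

```python
import subprocess
import sys

# Stand-in for the real jar_command (an assumption for illustration):
# a child that writes to both stdout and stderr.
jar_command = [sys.executable, "-c",
               "import sys; print('log line'); print('oops', file=sys.stderr)"]

proc = subprocess.Popen(jar_command, stdout=subprocess.PIPE, stderr=subprocess.PIPE)
output, errors = proc.communicate()  # drains both pipes, then waits

print("returncode:", proc.returncode)
print("stdout:", output.decode())
print("stderr:", errors.decode())
```

Inspecting errors here is exactly where a silent java failure would show up.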

Next, I would try disabling buffering by passing bufsize=0 to Popen . That rules out the possibility of buffering on the Python side.
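For instance (the placeholder child below stands in for jar_command ; bufsize itself is a real Popen parameter):

```python
import subprocess
import sys

proc = subprocess.Popen(
    [sys.executable, "-c", "print('hello')"],  # placeholder for jar_command
    stdout=subprocess.PIPE,
    stderr=subprocess.PIPE,
    bufsize=0,  # unbuffered pipes on the Python side
)
out, err = proc.communicate()
print(out.decode())
```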

If both of those still don't work, check whether an exception is being raised by using check_call() :

 proc = subprocess.check_call(jar_command) 

Run the command with arguments. Wait for the command to complete. If the return code was zero, then return, otherwise raise a CalledProcessError.
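A minimal sketch of catching that exception (the failing child here is a stand-in for a jar_command that exits non-zero):

```python
import subprocess
import sys

failed_code = None
try:
    # Stand-in for jar_command: a child that exits with a non-zero code.
    subprocess.check_call([sys.executable, "-c", "raise SystemExit(3)"])
except subprocess.CalledProcessError as exc:
    failed_code = exc.returncode
    print("command failed with return code", failed_code)
```

If the JAR exits abnormally, this makes the failure loud instead of silent.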

One of these options may turn out to be the answer; if not, they will at least help narrow the problem down. Feel free to comment on how it goes.


Most likely, you are forgetting that process pipes are actually OS-level buffers with some finite capacity.

For example, if you start a process that produces a lot of output in PIPE mode, and you wait for it to finish before you try to consume everything the process wrote, you get a deadlock:

  • The process has filled the output buffer and is now blocked trying to write more data to its output. Until someone drains the buffer by reading from the pipe, the process cannot continue.
  • Your program is waiting for the subprocess to finish before it reads the data from that buffer.

The right way is to start a thread in your program that continuously "drains" the pipe while the process runs and your main thread waits. You must first start the process, then start the drain threads, and only then wait for the process to finish.
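A minimal sketch of that drain-thread pattern (the child here just prints well past a typical 64 KiB pipe buffer; substitute your real command):

```python
import subprocess
import sys
import threading

def drain(pipe, sink):
    # Read the pipe until EOF so the child never blocks on a full buffer.
    for line in iter(pipe.readline, b""):
        sink.append(line)
    pipe.close()

# Stand-in child that writes far more than the pipe buffer holds.
child = [sys.executable, "-c", "print('x' * 200000)"]
proc = subprocess.Popen(child, stdout=subprocess.PIPE, stderr=subprocess.PIPE)

out_lines, err_lines = [], []
t_out = threading.Thread(target=drain, args=(proc.stdout, out_lines))
t_err = threading.Thread(target=drain, args=(proc.stderr, err_lines))
t_out.start()
t_err.start()

proc.wait()  # safe now: the drain threads keep the buffers empty
t_out.join()
t_err.join()

print("collected", sum(len(l) for l in out_lines), "bytes of stdout")
```

For the simple one-shot case, communicate() does this draining for you internally; explicit threads are mainly useful when you need to process output while the child is still running.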

As a differential diagnosis, check whether the subprocess behaves normally with a small amount of output (i.e. one that never fills the buffer, say a line or two).

The subprocess documentation contains a note about this.

