How to start a process and put it in the background in Python?

I am currently writing my first Python program (in Python 2.6.6). The program makes it easy to start and stop various applications running on a server through common user commands (for example, starting and stopping system services on a Linux server).

I run the application startup scripts like this:

    p = subprocess.Popen(startCommand, shell=True,
                         stdout=subprocess.PIPE, stderr=subprocess.PIPE)
    output, err = p.communicate()
    print(output)

The problem is that the startup script of one of the applications stays in the foreground, so p.communicate() waits forever. I already tried prefixing the command with "nohup startCommand &", but that did not work as expected.
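To illustrate where the blocking comes from, here is a minimal sketch (the sleep-and-echo command is a stand-in for a real startup script, not from the question): Popen itself returns immediately; it is the communicate() call that waits for the child to exit.

```python
import subprocess

# Popen returns immediately; the child runs concurrently.
p = subprocess.Popen("sleep 1 && echo done", shell=True,
                     stdout=subprocess.PIPE, universal_newlines=True)

print(p.poll())        # None: the child is still running, we were not blocked

# communicate() is the call that blocks until the child exits.
output, _ = p.communicate()
print(output.strip())  # done
print(p.returncode)    # 0
```

So a foreground startup script keeps the pipe open, and communicate() never returns until it ends.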

As a workaround, I currently use the following bash script to start the application:

    #!/bin/bash
    LOGFILE="/opt/scripts/bin/logs/SomeServerApplicationStart.log"
    nohup /opt/someDir/startSomeServerApplication.sh >${LOGFILE} 2>&1 &
    STARTUPOK=$(tail -1 ${LOGFILE} | grep "Server started in RUNNING mode" | wc -l)
    COUNTER=0
    while [ $STARTUPOK -ne 1 ] && [ $COUNTER -lt 100 ]; do
        STARTUPOK=$(tail -1 ${LOGFILE} | grep "Server started in RUNNING mode" | wc -l)
        if (( STARTUPOK )); then
            echo "STARTUP OK"
            exit 0
        fi
        sleep 1
        COUNTER=$(( $COUNTER + 1 ))
    done
    echo "STARTUP FAILED"

The bash script is called from my Python code. This workaround works perfectly, but I would prefer to do everything in Python...
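For completeness, the wrapper can be launched from Python roughly like this (a sketch: the helper name is my own, and check_output needs Python 2.7+, so on 2.6 one would use Popen plus communicate instead):

```python
import subprocess

def run_wrapper(script_path):
    """Run the bash wrapper and report whether it printed STARTUP OK.

    check_output blocks only until the wrapper itself exits, which is
    quick because the wrapper backgrounds the real server with nohup/&.
    """
    out = subprocess.check_output(["/bin/bash", script_path],
                                  universal_newlines=True)
    return "STARTUP OK" in out

# Usage with the wrapper from the question (path is whatever you saved it as):
# ok = run_wrapper("/opt/scripts/bin/your_wrapper.sh")
```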

Am I using subprocess wrong? How could I accomplish my task in Python only?

1 answer

First, it is easy not to block the Python script in communicate()... by not calling communicate()! Just read from the command's standard output or standard error until you find the right message, and then simply forget about the command.

    # to avoid waiting for an EOF on a pipe ...
    def getlines(fd):
        line = bytearray()
        c = None
        while True:
            c = fd.read(1)
            if not c:            # EOF reached (read returns '' at end of pipe)
                return
            line += c
            if c == '\n':
                yield str(line)
                del line[:]

    p = subprocess.Popen(startCommand, shell=True, stdout=subprocess.PIPE,
                         stderr=subprocess.STDOUT)  # send stderr to stdout, same as 2>&1 in bash

    for line in getlines(p.stdout):
        if "Server started in RUNNING mode" in line:
            print("STARTUP OK")
            break
    else:
        # end of input without getting the startup message
        print("STARTUP FAILED")
    p.poll()    # get the status from the child to avoid a zombie
    # other error processing

The problem with the above is that the server is still a child of the Python process and may receive unwanted signals such as SIGHUP. If you want to make it a daemon, you must first start an intermediate child process, which in turn starts your server. That way, when the intermediate child exits, the main script can wait for it, and the server gets PPID 1 (it is adopted by the init process). You can use the multiprocessing module to make this part easy.

The code may look like this:

    import multiprocessing
    import subprocess

    # to avoid waiting for an EOF on a pipe ...
    def getlines(fd):
        line = bytearray()
        c = None
        while True:
            c = fd.read(1)
            if not c:            # EOF reached (read returns '' at end of pipe)
                return
            line += c
            if c == '\n':
                yield str(line)
                del line[:]

    def start_child(cmd):
        p = subprocess.Popen(cmd, stdout=subprocess.PIPE,
                             stderr=subprocess.STDOUT, shell=True)
        for line in getlines(p.stdout):
            print line
            if "Server started in RUNNING mode" in line:
                print "STARTUP OK"
                break
        else:
            print "STARTUP FAILED"

    def main():
        # other stuff in program
        p = multiprocessing.Process(target=start_child, args=(server_program,))
        p.start()
        p.join()
        print "DONE"
        # other stuff in program

    # protect program startup for the multiprocessing module
    if __name__ == '__main__':
        main()

One may wonder why the getlines generator is needed at all when a file object is itself an iterator that returns one line at a time. The problem is that file iteration internally uses read-ahead buffering: it keeps reading until EOF when the file is not connected to a terminal. Since stdout is connected to a PIPE here, you would get nothing until the server ends... which is not what is expected.


Source: https://habr.com/ru/post/1236310/

