Killing long-running FastCGI processes

I have a Perl Dancer web application served via Apache2's mod_fastcgi. The application must accept file uploads. When a user uploads a file and then clicks the Stop button, the FastCGI process hangs, spinning at 100% CPU, until I kill it manually.

Is there a parameter that will automatically kill a process that hangs like that? Failing that, is there a way to automatically kill a FastCGI process that has been running longer than a given time?

+6

2 answers

Since the feature I'm after is not an option with mod_fastcgi, and I couldn't find a single piece of code to wrap in Time::Out to kill the process, I thought I would share my hacked-together solution.

I was hoping for a single Linux command to do this, but killall didn't work (it couldn't specifically match the perl command that launched this server instance), and neither did pkill (it has no way to select processes by age).

So I wrote a short Perl script, run as root, that kills the Dancer mod_fastcgi server instances matching the right command name and age:

#!/usr/bin/perl -w
use strict;
use Proc::ProcessTable;

my $t = Proc::ProcessTable->new( 'cache_ttys' => 1 );

foreach my $p ( @{ $t->table } ) {
    # Match only the perl processes running the Dancer dispatch.fcgi
    if ( $p->cmndline =~ /perl.*dispatch\.fcgi/ ) {
        # $p->time is the accumulated CPU time in microseconds
        my $run_time_min = $p->time / ( 1000000 * 60 );
        if ( $run_time_min >= 15 ) {
            # print "Found this job to kill: " . $p->pid . $p->cmndline . "\n" . $run_time_min . "\n";
            kill 'KILL', $p->pid;
        }
    }
}
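One caveat: SIGKILL gives the process no chance to clean up. If you would rather ask politely first, here is a sketch of a gentler variant (the kill_gently name and the 5-second grace period are placeholders, not part of the script above):

#!/usr/bin/perl
use strict;
use warnings;

# Send TERM first, then escalate to KILL only if the process
# survives the grace period.
sub kill_gently {
    my ( $pid, $grace ) = @_;
    $grace ||= 5;
    kill 'TERM', $pid or return;           # already gone
    sleep $grace;
    kill 'KILL', $pid if kill 0, $pid;     # kill 0 only probes for existence
}

In the loop above you would then call kill_gently($p->pid) in place of the unconditional kill 'KILL'.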
0

No, mod_fastcgi does not support that.

However, you have a couple of alternatives:

  • Wrap your Perl code in a timeout module such as Time::Out (see the sketch after this list).
  • Use ulimit -t so that the kernel kills the runaway process once it exhausts its CPU quota.
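For the first option, a minimal sketch of what that wrapping could look like inside a Dancer route handler; the /upload route, the file field name, and the 60-second limit are all assumptions for illustration:

#!/usr/bin/perl
use strict;
use warnings;
use Dancer;
use Time::Out qw(timeout);

post '/upload' => sub {
    # Time::Out uses alarm() internally: if the block runs past
    # the limit it is aborted and $@ is set.
    my $result = timeout 60 => sub {
        my $upload = request->upload('file');    # assumed field name
        $upload->copy_to( '/tmp/' . $upload->basename );
        return "upload ok\n";
    };
    if ($@) {
        status 408;    # request timeout
        return "upload timed out\n";
    }
    return $result;
};

dance;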

The second solution is somewhat tricky to implement, because you do not want the entire Apache process to be killed. This is explained in more detail in this Unix StackExchange question.
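If you do go that route, one way to avoid capping Apache itself is to set the limit from inside the FastCGI child process after it starts, rather than around the whole server. A minimal sketch using the BSD::Resource CPAN module (the 600/660-second limits are arbitrary placeholders):

use strict;
use warnings;
use BSD::Resource qw(setrlimit RLIMIT_CPU);

# Per-process equivalent of `ulimit -t`: only this process is
# limited, not the Apache parent. Past the soft limit the kernel
# sends SIGXCPU; at the hard limit it sends SIGKILL.
setrlimit( RLIMIT_CPU, 600, 660 )
    or die "setrlimit failed: $!";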

+2

Source: https://habr.com/ru/post/958586/

