Thanks to some other great posts here (like this and that), I have several reliable methods to background a process in a Perl script.
However, in my environment (IIS6 on Windows Server 2003), it doesn't seem to completely background. Take this script:
use CGI::Pretty qw(:standard);
print header;
print start_html;
print "hi<br>";
system(1, qw(sleep 10));   # first arg of 1 is the Windows-specific "spawn and don't wait" form
print "bye";
print end_html;
The browser will stay "connected" for 10 seconds while the sleep process runs. "hi" and "bye" will both be printed on the screen, but the browser seems to be waiting for something more from the server (Firefox 4's and IE6's progress bars keep moving slowly for those 10 seconds).
I'd like the user (in my real app) to believe the process is fully backgrounded, and to feel comfortable closing the page. With the progress bar still moving, I'm sure they'll sit and wait for it to stop before they close the page or navigate away.
Just to be clear - the process really is backgrounded. I can close the web browser, and the real background job continues to run (sometimes for up to an hour) with no issues. I'd just like to clean this up from the UI perspective.
Thanks!
Your problem is that the CGI program has finished, but the webserver hasn't been told that it's done, so it keeps the connection to the browser open. You'll need to figure out how to communicate completion, or else come up with a hack to work around the bug. Here are a few possible strategies.
In Unix the right way to tell the webserver that it is time to go away is to daemonize the background process. The details of how to do this are pretty Unix-specific. However, Googling for Win32 perl daemon led me to Win32::Daemon and then Win32::Service, which may help you do something similar on Windows. I've seen comments (without details) suggesting that this will work. If you need help figuring this out, I would suggest asking on PerlMonks. There is more Perl-specific expertise there than here.
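For reference, the Unix-side daemonization those modules try to approximate is a short, well-known sequence. Here is a sketch (not applicable as-is on Windows; it just illustrates what "daemonize" means):

```perl
# Sketch of classic Unix daemonization: fork, let the parent return to the
# webserver, start a new session, and detach the standard handles.
use strict;
use warnings;
use POSIX qw(setsid);

sub daemonize {
    defined(my $pid = fork) or die "fork: $!";
    exit 0 if $pid;                    # parent exits; webserver sees the CGI finish
    setsid() or die "setsid: $!";      # new session, no controlling terminal
    chdir '/' or die "chdir: $!";
    open STDIN,  '<', '/dev/null' or die "stdin: $!";
    open STDOUT, '>', '/dev/null' or die "stdout: $!";
    open STDERR, '>', '/dev/null' or die "stderr: $!";
}
```

The key point for your symptom is the handle detaching at the end: once no process holds the CGI's standard output, the webserver knows the response is complete.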
Suppose that you have some sort of batch job scheduler available to you. Then instead of launching a job, you can just schedule a task to be done. The task gets picked up and run. This strategy is particularly useful if the job that you need to run is intensive enough that you'd like it to run somewhere other than your webserver.
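If you don't have a real scheduler, a drop-box directory can serve as a poor man's job queue: the CGI script writes a small job file and returns immediately, and a separately scheduled task (e.g. via the Windows Task Scheduler) polls the directory and runs whatever it finds. A sketch of the CGI side, with a made-up spool path and job format:

```perl
# CGI side of a drop-box queue: record the job and return at once.
# The spool path and "action=" format are hypothetical; use whatever
# your worker task expects.
use strict;
use warnings;
use File::Temp qw(tempfile);

my $spool = $ENV{JOB_SPOOL_DIR} || '/tmp/job-spool';
mkdir $spool unless -d $spool;

my ($fh, $jobfile) = tempfile('job-XXXX', DIR => $spool, SUFFIX => '.todo');
print $fh "action=rebuild-report\n";   # whatever the worker needs to know
close $fh or die "close: $!";
# A scheduled worker picks up *.todo files later; the browser connection
# is long gone by then.
```

Because the CGI process exits as soon as the file is written, the browser's progress bar stops immediately, regardless of how long the job itself runs.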
Your CGI script can turn around and issue an HTTP request to the webserver to start the job, without waiting for that request to finish. Here is a brief example showing you how to do that:
#! /usr/bin/perl -w
use strict;
use Carp qw(croak);
use LWP::UserAgent;

my $ua = LWP::UserAgent->new();

# Abort the transfer as soon as a 200 status and its headers arrive.
# LWP catches the die, keeps the status and headers it has already read,
# and stops waiting for the body - so is_success below still sees the 200.
# Non-200 responses are read normally and fall through to the else branch.
$ua->add_handler(response_header => sub { croak "complete" }, m_code => 200);

my $response = $ua->get("http://www.perl.com/");
if ($response->is_success) {
    print "Request backgrounded\n";
}
else {
    print "Uh, oh.\n";
}
CGI scripts are run with standard output redirected, either directly to the TCP socket or to a pipe. Typically, the connection won't close until the handle, and all copies of it, are closed. By default, the subprocess will inherit a copy of the handle.
There are two ways to prevent the connection from waiting on the subprocess. One is to prevent the subprocess from inheriting the handle, the other is for the subprocess to close its copy of the handle when it starts.
If the subprocess is in Perl, I think you could close the handle very simply:
close(STDOUT);
If you want to prevent the subprocess from inheriting the handle, you could use the SetHandleInformation function (if you have access to the Win32 API) or set bInheritHandles to FALSE in the call to CreateProcess. Alternatively, close the handle before launching the subprocess.
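In Perl, that last option can be wrapped around the question's own system(1, ...) call: dup STDOUT somewhere safe, close the real handle so the child can't inherit it, then restore it afterwards. A sketch (the Windows-only spawn is guarded so the snippet runs elsewhere too):

```perl
use strict;
use warnings;

# Keep a duplicate of STDOUT so we can restore it after spawning.
open(my $saved, '>&', \*STDOUT) or die "dup STDOUT: $!";
close(STDOUT) or die "close STDOUT: $!";

# Spawn without waiting. The child no longer inherits our STDOUT handle,
# so once the CGI exits, nothing holds the connection open.
system(1, qw(sleep 10)) if $^O eq 'MSWin32';

open(STDOUT, '>&', $saved) or die "restore STDOUT: $!";
print "bye\n";   # output to the browser continues as before
```

I haven't verified this against IIS6 specifically; the underlying assumption is that system(1, ...) spawns via CreateProcess with handle inheritance enabled, so closing our copy first is what keeps the handle out of the child.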