Run multiple exec commands at once (But wait for them all to complete)

Posted 2019-02-03 10:38

I've looked around for this and I can't seem to find anyone who is trying to do exactly what I am.

I have information that is passed in to my function via a _POST request. Based on that data, I run an exec command to run a TCL script a certain number of times (with different parameters, based on the post variable). Right now, I have the exec in a foreach so this takes forever to run (the TCL script takes 15 or so seconds to come back, so if I need to run it 100 times, I have a bit of an issue). Here is my code:

    public function executeAction(){
        // code to parse the _POST variable into an array called $devices

        $execout = array();
        foreach ($devices as $devID => $device) {
            exec("../path/to/script.tcl -parameter1 ".$device['param1']." -parameter2 ".$device['param2'], $execout[$devID]);
        }
        print_r($execout);
    }

Obviously this code is just an excerpt with big chunks removed, but hopefully it's enough to demonstrate what I'm trying to do.

I need to run all of the execs at once and I need to wait for them all to complete before returning. I also need the output of all of the scripts stored in the array called $execout.

Any ideas?

Thanks!!!

5 Answers
仙女界的扛把子 · answered 2019-02-03 11:05

Quoting the PHP documentation:

Note:

If a program is started with this function, in order for it to continue running in the background, the output of the program must be redirected to a file or another output stream. Failing to do so will cause PHP to hang until the execution of the program ends.

So you can run the exec in the background if you redirect its output to a file:

exec("../path/to/script.tcl -parameter1 ".$device['param1']." -parameter2 ".$device['param2']." > outputfile.txt", $execout[$devID]);

But if you want to wait until ALL of the execs have finished before continuing, you have to have the external script call back when it is done. Maybe like this:

exec("../path/to/script.tcl -parameter1 ".$device['param1']." -parameter2 ".$device['param2']." > ../path/to/outputfile.txt; ".PHP_BINDIR.DIRECTORY_SEPARATOR."php ../path/to/callback.php", $execout[$devID]);

This way, your callback.php script will be called after each run of script.tcl.

Maybe you can do something with these tricks.
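
To make that concrete, here is a rough sketch of how those two tricks could fit the question's loop. The scratch directory, the flag files, and the way callback.php receives its arguments are my own assumptions, not something this answer prescribes. First, a tiny callback script that just records that one run has finished:

<?php
// callback.php (hypothetical): record that one script.tcl run has finished.
// Invoked as: php callback.php <flagDir> <devID>
list(, $flagDir, $devID) = $argv;
touch($flagDir . '/done_' . $devID);

And the launching side, which starts every run in the background, waits for all the flag files to appear, and then rebuilds $execout from the per-device output files:

<?php
// $devices: parsed from $_POST exactly as in the question.
$flagDir = sys_get_temp_dir() . '/tcl_run_' . uniqid();   // assumed scratch directory
mkdir($flagDir);

foreach ($devices as $devID => $device) {
    $cmd = '../path/to/script.tcl'
         . ' -parameter1 ' . escapeshellarg($device['param1'])
         . ' -parameter2 ' . escapeshellarg($device['param2']);

    // Subshell: run the script, capture its output in a per-device file, then call back.
    // Redirecting everything and backgrounding with & lets exec() return immediately.
    exec('(' . $cmd . ' > ' . escapeshellarg("$flagDir/out_$devID.txt") . ' 2>&1; '
        . PHP_BINDIR . DIRECTORY_SEPARATOR . 'php ../path/to/callback.php '
        . escapeshellarg($flagDir) . ' ' . escapeshellarg($devID)
        . ') > /dev/null 2>&1 &');
}

// Wait until every callback has dropped its flag file.
while (count(glob($flagDir . '/done_*')) < count($devices)) {
    usleep(250000);
}

// Rebuild $execout from the per-device output files.
$execout = array();
foreach ($devices as $devID => $device) {
    $execout[$devID] = file("$flagDir/out_$devID.txt", FILE_IGNORE_NEW_LINES);
}
print_r($execout);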

Lonely孤独者° · answered 2019-02-03 11:09

PHP's exec function will always wait for the command it runs to finish. However, you can send the process's stdout and stderr to /dev/null (on Unix) and have all of the scripts launched almost instantly. This can be done by adding

    > /dev/null 2>&1 &

to the end of your execution string.

But! That means they'll fork off and finish processing independently, so it may be worth having each one write its result back somewhere, and you could create a listener to pick those results up and process them.
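
For example, a minimal sketch along those lines: each detached run writes to its own result file under a temporary name and renames it when finished, so the waiting request can simply poll for the final file names. The file locations and the polling loop are assumptions on my part:

<?php
// $devices: parsed from $_POST exactly as in the question.
$outFiles = array();
foreach ($devices as $devID => $device) {
    $final = "/tmp/script_out_{$devID}.txt";              // assumed location
    $outFiles[$devID] = $final;
    @unlink($final);                                      // clear any stale result

    $cmd = '../path/to/script.tcl'
         . ' -parameter1 ' . escapeshellarg($device['param1'])
         . ' -parameter2 ' . escapeshellarg($device['param2']);

    // Write to a ".part" file first and rename on completion, so the final name
    // only exists once the script is done. The subshell is fully redirected and
    // backgrounded, so this exec() returns immediately.
    exec('(' . $cmd . ' > ' . escapeshellarg("$final.part") . ' 2>&1; '
        . 'mv ' . escapeshellarg("$final.part") . ' ' . escapeshellarg($final)
        . ') > /dev/null 2>&1 &');
}

// "Listener": poll until every final file exists, then collect the results.
do {
    usleep(250000);
    $done = count(array_filter(array_map('file_exists', $outFiles)));
} while ($done < count($outFiles));

$execout = array();
foreach ($outFiles as $devID => $file) {
    $execout[$devID] = file($file, FILE_IGNORE_NEW_LINES);
}
print_r($execout);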

乱世女痞 · answered 2019-02-03 11:14

If you put your exec() call in a separate script, you can call that external script multiple times in parallel using curl_multi_exec(). That way, you'd make all the calls in separate requests, so they could execute simultaneously. Poll &$still_running to see when all requests have finished, after which you can collect the results from each.

Update: Here's a blog post detailing exactly what I'm describing.


Example

Based on the blog post linked above, I put together the following example.

Script being run in parallel:

<?php
// waitAndDate.php -- sleep for the requested number of seconds, then report when we finished
sleep((int)$_GET['time']);
printf('%d secs; %s', $_GET['time'], shell_exec('date'));

Script making calls in parallel:

<?php
// multiExec.php -- fire several requests at waitAndDate.php in parallel via curl_multi
$start = microtime(true);

$mh = curl_multi_init();
$handles = array();

// create several requests
for ($i = 0; $i < 5; $i++) {
    $ch = curl_init();

    $rand = rand(5,25); // just making up data to pass to script
    curl_setopt($ch, CURLOPT_URL, "http://domain/waitAndDate.php?time=$rand");
    curl_setopt($ch, CURLOPT_HEADER, 0);
    curl_setopt($ch, CURLOPT_RETURNTRANSFER, true);
    curl_setopt($ch, CURLOPT_TIMEOUT, 30);

    curl_multi_add_handle($mh, $ch);
    $handles[] = $ch;
}

// execute requests and poll periodically until all have completed
$isRunning = null;
do {
    curl_multi_exec($mh, $isRunning);
    usleep(250000);
} while ($isRunning > 0);

// fetch output of each request
$outputs = array();
for ($i = 0; $i < count($handles); $i++) {
    $outputs[$i] = trim(curl_multi_getcontent($handles[$i]));
    curl_multi_remove_handle($mh, $handles[$i]);
}

curl_multi_close($mh);

print_r($outputs);
printf("Elapsed time: %.2f seconds\n", microtime(true) - $start);

Here is some output I received when running it a few times:

Array
(
    [0] => 8 secs; Mon Apr  2 19:01:33 UTC 2012
    [1] => 8 secs; Mon Apr  2 19:01:33 UTC 2012
    [2] => 18 secs; Mon Apr  2 19:01:43 UTC 2012
    [3] => 11 secs; Mon Apr  2 19:01:36 UTC 2012
    [4] => 8 secs; Mon Apr  2 19:01:33 UTC 2012
)
Elapsed time: 18.36 seconds

Array
(
    [0] => 22 secs; Mon Apr  2 19:02:33 UTC 2012
    [1] => 9 secs; Mon Apr  2 19:02:20 UTC 2012
    [2] => 8 secs; Mon Apr  2 19:02:19 UTC 2012
    [3] => 11 secs; Mon Apr  2 19:02:22 UTC 2012
    [4] => 7 secs; Mon Apr  2 19:02:18 UTC 2012
)
Elapsed time: 22.37 seconds

Array
(
    [0] => 5 secs; Mon Apr  2 19:02:40 UTC 2012
    [1] => 18 secs; Mon Apr  2 19:02:53 UTC 2012
    [2] => 7 secs; Mon Apr  2 19:02:42 UTC 2012
    [3] => 9 secs; Mon Apr  2 19:02:44 UTC 2012
    [4] => 9 secs; Mon Apr  2 19:02:44 UTC 2012
)
Elapsed time: 18.35 seconds

Hope that helps!

One side note: make sure your web server can process this many parallel requests. If it serves them sequentially or can only serve very few simultaneously, this approach gains you little or nothing. :-)

Summer. ? 凉城 · answered 2019-02-03 11:30

You need to modify your script a bit:

  1. Save the POST data to the session.
  2. Run the exec and save its result to the session.
  3. Redirect using JavaScript.
  4. Echo the redirect command after the exec returns, using the same URL but with an incremental index such as ?index=99.
  5. When the index reaches the end, show the whole result (a rough sketch follows below).
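
A very rough sketch of that redirect loop, with made-up parameter names and session keys:

<?php
// Hypothetical sketch of the redirect-driven loop described above.
session_start();

$index = isset($_GET['index']) ? (int)$_GET['index'] : 0;

if ($index === 0) {
    // Step 1: first hit -- parse the POST data (as in the question) and stash it.
    $_SESSION['devices'] = array_values($devices);   // $devices built from $_POST as before
    $_SESSION['execout'] = array();
}

$devices = $_SESSION['devices'];

if ($index < count($devices)) {
    // Step 2: run one script per request and save its output in the session.
    $device = $devices[$index];
    exec('../path/to/script.tcl -parameter1 ' . escapeshellarg($device['param1'])
        . ' -parameter2 ' . escapeshellarg($device['param2']), $out);
    $_SESSION['execout'][$index] = $out;

    // Steps 3-4: echo a JavaScript redirect back to the same URL with the next index.
    $next = $index + 1;
    echo "<script>window.location = '?index={$next}';</script>";
} else {
    // Step 5: every index has been processed -- show the whole result.
    print_r($_SESSION['execout']);
}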
祖国的老花朵 · answered 2019-02-03 11:30

Look at ExecFuture and FutureIterator in the libphutil library:

https://secure.phabricator.com/book/libphutil/class/ExecFuture/

It does exactly what you need with a pretty nice syntax:

$futures = array();
foreach ($files as $file) {
  $futures[$file] = new ExecFuture("gzip %s", $file);
}
foreach (new FutureIterator($futures) as $file => $future) {
  list($err, $stdout, $stderr) = $future->resolve();
  if (!$err) {
    echo "Compressed {$file}...\n";
  } else {
    echo "Failed to compress {$file}!\n";
  }
}
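
Adapted to the question's loop (assuming libphutil is loaded and the same script path and parameters as above), it might look roughly like this:

<?php
// Hypothetical adaptation of the example above to the question's TCL script.
$futures = array();
foreach ($devices as $devID => $device) {
    $futures[$devID] = new ExecFuture(
        '../path/to/script.tcl -parameter1 %s -parameter2 %s',
        $device['param1'],
        $device['param2']
    );
}

$execout = array();
foreach (new FutureIterator($futures) as $devID => $future) {
    // resolve() blocks until this particular run has finished.
    list($err, $stdout, $stderr) = $future->resolve();
    $execout[$devID] = $err ? $stderr : $stdout;
}
print_r($execout);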