Is it possible to run several commands in the background and check whether any of them failed?

Question:

I have a CI script that I want to speed up by running several things in the background. I want the script to wait for all of those processes and check each one to see whether it failed.

Here is a simplification:

#!/bin/bash

set -e

bg()
{
    sleep .$[ ( $RANDOM % 10 ) + 1 ]s
}

bg2()
{
    sleep .$[ ( $RANDOM % 10 ) + 1 ]s
    exit 1
}

bg &   # will pass after a random delay 
bg2 &  # will fail after a random delay


# I want the script to exit with a failure status, since bg2 fails

Answer 1:

Yes.

You can use the wait command in bash to wait for one or more sub-processes to terminate, in which case you pass it the PID(s) to wait on. wait can also be called with no arguments, in which case it waits for all background processes to terminate.

Example:

#!/bin/bash

sleep 3 &

wait "$!"     # Feeding the non-zero process-id as argument to wait command.
              # Can also be stored in a variable as pid=$(echo $!)

# Waits until the process 'sleep 3' is completed. Here the wait 
# on a single process is done by capturing its process id

echo "I am waking up"

sleep 4 &
sleep 5 &

wait          # Without any PID argument, 'wait' waits until all jobs
              # started in the background have completed.

# (or) simply
# wait $(jobs -p)       # To wait on all background jobs started with 'job &'

echo "I woke up again"

Update:

To identify which jobs failed, it is best to loop over the list of background jobs and check the exit code of each one for visibility. Thanks to a helpful suggestion by chepner. It goes like this:

#!/bin/bash
for p in $(jobs -p)
do
     wait "$p" || { echo "job $p failed" >&2; exit; }
done
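
Applied to the script from the question, a minimal sketch (reusing bg and bg2, written with the modern $(( )) arithmetic syntax, and with a status flag of my own for illustration) could look like this:

#!/bin/bash

bg()  { sleep .$(( (RANDOM % 10) + 1 ))s; }           # succeeds after a random delay
bg2() { sleep .$(( (RANDOM % 10) + 1 ))s; exit 1; }   # fails after a random delay

bg &
bg2 &

status=0
for p in $(jobs -p)
do
    # wait "$p" returns the exit status of the job with PID $p
    wait "$p" || { echo "job $p failed" >&2; status=1; }
done

exit "$status"   # non-zero overall, because bg2 failed

This variant keeps waiting on the remaining jobs instead of exiting at the first failure, so every failed job gets logged before the script returns a non-zero status.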


Answer 2:

#!/bin/bash

set -e

bg()
{
    sleep .$[ ( $RANDOM % 10 ) + 1 ]s
}

bg2()
{
    sleep .$[ ( $RANDOM % 10 ) + 1 ]s
    exit 1
}

export -f bg
export -f bg2

parallel ::: bg bg2 || echo "$? of the jobs failed"
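
One further note, as an assumption on my part rather than part of the answer above: GNU parallel exits with the number of jobs that failed (so 1 here), and as I understand its --halt option, a reasonably recent version can also abort the whole run as soon as the first job fails. A minimal sketch:

#!/bin/bash

bg()  { sleep .$(( (RANDOM % 10) + 1 ))s; }
bg2() { sleep .$(( (RANDOM % 10) + 1 ))s; exit 1; }

export -f bg
export -f bg2

# --halt now,fail=1: kill the jobs still running as soon as one job fails,
# and exit with the failing job's exit status.
parallel --halt now,fail=1 ::: bg bg2
echo "parallel exited with status $?"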