Passing multiple arguments to a UNIX shell script

Published 2019-03-31 12:15

I have the following (bash) shell script, that I would ideally use to kill multiple processes by name.

#!/bin/bash
kill `ps -A | grep $* | awk '{ print $1 }'`

However, while this script works if one argument is passed:

end chrome

(the name of the script is end)

it does not work if more than one argument is passed:

end chrome firefox

grep: firefox: No such file or directory

What is going on here?

I thought $* passed the multiple arguments to the shell script in sequence. I'm not mistyping anything in my input - and the programs I want to kill (chrome and firefox) are open.

Any help is appreciated.

4 Answers
再贱就再见
Answer 2 · 2019-03-31 12:52

$* should be rarely used. I would generally recommend "$@". Shell argument parsing is relatively complex and easy to get wrong. Usually the way you get it wrong is to end up having things evaluated that shouldn't be.

For example, if you typed this:

end '`rm foo`'

you would discover that if you had a file named 'foo' you don't anymore.
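A quick way to see the difference between the expansions (a hypothetical demo using a helper function to count words; none of these names come from the original script):

```shell
# Hypothetical demo: count how many arguments each expansion yields.
count() { echo $#; }

set -- 'two words' single   # simulate two script arguments
count $*      # unquoted: word-split into three pieces
count "$*"    # quoted: one joined string, "two words single"
count "$@"    # quoted: the original two arguments, intact
```

Only "$@" reproduces the arguments exactly as they were passed.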

Here is a script that will do what you are asking to have done. It fails if any of the arguments contain '\n' or '\0' characters:

#!/bin/sh

kill $(ps -A | fgrep -e "$(for arg in "$@"; do echo "$arg"; done)" | awk '{ print $1; }')

I vastly prefer $(...) syntax for doing what backtick does. It's much clearer, and it's also less ambiguous when you nest things.
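For instance, nesting is straightforward with $(...), whereas the backtick equivalent would need escaped inner backticks (a trivial sketch):

```shell
# The inner $(...) needs no escaping; with backticks you would have to
# write: `echo "pids: \`echo 1 2 3\`"`
outer=$(echo "pids: $(echo 1 2 3)")
echo "$outer"
```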

Animai°情兽
Answer 3 · 2019-03-31 12:58

Look into pkill(1) instead, or killall(1) as @khachik comments.
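pkill replaces the whole ps | grep | awk pipeline in one call. A sketch, using a throwaway background process so nothing real gets killed:

```shell
# Spawn a process we control, then kill it by matching its command line.
sleep 300 &
pkill -f 'sleep 300'    # -f matches against the full command line
wait 2>/dev/null        # reap the terminated job quietly
```

Plain `pkill chrome` matches on the process name instead of the full command line.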

看我几分像从前
Answer 4 · 2019-03-31 12:59

Remember what grep does with multiple arguments - the first is the word to search for, and the remainder are the files to scan.

Also remember that $*, "$*", and $@ all lose track of the whitespace boundaries between arguments (the unquoted forms word-split, and "$*" joins everything into one string), whereas the magical "$@" notation does not.

So, to deal with your case, you're going to need to modify the way you invoke grep. You either need to use grep -F (aka fgrep) with options for each argument, or you need to use grep -E (aka egrep) with alternation. In part, it depends on whether you might have to deal with arguments that themselves contain pipe symbols.

It is surprisingly tricky to do this reliably with a single invocation of grep; you might well be best off tolerating the overhead of running the pipeline multiple times:

for process in "$@"
do
    kill $(ps -A | grep -w "$process" | awk '{print $1}')
done

If the overhead of running ps multiple times like that is too painful (it hurts me to write it - but I've not measured the cost), then you'd probably do something like:

case $# in
(0) echo "Usage: $(basename $0 .sh) procname [...]" >&2; exit 1;;
(1) kill $(ps -A | grep -w "$1" | awk '{print $1}');;
(*) tmp=${TMPDIR:-/tmp}/end.$$
    trap "rm -f $tmp.?; exit 1" 0 1 2 3 13 15
    ps -A > $tmp.1
    for process in "$@"
    do
         grep "$process" $tmp.1
    done |
    awk '{print $1}' |
    sort -u |
    xargs kill
    rm -f $tmp.1
    trap 0
    ;;
esac

The use of plain xargs is OK because it is dealing with a list of process IDs, and process IDs do not contain spaces or newlines. This keeps the simple code for the simple case; the complex case uses a temporary file to hold the output of ps and then scans it once per process name in the command line. The sort -u ensures that if some process happens to match all your keywords (for example, grep -E '(firefox|chrome)' would match both), only one signal is sent.

The trap lines etc ensure that the temporary file is cleaned up unless someone is excessively brutal to the command (the signals caught are HUP, INT, QUIT, PIPE and TERM, aka 1, 2, 3, 13 and 15; the zero catches the shell exiting for any reason). Any time a script creates a temporary file, you should have similar trapping around the use of the file so that it will be cleaned up if the process is terminated.
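The same cleanup pattern in miniature (a sketch with a made-up file name, separate from the script above):

```shell
# Create a temp file; the traps guarantee it is removed whether the
# script exits normally or is killed by HUP/INT/QUIT/PIPE/TERM.
tmp=${TMPDIR:-/tmp}/demo.$$
trap 'rm -f "$tmp"; exit 1' 1 2 3 13 15
trap 'rm -f "$tmp"' 0
echo 'scratch data' > "$tmp"
cat "$tmp"
trap 0                      # work done: cancel the exit trap
rm -f "$tmp"
```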

If you're feeling cautious and you have GNU Grep, you might add the -w option so that the names provided on the command line only match whole words.


All the above will work with almost any shell in the Bourne/Korn/POSIX/Bash family (you'd need to use backticks in place of $(...) with a strict Bourne shell, and the leading parenthesis on the case conditions is also not allowed there). However, in bash or ksh you can use an array to get things handled right.

n=0
unset args  # Force args to be an empty array (it could be an env var on entry)
for i in "$@"
do
    args[$((n++))]="-e"
    args[$((n++))]="$i"
done
kill $(ps -A | fgrep "${args[@]}" | awk '{print $1}')

This carefully preserves spacing in the arguments and uses exact matches for the process names. It avoids temporary files. The code shown doesn't validate for zero arguments; that would have to be done beforehand. Or you could add a line args[0]='/collywobbles/' or something similar to provide a default - non-existent - command to search for.
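In more recent bash, the same loop can be written with the += append operator, which keeps each element whole without manual index bookkeeping (a sketch; the names are made up, and the array is printed instead of passed to fgrep so you can see the pairing):

```shell
# Build the -e/name pairs with +=; quoting keeps 'two words' as one element.
args=()
for name in 'two words' firefox; do
    args+=(-e "$name")
done
printf '<%s>\n' "${args[@]}"    # one line per array element
```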

Luminary・发光体
Answer 5 · 2019-03-31 13:10

To answer your question, what's going on is that $* expands to a parameter list, and so the second and later words look like files to grep(1).

To process them in sequence, you have to do something like:

for i in $*; do
    echo $i
done

Usually, "$@" (with the quotes) is used in place of $* in cases like this.

See man sh, and check out killall(1), pkill(1), and pgrep(1) as well.
