Run all shell scripts in folder

Published 2019-04-05 17:52

Question:

I have many .sh scripts in a single folder and would like to run them one after another. A single script can be executed as:

bash wget-some_long_number.sh -H

Assume my directory is /dat/dat1/files

How can I run bash wget-some_long_number.sh -H one after another?

I understand something along these lines should work:

for i in *.sh; do ...; done

Answer 1:

Try this:

for f in *.sh; do  # or wget-*.sh instead of *.sh
  bash "$f" -H   || break # if needed 
done
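A self-contained sketch of the same loop, runnable as-is; the temporary directory and the demo scripts `a.sh`/`b.sh` are made-up stand-ins for /dat/dat1/files, and `shopt -s nullglob` is an extra safeguard so the loop body never runs when no `.sh` files match:

```shell
#!/usr/bin/env bash
# Throwaway directory with two demo scripts (illustrative, not from the question).
dir=$(mktemp -d)
printf 'echo one\n' > "$dir/a.sh"
printf 'echo two\n' > "$dir/b.sh"

shopt -s nullglob            # with no matching .sh files, the loop simply does nothing
for f in "$dir"/*.sh; do     # glob expands in sorted (lexical) order
  bash "$f" || break         # stop at the first script that fails
done

rm -rf "$dir"
```

Note that the glob expands in lexical order, so naming scripts `01-...`, `02-...` is a simple way to control the sequence.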

If you want to run, e.g., x1.sh, x2.sh, ..., x10.sh:

for i in $(seq 1 10); do   # or simply: for i in {1..10}
  bash "x$i.sh" -H   || break # if needed
done


Answer 2:

There is a much simpler way: you can use the run-parts command, which executes every executable file in the folder (so make sure the scripts have the executable bit set):

run-parts /path/to/folder
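One caveat worth noting: the Debian (debianutils) version of run-parts only runs files whose names consist of letters, digits, underscores, and hyphens, so names like `wget-some_long_number.sh` are silently skipped because of the dot. A sketch, assuming the debianutils run-parts with its `--test`, `--regex`, and `--arg` options:

```shell
# Preview which files run-parts would execute, without running anything:
run-parts --test /dat/dat1/files

# Relax the default name filter so *.sh files are picked up,
# and forward the -H flag to each script:
run-parts --regex '\.sh$' --arg=-H /dat/dat1/files
```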


Answer 3:

I ran into this problem in a situation where I couldn't use loops and run-parts wouldn't work with cron.

Answer:

foo () {
    bash "$1" -H   # run one script, passing -H to the script as in the question
    #echo "$1"
    #cat "$1"
}
cd /dat/dat1/files   # change to the scripts directory
export -f foo        # export foo so parallel's subshells can see it
parallel foo ::: *.sh   # roughly equivalent to backgrounding each script with &

This uses GNU parallel, which executes every script in the directory concurrently, so the whole batch typically finishes much faster than running them sequentially. It also isn't limited to script execution: you can put any command in the function and it will work.
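If the scripts shouldn't all start at once, GNU parallel can cap concurrency with -j and stop the batch on the first failure with --halt; a sketch reusing the same foo wrapper from above:

```shell
cd /dat/dat1/files
export -f foo
# -j4 runs at most 4 scripts at a time (the default is one job per CPU core);
# --halt now,fail=1 kills the batch as soon as any script fails,
# roughly mirroring the `|| break` in the sequential loop.
parallel -j4 --halt now,fail=1 foo ::: *.sh
```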



Tags: linux bash shell