Run script on multiple files

Posted 2020-05-06 14:34

Question:

I would like to execute a script on a batch of files, all of which have the .xml extension.

Inspired by previous posts, I tried the following:

for file in *.xml; do ./script.sh <"$file"; done

And

for i in $(\ls -d *.xml)
do
    ./script.sh -i /path/*.xml -d /output_folder $i
done

Both of these run the script many times, but only on the first .xml file in the folder, so I end up with a dozen output files that are all file1.txt, file1.txt_1, file1.txt_2, etc. The loop also stops at a random point, sometimes after 3 iterations, sometimes after 28.

Any help would be appreciated. Thank you, TP

Answer 1:

for f in /input_path/*.xml; do
  ./interproscan.sh -mode convert -f raw -i "$f" -d /output_path
done
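
The quoted "$f" is what makes this work: the glob in the loop header expands to every matching file, and each iteration passes exactly one path to -i. To preview the commands before touching any data, a harmless dry run is to prefix the call with echo (same assumed paths as above):

for f in /input_path/*.xml; do
  echo ./interproscan.sh -mode convert -f raw -i "$f" -d /output_path
done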


Answer 2:

A simpler and safer method is this:

# read the NUL-delimited names that find -print0 emits
while IFS= read -r -d '' file; do
    ./interproscan.sh -mode convert -f raw -i "$file" -d /output_path
done < <(find . -iname "*.xml" -print0)

NOTE

1) Using -iname makes the match case-insensitive, so DATA.XML is found as well as data.xml.

2) Quoting the variable ("$file") keeps filenames that contain spaces in one piece; see the short demo below.



Answer 3:

Instead of looping through the files, you could use find's -exec option. It runs the command once for each file found, replacing {} with the file's path. Note that you must terminate the command with an escaped semicolon (\;).

Something like this could work for you:

find . -name "*.xml" -exec ./script.sh -i {} -d /output_folder \;

But you are limited in that {} is only guaranteed to be substituted once. Alternatively, you can do it with a loop:

# note: this word-splits on whitespace, so it breaks on filenames with spaces
xmlFiles=( $(find . -name "*.xml") )
for i in "${xmlFiles[@]}"
do
   ./script.sh -i "$i" -d /output_folder
done
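
If your command really does need the file name in more than one place, a common workaround (not shown in the answers above) is to hand the name to an inline shell via -exec; inside the quoted script it arrives as "$1", which can be quoted and reused freely. The echo here is only illustrative:

# {} is passed once, as $1 of the inline script; _ fills $0
find . -name "*.xml" -exec sh -c './script.sh -i "$1" -d /output_folder && echo "processed: $1"' _ {} \;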


Tags: bash loops