All,
I am running bash on Solaris 10.
I have the following shell script that loops over the CSV files in a directory.
The problem with this piece of code is that it still performs one loop iteration even when there are no CSV files in that directory, and then calls SQL*Loader.
SQL*Loader then produces a log file because there is no file to process, and this is starting to mess up my directory by filling it with log files.
```sh
for file in *.csv ; do
    echo "SQLLoader is reading : " $file
    sqlldr <User>/<Password>@<DBURL>:<PORT>/<SID> control=sqlloader.ctl log=$inbox/$file.log data=$inbox/$file
done
```
How do I stop it entering the loop when there are no CSV files in the `$inbox` directory?
Use `find` to search for files.

First off, using `nullglob` is the correct answer if it is available. However, a POSIX-compliant option also exists: when there are no matches, the pattern is treated as literal text, and you can catch this with a small hack:
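A sketch of that hack applied to the loop from the question (the `sqlldr` line is kept as a comment, with the question's placeholders left as given):

```sh
# POSIX sh: an unmatched glob is left as literal text, so test each name.
for file in *.csv; do
    # With no matches, $file is the literal string '*.csv', which is not
    # an existing file, so -f fails and we skip the iteration.
    [ -f "$file" ] || continue
    echo "SQLLoader is reading : $file"
    # sqlldr <User>/<Password>@<DBURL>:<PORT>/<SID> control=sqlloader.ctl \
    #     log=$inbox/$file.log data=$inbox/$file
done
```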
When there are no matches, `file` will be set to the literal string `*.csv`, which is not the name of a file, so `-f "$file"` will fail. Otherwise, `file` will be set in turn to the name of each file matching the pattern, and `-f "$file"` will succeed every time. Note that this works even if there is a file named `*.csv`. The drawback is that you have to make a redundant test for each existing file.

Say `shopt -s nullglob` before your `for` loop. This is not the default, and saying
`for file in *.csv` when you don't have any matching files expands it to the literal string `*.csv`.

Quoting from the documentation:

> `nullglob`
> If set, bash allows patterns which match no files (see Pathname Expansion above) to expand to a null string, rather than themselves.
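Applied to the loop from the question, that looks like the sketch below (bash-specific, so the script must run under bash rather than plain `/bin/sh`; the `sqlldr` call is elided):

```sh
#!/usr/bin/env bash
# With nullglob set, an unmatched glob expands to nothing,
# so the loop body never runs in a directory with no CSV files.
shopt -s nullglob
for file in *.csv; do
    echo "SQLLoader is reading : $file"
    # sqlldr ... control=sqlloader.ctl log=$inbox/$file.log data=$inbox/$file
done
```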