I'm trying to do something like the following:
for file in `find . *.foo`
do
somecommand $file
done
But the command isn't working because $file is very odd. Because my directory tree has crappy file names (including spaces), I need to escape the find
command. But none of the obvious escapes seem to work:
-ls gives me the space-delimited filename fragments.
-fprint doesn't do any better.
I also tried:
for file in "`find . *.foo -ls`"; do echo $file; done
- but that gives all of the responses from find in one long line.
Any hints? I'm happy for any workaround, but am frustrated that I can't figure this out.
Thanks, Alex
(Hi Matt!)
It does get messy if you need to run a number of shell commands on each item, though.
Instead of relying on the shell to do that work, rely on find to do it:
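For instance (a sketch, reusing the question's somecommand and *.foo pattern):
find . -name '*.foo' -exec somecommand {} \;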
Then the file name will be properly escaped, and never interpreted by the shell.
You have plenty of answers that explain well how to do it; but for the sake of completeness I'll repeat and add to it:
xargs is only ever useful for interactive use (when you know all your filenames are plain - no spaces or quotes) or when used with the -0 option. Otherwise, it'll break everything.
find is a very useful tool; but using it to pipe filenames into xargs (even with -0) is rather convoluted, as find can do it all itself with either -exec command {} \; or -exec command {} + depending on what you want.
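Roughly like this (a sketch; /path and pattern are placeholders for your own search root and filename pattern):
# one filename per somecommand invocation:
find /path -name 'pattern' -exec somecommand {} \;
# as many filenames as fit on the command line per invocation:
find /path -name 'pattern' -exec somecommand {} +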
The former runs somecommand with one argument for each file recursively in /path that matches pattern.
The latter runs somecommand with as many arguments as fit on the command line at once for files recursively in /path that match pattern.
Which one to use depends on somecommand. If it can take multiple filename arguments (like rm, grep, etc.) then the latter option is faster (since you run somecommand far less often). If somecommand takes only one argument then you need the former solution. So look at somecommand's man page.
More on find: http://mywiki.wooledge.org/UsingFind
In bash, for is a statement that iterates over arguments. If you do something like this:
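(a sketch of the quoted form; bar is any variable holding your filenames as one string:)
for foo in "$bar"; do somecommand "$foo"; done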
you're giving for one argument to iterate over (note the quotes!). If you do something like this:
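(the same sketch without the quotes:)
for foo in $bar; do somecommand "$foo"; done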
you're asking bash to take the contents of bar and tear it apart wherever there are spaces, tabs or newlines (technically, whatever characters are in IFS) and use the pieces of that operation as arguments to for. That is NOT filenames. Assuming that tearing a long string that contains filenames apart wherever there is whitespace yields a pile of filenames is just wrong. As you have just noticed.
The answer is: Don't use for, it's obviously the wrong tool. The above find commands all assume that somecommand is an executable in PATH. If it's a bash statement, you'll need this construct instead (it iterates over find's output, like you tried, but safely):
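(a sketch of that construct, with /path and pattern as before:)
while IFS= read -r -d '' file; do
    somebashstatement "$file"
done < <(find /path -name 'pattern' -print0)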
This uses a while-read loop that reads parts of the string find outputs until it reaches a NULL byte (which is what -print0 uses to separate the filenames). Since NULL bytes can't be part of filenames (unlike spaces, tabs and newlines) this is a safe operation.
If you don't need somebashstatement to be part of your script (eg. it doesn't change the script environment by keeping a counter or setting a variable or some such) then you can still use find's -exec to run your bash statement:
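(sketched here in the two shapes described next:)
# one bash per file:
find /path -name 'pattern' -exec bash -c 'somebashstatement "$1"' -- {} \;
# one bash for many files:
find /path -name 'pattern' -exec bash -c 'for file; do somebashstatement "$file"; done' -- {} +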
Here, the -exec executes a bash command with three or more arguments: the bash statement, -- (bash will put this in $0; you can put anything you like here, really), and the filename(s) that find matched ({} \; or {} + respectively). The filename(s) end(s) up in $1 (and $2, $3, ... if there's more than one, of course).
The bash statement in the first find command here runs somebashstatement with the filename as argument.
The bash statement in the second find command here runs a for (!) loop that iterates over each positional parameter (that's what the reduced for syntax - for foo; do - does) and runs somebashstatement with the filename as argument. The difference here from the very first find statement I showed with -exec {} + is that we run only one bash process for lots of filenames, but still one somebashstatement for each of those filenames.
All this is also well explained in the UsingFind page linked above.
xargs is your friend. You will also want to investigate the -0 (zero) option with it.
find (with -print0) will help to produce the list. The Wikipedia page has some good examples.
Another useful reason to use xargs is that if you have many files (dozens or more), xargs will split them up into individual calls to whatever xargs is then called upon to run (in the first Wikipedia example, rm).
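For instance (a sketch, using the question's somecommand and *.foo pattern):
find . -name '*.foo' -print0 | xargs -0 somecommand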
)I had to do something similar some time ago, renaming files to allow them to live in Win32 environments:
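Roughly this sort of thing (a sketch only, not the original commands; it assumes the offending characters live in the file names rather than the directory names, and uses a space-to-underscore substitution as the example):
# run sed over the whole path that find prints, so no separate basename call is needed
find . -type f -name '* *' -print0 | while IFS= read -r -d '' f; do
    mv -- "$f" "$(printf '%s\n' "$f" | sed 's/ /_/g')"
done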
This is probably a little simplistic, doesn't avoid name collisions, and I'm sure it could be done better -- but this does remove the need to use basename on the find results (in my case) before performing my sed replacement.
I might ask, what are you doing to the found files, exactly?