I have an array of directories to exclude from the result of my find command, something like EXCLUDE=("foo" "bar").
I can run this from the interactive terminal like so:
find . -name 'hvr.yml' -not -path "foo/*" -not -path "bar/*"
And so I tried to build the arguments up like this:
getServersToCheck() {
    # Build command arguments to exclude the given dirs
    local exclude_command="-not -path"
    local excount=${#EXCLUDE[@]}
    for ((i = 0; i < excount; i++)); do
        EXCLUDE[i]="${exclude_command} ${EXCLUDE[i]}/*"
    done
    find . -name 'hvr.yml' "${EXCLUDE[@]}"
}
But this results in find throwing an unknown predicate error: '-not -path foo/*'
Is there a way of achieving this? When I echo the command it looks correct, but there must be some bash syntax rule that is causing it not to work the way I expect.
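For illustration, echo flattens everything with spaces, so the joined-up element looks right, whereas printing each array element on its own line shows that the whole predicate has become a single word (a quick sketch using printf, not part of my actual script):

EXCLUDE=("-not -path foo/*")                 # what one element looks like after my loop
echo find . -name 'hvr.yml' "${EXCLUDE[@]}"  # prints something that looks like the working command
printf '<%s>\n' "${EXCLUDE[@]}"              # prints <-not -path foo/*> - a single argument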
UPDATE:
I added \"
around the path to exclude as I read that globbing only occurs with quoted strings. xtrace shows this following:
find . -name hvr.yml -not -path '"foo/*"' -not -path '"bar/*"'
The single quotes are possibly the problem.
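If so, I think it is because the \" characters stored in the variable are literal data rather than shell quoting, along these lines (a minimal sketch):

pat="\"foo/*\""
printf '<%s>\n' "$pat"   # prints <"foo/*"> - the quote characters are part of the value,
                         # so find would look for paths that literally start with a " character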
Removing the \"
and running with xtrace shows that the globbing is being applied in the for loop, resulting in:
find . -name hvr.yml -not -path "foo/fileinfoo" "foo/somethingelseinfoo" -not -path "bar/*" "bar/fileinbar" "bar/otherfilesinbar"
And so find is complaining because the glob-expanded paths are being passed to it as extra arguments.
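As far as I can tell this is the usual quoted-versus-unquoted expansion behaviour (a minimal sketch, assuming foo/ contains some files):

pat="foo/*"
printf '<%s>\n' $pat     # unquoted: the shell globs it, one argument per match under foo/
printf '<%s>\n' "$pat"   # quoted: a single literal argument, foo/*, left for find to interpret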
Is there a way to expand the EXCLUDE array and add /* to the end of each element in the command?
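For reference, the shape of argument list I think needs to come out of this is one where every word, including each foo/* pattern, is its own array element, roughly like the following sketch:

args=()
for dir in "${EXCLUDE[@]}"; do
    # Each word is its own element, and the pattern stays quoted so the
    # shell never globs it; find receives the literal foo/* pattern.
    args+=( -not -path "$dir/*" )
done
find . -name 'hvr.yml' "${args[@]}"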
I found an alternative solution to what I was trying to achieve by using grep.
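A rough sketch of that kind of approach (the exact command may differ) is to filter find's output instead of building predicates:

# Join the EXCLUDE dirs into an alternation (foo|bar) and drop matching paths.
# This is only an illustration of the idea, not necessarily the exact command used.
pattern=$(IFS='|'; echo "${EXCLUDE[*]}")
find . -name 'hvr.yml' | grep -vE "^\./($pattern)/"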