I'm new to Bash, and I'm trying to find files in a certain set of folders. I want to create a txt report for image files in each /check/ folder.
Here's what I've been working with:
# Find images
for f in */check/ ; do
find ./ -iname "*jpg*" -o -iname "*png*" > find_images.txt
echo "finished $f"
done
I can't figure out how to only look at subfolders named "check", and I also want to pass the variable so that I get separate text files named after the parent folders. Any suggestions?
You're close, but you're not using $f, which contains the folder's name:
# Find images
for f in */check/ ; do
    # Replace the forward slashes in $f so it can be used in the log file name
    # http://mywiki.wooledge.org/BashGuide/Parameters#Parameter_Expansion
    log_f="${f//\//_}"
    # Only search inside $f, saving results to find_images_[foldername].txt
    find "$f" -iname "*jpg*" -o -iname "*png*" > "find_images_${log_f}.txt"
    echo "finished $f"
done
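One thing to watch: if you later add another test such as -type f, group the two -iname patterns with escaped parentheses, because find's -o would otherwise bind -type f to the first pattern only. For example:
find "$f" -type f \( -iname "*jpg*" -o -iname "*png*" \) > "find_images_${log_f}.txt"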
Pipe the output of the find command into grep:
find . | grep check
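Note that this matches anything with "check" anywhere in its path, including ordinary files. If you only want the check directories themselves, you could anchor the pattern, e.g.:
find . -type d | grep '/check$'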
The find command supports searching for directories (folders), e.g.
find . -name "check" -type d
You could use these results to then look for the files you want. The variable $f will be the name of the folder, so use that in the inner find command. Then if you want separate output files each time through the loop, use a variable in the filename. The $f variable will have slashes in the content, so you probably don't want to use that in the name of your output file. In my example, I use a counter to make sure each output file has a unique name.
count=1
for f in $(find . -name "check" -type d) ; do
    find "$f" -iname "*jpg*" -o -iname "*png*" > "find_images_${count}.txt"
    count=$((count+1))
done
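If any of your folder names contain spaces, the unquoted command substitution in the for loop will split them into pieces. A safer variant of the same idea (a rough sketch) pipes find into a while read loop:
count=1
find . -name "check" -type d | while IFS= read -r f ; do
    find "$f" -iname "*jpg*" -o -iname "*png*" > "find_images_${count}.txt"
    count=$((count+1))
done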