Bash command to test if a directory contains any files

Posted 2019-07-15 08:47

I have the following bash script:

if ls /Users/david/Desktop/empty > /dev/null
then
    echo 'yes -- files'
else
    echo 'no -- files'
fi

How would I modify the top line such that it evaluates true if there are one or more files in the /Users/david/Desktop/empty dir?

Tags: bash shell unix
3 answers
戒情不戒烟
Answered 2019-07-15 09:10

Robust pure Bash solutions:

For background on why a pure Bash solution with globbing is superior to using ls, see Charles Duffy's helpful answer, which also contains a find-based alternative that is much faster and less memory-intensive with large directories.[1]
Also consider anubhava's equally fast and memory-efficient stat-based answer, which, however, requires distinct syntax forms on Linux and BSD/macOS.

Updated to a simpler solution, gratefully adapted from this answer.

# EXCLUDING hidden files and folders - note the *quoted* use of glob '*'
if compgen -G '*' >/dev/null; then
  echo 'not empty'
else
  echo 'empty, but may have hidden files/dirs.'
fi
  • compgen -G is normally used for tab completion, but it is useful in this case as well:

• Note that compgen -G does its own globbing, so you must pass it the glob (filename pattern) in quotes for it to output all matches. In this particular case even an unquoted pattern would happen to work, but the difference is worth noting.

    • if nothing matches, compgen -G always produces no output (irrespective of the state of the nullglob option), and it indicates via its exit code whether at least 1 match was found, which is what the conditional takes advantage of (while suppressing any stdout output with >/dev/null).
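A minimal demonstration of the exit-status behavior described above, run against a scratch directory created with mktemp (the directory is created here purely for illustration):

```shell
#!/usr/bin/env bash
# Create a scratch directory to test against.
dir=$(mktemp -d)
cd "$dir" || exit 1

# No files yet: compgen -G '*' finds no match, prints nothing, exits nonzero.
compgen -G '*' >/dev/null && echo 'not empty' || echo 'empty'

# Add a file: now the quoted glob matches, and the exit code is 0.
touch file.txt
compgen -G '*' >/dev/null && echo 'not empty' || echo 'empty'
```

This prints 'empty' first and 'not empty' second, showing that only the exit code, not the suppressed stdout, drives the conditional.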

# INCLUDING hidden files and folders - note the *unquoted* use of glob *
if (shopt -s dotglob; compgen -G * >/dev/null); then
  echo 'not empty'
else
  echo 'completely empty'
fi
  • compgen -G never matches hidden items (irrespective of the state of the dotglob option), so a workaround is needed to find hidden items too:

    • (...) creates a subshell for the conditional; that is, the commands executed in the subshell don't affect the current shell's environment, which allows us to set the dotglob option in a localized way.

    • shopt -s dotglob causes * to match hidden items too (except for . and ..).

• compgen -G * with unquoted *, thanks to up-front expansion by the shell, is either passed at least one filename, whether hidden or not (additional filenames are ignored), or, if the directory is truly empty, the unexpanded pattern * itself (since nullglob is off). In the former case the exit code is 0 (signaling success and therefore a nonempty directory); in the latter, compgen globs the pattern itself, finds nothing, and exits with 1 (signaling a truly empty directory).
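The workaround can be seen in action with a scratch directory that contains only a hidden file (the directory is created here just for the demonstration):

```shell
#!/usr/bin/env bash
# Scratch directory containing only a hidden file.
dir=$(mktemp -d)
cd "$dir" || exit 1
touch .hidden

# The quoted glob '*' does not match dotfiles, so this branch reports no match.
compgen -G '*' >/dev/null && echo 'visible match' || echo 'no visible match'

# With dotglob set in a subshell, the unquoted * expands to .hidden before
# compgen runs, so compgen receives an existing name and exits 0.
(shopt -s dotglob; compgen -G * >/dev/null) && echo 'not empty' || echo 'completely empty'
```

This prints 'no visible match' followed by 'not empty', confirming that the hidden file is only detected via the dotglob subshell.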


[1] This answer originally falsely claimed to offer a Bash-only solution that is efficient with large directories, based on the following approach: (shopt -s nullglob dotglob; for f in "$dir"/*; do exit 0; done; exit 1). This is NOT more efficient, because, internally, Bash still collects all matches in an array first before entering the loop - in other words: for * is not evaluated lazily.

趁早两清
Answered 2019-07-15 09:26

This is covered in detail in BashFAQ #004. Notably, use of ls for this purpose is an antipattern and should be avoided.

shopt -s dotglob   # if including hidden files is desired
files=( "$dir"/* )
[[ -e $files || -L $files ]] && echo "Directory is not empty"

[[ -e $files ]] doesn't actually check if the entire array's contents exist; rather, it checks the first name returned -- which handles the case when no files match, wherein the glob expression itself is returned as the sole result.
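The check can be wrapped in a small reusable function; dir_nonempty is a hypothetical name chosen here for illustration, not something from BashFAQ #004:

```shell
#!/usr/bin/env bash
# Hypothetical wrapper around the array-glob check; the function name
# dir_nonempty is illustrative only.
dir_nonempty() {
  local dir=$1
  local files=( "$dir"/* )
  # ${files[0]} is the first glob result; -e succeeds only if it names an
  # existing file, and -L additionally catches a dangling symlink.
  [[ -e ${files[0]} || -L ${files[0]} ]]
}

full=$(mktemp -d); touch "$full/a"
empty=$(mktemp -d)

dir_nonempty "$full"  && echo 'full: not empty'
dir_nonempty "$empty" || echo 'empty: no files'
```

Returning the test's exit status directly lets callers use the function in if statements or && chains, just like the inline version.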


Notably:

  • This is far faster than invoking ls, which requires using fork() to spawn a subshell, execve() to replace that subshell with /bin/ls, the operating system's dynamic linker to load shared libraries used by the ls binary, etc, etc. [An exception to this is extremely large directories, of tens of thousands of files -- a case in which ls will also be slow; see the find-based solution below for those].
  • This is more correct than invoking ls: The list of files returned by globbing is guaranteed to exactly match the literal names of files, whereas ls can munge names with hidden characters. If the first entry is a valid filename, "${files[@]}" can be safely iterated over with assurance that each returned value is a real name, and there's no need to worry about filenames containing literal newlines inflating the count when the local ls implementation does not escape them.

That said, an alternative approach is to use find, if you have one with the -empty extension (available both from GNU find and from modern BSDs including Mac OS):

[[ $(find -H "$dir" -maxdepth 0 -type d -empty) ]] || echo "Directory is not empty"

...if any result is given, the directory is nonempty. While slower than globbing on directories which are not unusually large, this is faster than either ls or globbing for extremely large directories not present in the direntry cache, as it can return results without a full scan.
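Exercising the find-based test on two scratch directories shows both branches (the directories are created here solely for the demonstration):

```shell
#!/usr/bin/env bash
# Exercise the find -empty test on one empty and one nonempty directory.
empty=$(mktemp -d)
full=$(mktemp -d)
touch "$full/file"

# -empty causes find to print the directory itself only when it has no entries.
[[ $(find -H "$empty" -maxdepth 0 -type d -empty) ]] && echo 'empty detected'
[[ $(find -H "$full" -maxdepth 0 -type d -empty) ]] || echo 'full: not empty'
```

Both messages print: the empty directory produces output from find (a nonempty string, so the [[ ]] test succeeds), while the populated one produces none.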

Rolldiameter
Answered 2019-07-15 09:27

Here is a solution based on the stat command, which can return the number of hard links when run against a directory (or a link to a directory). The first two hard links are the . and .. entries, so subtracting 2 from this number gives the actual number of entries in the given directory (this includes symlinks as well).

So putting it all together:

(( ($(stat -Lc '%h' "$dir") - 2) > 0)) && echo 'not empty' || echo 'empty'

As per man stat, the options used are:

%h     number of hard links
-L     --dereference, follow links

EDIT: To make it BSD/OSX compatible use:

(( ($(stat -Lf '%l' "$dir") - 2) > 0)) && echo 'not empty' || echo 'empty'
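The two forms can be combined into one portable helper; dir_link_delta is a hypothetical name, and the fallback strategy (try GNU syntax, then BSD) is an assumption of this sketch. Note that on traditional Unix filesystems a directory's link count grows only when subdirectories are added, so it is worth verifying this approach against plain files on your own filesystem:

```shell
#!/usr/bin/env bash
# Hypothetical portability wrapper around the GNU and BSD stat forms above.
# Tries GNU coreutils syntax (-c) first, then falls back to BSD/macOS (-f).
dir_link_delta() {
  local dir=$1 links
  links=$(stat -Lc '%h' "$dir" 2>/dev/null) ||
    links=$(stat -Lf '%l' "$dir") || return 1
  echo $(( links - 2 ))
}

dir=$(mktemp -d)
mkdir "$dir/sub"
echo "link delta: $(dir_link_delta "$dir")"   # expect 1 on ext4-style filesystems
```

The 2>/dev/null on the first attempt silences the usage error a BSD stat emits when given the GNU flags, so only one of the two invocations ever produces output.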