Bash command line and input limit

Posted 2018-12-31 19:33

Is there some sort of character limit imposed in bash (or other shells) for how long an input can be? If so, what is that character limit?

I.e. Is it possible to write a command in bash that is too long for the command line to execute? If there is not a required limit, is there a suggested limit?

3 Answers
人间绝色
#2 · 2018-12-31 19:41

The limit on the length of a command line is not imposed by the shell but by the operating system. It is usually somewhere between tens of kilobytes and a few megabytes, depending on the system. POSIX denotes this limit ARG_MAX, and on POSIX-conformant systems you can query it with

$ getconf ARG_MAX    # Get argument limit in bytes

For example, on Cygwin this is 32000, and on the various BSD and Linux systems I use it is anywhere from 131072 to 2621440.

If you need to process a list of files exceeding this limit, you might want to look at the xargs utility, which calls a program repeatedly with a subset of arguments not exceeding ARG_MAX.
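A minimal sketch of that pattern (the directory path and file pattern below are placeholders, not anything from the question):

# find emits the names NUL-separated; xargs batches them into
# argument lists that stay under the system's ARG_MAX.
find /some/dir -type f -name '*.log' -print0 | xargs -0 rm --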

To answer your specific question: yes, it is possible to attempt to run a command with too long an argument list, and the shell will fail with a message along the lines of "Argument list too long".

Note that the input to a program (as read on stdin or any other file descriptor) is not limited by ARG_MAX, only by the program's available resources. So if your shell script reads a string into a variable, you are not restricted by ARG_MAX.
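For example, a rough sketch on a Linux box: reading data through a pipe or stdin sidesteps the limit entirely, because no exec with a huge argument list ever happens.

# Build a ~10 MB string in a shell variable via a pipe; ARG_MAX is irrelevant here.
big=$(head -c 10000000 /dev/zero | tr '\0' 'x')
echo "${#big}"    # prints 10000000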

冷夜・残月
#3 · 2018-12-31 19:58

OK, denizens. I have accepted the command-line length limits as gospel for quite some time. So, what to do with one's assumptions? Naturally: check them.

I have a Fedora 22 machine at my disposal (meaning: Linux with bash 4). I created a directory containing 500,000 files, each with an 18-character name, so that the fully expanded command line is about 9,500,000 characters long. The files were created thus:

# Create 500,000 files named abigfilename000001 .. abigfilename500000
# (12 letters + 6 digits = 18 characters per name).
seq 1 500000 | while read -r digit; do
    touch "$(printf 'abigfilename%06d' "$digit")"
done
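A quick way to sanity-check those numbers (optional; printf here is the bash builtin, so, like the echo below, it is not subject to ARG_MAX):

$ ls | wc -l                # should report 500000
$ printf '%s ' * | wc -c    # expanded length: 18-char name plus a space each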

And we note:

$ getconf ARG_MAX
2097152

Note, however, that I can do this:

$ echo * > /dev/null

But this fails:

$ /bin/echo * > /dev/null
bash: /bin/echo: Argument list too long

I can run a for loop:

$ for f in *; do :; done

which works because the for loop is a shell construct and : is a builtin, so once again no external program is exec'd.
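To see why the two echo invocations above behave differently, you can ask bash what each name resolves to (exact paths vary by system; this is just an illustration):

$ type -a echo
echo is a shell builtin
echo is /bin/echo
$ type for
for is a shell keyword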

Careful reading of the documentation for ARG_MAX shows that it is the "maximum length of argument to the exec functions". This means that without a call to exec there is no ARG_MAX limitation, which explains why shell builtins (and shell keywords such as for) are not restricted by it.
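If GNU findutils is installed (a GNU-specific flag, so treat this as an aside), xargs can report the exec limits it will respect, including how much of that space the current environment already consumes:

$ xargs --show-limits < /dev/null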

And indeed, I can ls my directory if the argument list is 109,948 files long, or about 2,089,000 characters (give or take). Once I add one more file with an 18-character name, though, I get an "Argument list too long" error. So ARG_MAX is working as advertised: the exec fails when the argument list exceeds ARG_MAX characters, and it should be noted that the environment data counts against that limit as well.
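A rough way to estimate how much room is actually left for arguments once the environment is subtracted (approximate only; the kernel also charges per-argument pointer overhead):

$ echo $(( $(getconf ARG_MAX) - $(env | wc -c) ))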

高级女魔头
#4 · 2018-12-31 19:58

There is also a terminal input buffer limit of something like 1024 bytes (the exact size depends on the system), which matters when you feed a long line to the read builtin interactively: the read will simply appear to hang mid-paste. To solve this, use read's -e option.

http://linuxcommand.org/lc3_man_pages/readh.html

-e use Readline to obtain the line in an interactive shell

Change your read to read -e and the annoying input hang goes away.
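A minimal sketch (the prompt text and variable name are made up for illustration):

#!/usr/bin/env bash
# Without -e, pasting a very long line can stall once the terminal's
# canonical input buffer fills up; with -e, bash hands line editing
# to Readline, which copes with long pasted input.
read -e -p "Paste your data: " data
echo "Read ${#data} characters"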
