Inside a bash script, I set an environment variable to contain a string of 1 million characters. I do so like this:
export LG=XXXXXXX # ... 1 million X's
Immediately after this, I am able to echo it back without a problem, i.e.
echo $LG
However, any other unrelated commands that I attempt to run after this inside the script fail with the "Argument list too long" error. For example:
cat randomfile.txt
/bin/cat: Argument list too long
I have read through other posts that suggest using xargs to resolve such an issue, but I have not been successful: any command other than echo fails with "Argument list too long", even when the command does not reference $LG at all. (I do eventually want to use $LG, but the error occurs whether or not I use it after setting it.)
Any tips would be greatly appreciated, thanks!
Edit:
The overall problem I am trying to solve is something like this:
I have a text file that I need to keep as small as possible (a few MB). This text file contains a set of messages encapsulated inside a specific network protocol (header, length of message, the message itself). The message itself can be a string of 1 million or more characters. So to keep the original file small, instead of storing multiple copies of the large message inside it, I use a mapping: if I see the letter A in the message field, I use sed to find and replace A with 1 million X's. Like this:
cat file.txt | sed "s/A/$LG/g" # Replace A with 1 million X's
I will eventually be running this inside a (very slow) simulator, so I need this operation to complete in as few cycles as possible. In other words, a utility like awk that uses a loop with a trip count of 1 million to dynamically generate 1 million X's would be too slow. That is why I thought the environment variable solution would be best.
Command-line arguments and environment variables both come out of the same pool of space. Make the environment too long, and you no longer have room for command-line arguments -- and even xargs, which breaks command-line invocations into smaller groups to fit inside the pool where possible, can't operate when that pool is completely full. So: don't do that. Instead, for instance, store your data in a file, and export the path to that file in the environment.
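A minimal sketch of the file-based approach (the variable name LG_FILE and the path /tmp/lg.txt are illustrative choices, not anything mandated by the answer):

```shell
# Generate the 1,000,000 X's directly into a file -- no shell loop,
# and the data never passes through argv or the environment:
head -c 1000000 /dev/zero | tr '\0' 'X' > /tmp/lg.txt

# Export only the short path, not the megabyte of data:
export LG_FILE=/tmp/lg.txt

# Any consumer reads the file; argv and the environment stay small:
wc -c < "$LG_FILE"
```

The environment now carries a dozen bytes instead of a million, so every subsequent command starts normally.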
By the way -- the reason echo works is that it's built into your shell. It doesn't need to start an external process, so the limits on argument-list length and environment size at process startup time don't apply.
On the other hand, if you ran an external implementation instead (say, /bin/echo "$LG"), you'd see the problem again.
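You can see the size of the shared pool yourself. The kernel exposes the combined argv-plus-environment limit as ARG_MAX (the exact value varies by system):

```shell
# Total bytes the kernel allows for command-line arguments plus the
# environment of a newly started process. A 1,000,000-character
# exported variable eats most or all of this on many systems, which
# is why external commands then fail with "Argument list too long"
# (E2BIG) -- while shell builtins, which start no new process, still work.
getconf ARG_MAX
```

On typical Linux systems this prints a value such as 2097152 (2 MiB), but it can be much smaller elsewhere, which is why the script breaks.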
Given the explanation edited into the question as to what you're actually trying to accomplish, let me suggest an approach which requires neither environment space nor command-line space:
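One way to realize that idea -- offered here as a sketch, with the file names (big.txt, file.txt, out.txt) and the two-pass awk idiom as illustrative choices rather than the answer's literal code -- is to build the replacement string once in a file and let awk slurp it in, so the large data never touches argv or the environment:

```shell
# Build the 1,000,000-character replacement once, in a file.
# head -c + tr is a single fast pipeline, not a million-iteration loop:
head -c 1000000 /dev/zero | tr '\0' 'X' > big.txt

# Sample input; in the real use case this is the protocol file:
printf 'header A trailer\n' > file.txt

# First pass (NR==FNR) slurps big.txt into the awk variable "repl";
# second pass replaces every A in file.txt with it. Only the short
# file names appear on the command line. (Safe here because the
# replacement is all X's; a replacement containing "&" or "\" would
# need escaping before use in gsub.)
awk 'NR==FNR { repl = repl $0; next } { gsub(/A/, repl); print }' big.txt file.txt > out.txt
```

This keeps the mapping file small, avoids the per-character loop the question worries about, and sidesteps the ARG_MAX limit entirely, since neither sed nor the shell ever holds the million-character string as an argument.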