I am writing a very simple bash script that tars a given directory, encrypts the output of that, and then splits the resultant file into multiple smaller files since the backup media don't support huge files.
I don't have a lot of experience with bash scripting. I believe I'm having issues with quoting my variables properly to allow spaces in the parameters. The script follows:
#! /bin/bash
# This script tars the given directory, encrypts it, and transfers
# it to the given directory (likely a USB key).
if [ $# -ne 2 ]
then
echo "Usage: `basename $0` DIRECTORY BACKUP_DIRECTORY"
exit 1
fi
DIRECTORY=$1
BACKUP_DIRECTORY=$2
BACKUP_FILE="$BACKUP_DIRECTORY/`date +%Y-%m-%dT%H-%M-%S.backup`"
TAR_CMD="tar cv $DIRECTORY"
SPLIT_CMD="split -b 1024m - \"$BACKUP_FILE\""
ENCRYPT_CMD='openssl des3 -salt'
echo "$TAR_CMD | $ENCRYPT_CMD | $SPLIT_CMD"
$TAR_CMD | $ENCRYPT_CMD | $SPLIT_CMD
say "Done backing up"
Running this command fails with:
split: "foo/2009-04-27T14-32-04.backup"aa: No such file or directory
I can fix it by removing the quotes around $BACKUP_FILE where I set $SPLIT_CMD. But if I have a space in the name of my backup directory, it doesn't work. Also, if I copy and paste the output from the echo command directly into the terminal, it works fine. Clearly there's something I don't understand about how Bash is escaping things.
I am not sure, but it might be worth running eval on the commands first.
This will let bash expand variables such as $TAR_CMD to their full breadth (just as the echo command does to the console, which you say works).
Bash will then read the line a second time with the variables expanded.
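Applied to the script above, the last pipeline line would become:

eval "$TAR_CMD | $ENCRYPT_CMD | $SPLIT_CMD"

so the quotes stored inside $SPLIT_CMD are honored when bash re-reads the expanded line.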
I just did a Google search and this page looks like it might do a decent job at explaining why that is needed. http://fvue.nl/wiki/Bash:_Why_use_eval_with_variable_expansion%3F
eval is not an acceptable practice if your directory names can be generated by untrusted sources. See BashFAQ #48 for more on why eval should not be used, and BashFAQ #50 for more on the root cause of this problem and its proper solutions, some of which are touched on below:
If you need to build up your commands over time, use arrays:
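For instance, a sketch reusing the commands and variable names from the question:

TAR_CMD=( tar cv "$DIRECTORY" )
ENCRYPT_CMD=( openssl des3 -salt )
SPLIT_CMD=( split -b 1024m - "$BACKUP_FILE" )

"${TAR_CMD[@]}" | "${ENCRYPT_CMD[@]}" | "${SPLIT_CMD[@]}"

Expanding each array as "${array[@]}" hands every element to the command as a separate word, so spaces inside $DIRECTORY or $BACKUP_FILE survive intact with no re-parsing step at all.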
Alternately, if this is just about defining your commands in one central place, use functions:
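Again as a sketch with the question's commands (the functions read $DIRECTORY and $BACKUP_FILE from the calling scope):

tar_cmd()     { tar cv "$DIRECTORY"; }
encrypt_cmd() { openssl des3 -salt; }
split_cmd()   { split -b 1024m - "$BACKUP_FILE"; }

tar_cmd | encrypt_cmd | split_cmd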
There is a point to putting only the commands and options in variables.
You can relocate the commands to another file you source, so you can reuse the same commands and options across many scripts. This is very handy when you have a lot of scripts and you want to control how they all use tools. So standard_tools would contain:
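As a sketch (the variable names here are illustrative), standard_tools might hold nothing but commands and their options:

# standard_tools
tar_create="tar cv"
openssl_encrypt="openssl des3 -salt"
split_1024="split -b 1024m"

and the backup script would source it, keeping the filename arguments quoted at the point of use:

. standard_tools
$tar_create "$DIRECTORY" | $openssl_encrypt | $split_1024 - "$BACKUP_FILE"

Because the variables contain only commands and flags, with no spaces that need preserving, letting them word-split is safe, while the quoted paths are never re-parsed.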
Quoting spaces inside variables such that the shell will re-interpret things properly is hard. It's this type of thing that prompts me to reach for a stronger language, whether that's perl or python or ruby or whatever (I choose perl, but that's not always for everyone): something that will let you bypass the shell's quoting altogether.
It's not that I've never managed to get it right with liberal doses of eval, but eval gives me the heebie-jeebies (it becomes a whole new headache when you want to take user input and eval it, though in this case you'd be evaling stuff that you wrote yourself), and it has given me headaches in debugging.
With perl, as my example, I'd be able to do something like:
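Something like this, as a sketch ($directory and $backup_file would come from ordinary argument handling earlier in the script):

my @tar_cmd     = ( qw(tar cv), $directory );
my @encrypt_cmd = ( qw(openssl des3 -salt) );
my @split_cmd   = ( qw(split -b 1024m -), $backup_file );

Each command is a list of words, so nothing is ever handed to a shell for re-parsing, and spaces in the paths are harmless.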
The hard part here is doing the pipes - but with a bit of IO::Pipe, fork, and reopening stdout and stderr, it's not bad. Some would say that's worse than quoting the shell properly, and I understand where they're coming from, but, for me, this is easier to read, maintain, and write. Heck, someone could take the hard work out of this and create an IO::Pipeline module and make the whole thing trivial ;-)
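For what it's worth, here is one way that plumbing could look - a sketch only, using perl's list-form pipe opens plus one fork rather than IO::Pipe itself, and the @tar_cmd, @encrypt_cmd and @split_cmd arrays from above:

# tar writes into a pipe we read from; list form, so no shell is involved
open my $tar_out, '-|', @tar_cmd or die "tar: $!";
# split reads from a pipe we write to
open my $split_in, '|-', @split_cmd or die "split: $!";

# openssl goes in the middle: fork, reopen stdin/stdout, exec
defined( my $pid = fork ) or die "fork: $!";
if ( $pid == 0 ) {
    open STDIN,  '<&', $tar_out  or die "dup stdin: $!";
    open STDOUT, '>&', $split_in or die "dup stdout: $!";
    exec @encrypt_cmd or die "exec openssl: $!";
}
close $tar_out;
close $split_in;   # waits for split to finish
waitpid $pid, 0;   # reap openssl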
Simply don't put whole commands in variables. You'll get into a lot of trouble trying to recover quoted arguments.
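In the script above, that means writing the pipeline out literally, with the quoting at the point of use:

tar cv "$DIRECTORY" | openssl des3 -salt | split -b 1024m - "$BACKUP_FILE"

The quotes are now real shell syntax, evaluated when the line is parsed, rather than literal characters stored inside a variable.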
Also: