I'm trying to run some commands in parallel, in the background, using bash. Here's what I'm trying to do:
    forloop {
        // this part is actually written in perl
        // call command sequence
        print `touch .file1.lock; cp bigfile1 /destination; rm .file1.lock;`;
    }
The part between backticks (``) spawns a new shell and executes the commands in succession. The thing is, control returns to the original program only after the last command has been executed. I would like to execute the whole statement in the background (I'm not expecting any output or return values) and I would like the loop to continue running.
The calling program (the one that has the loop) would not end until all the spawned shells finish.
I could use threads in perl to spawn different threads which call different shells, but it seems like overkill...
Can I start a shell, give it a set of commands and tell it to go to the background?
The facility in bash that you're looking for is called Compound Commands; see the man page for more info. There are others, but these are probably the 2 most common types. The first, the parens, will run a list of commands in series in a subshell, while the second, the curly braces, will run a list of commands in series in the current shell.
parens
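For example (date and sleep are just stand-in commands to make the serial execution visible):

    ( date; sleep 5; date )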
curly braces
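And the same list grouped in the current shell; note that the list must end with a semicolon and the braces need surrounding whitespace:

    { date; sleep 5; date; }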
Run the commands in a subshell:
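The snippet for this answer isn't reproduced above; applied to the question's command sequence, a backgrounded subshell would look roughly like this:

    ( touch .file1.lock; cp bigfile1 /destination; rm .file1.lock ) &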
Run the command by using an at job:
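Assuming the at/atd facility is installed, a minimal sketch using the question's commands could be:

    echo 'touch .file1.lock; cp bigfile1 /destination; rm .file1.lock' | at now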
The result will be sent to your account by mail.
Try to put commands in curly braces with &s, like this:
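The original example isn't shown above; one way to apply the idea to the question's sequence is to background the whole brace group with a single trailing &:

    { touch .file1.lock; cp bigfile1 /destination; rm .file1.lock; } &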
Unlike parentheses, the curly braces themselves do not create a sub-shell; the & is what executes the group of commands in the background.
HTH
I stumbled upon this thread here and decided to put together a code snippet to spawn chained statements as background jobs. I tested this on BASH for Linux, KSH for IBM AIX and Busybox's ASH for Android, so I think it's safe to say it works on any Bourne-like shell.
This code runs a number of background jobs up to a certain limit of concurrent jobs. You can use this, for example, to recompress a lot of gzipped files with xz without having a huge bunch of xz processes eat your entire memory and make your computer throw up: in this case, you use * as the for loop's list and the batch job would be gzip -cd "$X" | xz -9c > "${X%.gz}.xz".
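The snippet itself didn't survive here, but a minimal bash-oriented sketch of the same idea might look like the following (MAX_JOBS and the *.gz glob are illustrative choices, not taken from the original snippet, and the throttling via jobs -r is my own; the original, portable mechanism may well differ):

    #!/bin/bash
    MAX_JOBS=4                                  # illustrative concurrency limit

    for X in *.gz; do
        # throttle: wait while the number of running background jobs is at the limit
        while [ "$(jobs -r | wc -l)" -ge "$MAX_JOBS" ]; do
            sleep 1
        done
        gzip -cd "$X" | xz -9c > "${X%.gz}.xz" &    # the batch job, run in the background
    done
    wait                                        # let the remaining jobs finish before exiting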
GavinCattell got the closest (for bash, IMO), but as Mad_Ady pointed out, it would not handle the "lock" files. This should:
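The code block from this answer isn't reproduced above, so here is a minimal sketch of what the surrounding text describes (the file names are placeholders; /destination comes from the question):

    pids=""                                       # collect the PIDs of the copy jobs
    for f in bigfile1 bigfile2 bigfile3; do       # placeholder list of files to copy
        # background the whole lock/copy/unlock sequence as one job
        { touch ".$f.lock"; cp "$f" /destination; rm ".$f.lock"; } &
        pids="$pids $!"                           # remember the PID of the job just started
    done
    wait      # waits for all pending jobs; "wait $pids" would wait only for the copies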
If there are other jobs pending, the wait will wait for those, too. If you need to wait only for the copies, you can accumulate those PIDs and wait only for those. If not, you could delete the 3 lines with "pids", but keeping them is more general.
In addition, I added checking to avoid the copy altogether:
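The added check isn't shown above; a plausible (hypothetical) form, assuming the point is to skip files that are already present at the destination, would be to guard the body of the loop:

    # hypothetical guard: skip the copy entirely if the file already exists at the destination
    if [ ! -e "/destination/$f" ]; then
        { touch ".$f.lock"; cp "$f" /destination; rm ".$f.lock"; } &
        pids="$pids $!"
    fi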
Incidentally, it looks like you're copying new files to an FTP repository (or similar). If so, you could consider a copy/rename strategy instead of the lock files (but that's another topic).