Question

bash: start multiple chained commands in background

I'm trying to run some commands in parallel, in the background, using bash. Here's what I'm trying to do:

foreach (...) {
  # this part is actually written in Perl
  # call the command sequence
  print `touch .file1.lock; cp bigfile1 /destination; rm .file1.lock;`;
}

The part between backticks (``) spawns a new shell and executes the commands in succession. The problem is that control returns to the original program only after the last command has been executed. I would like to execute the whole statement in the background (I'm not expecting any output/return values) and have the loop continue running.

The calling program (the one that has the loop) should not end until all the spawned shells finish.

I could use threads in Perl to spawn different threads which call different shells, but it seems like overkill...

Can I start a shell, give it a set of commands and tell it to go to the background?

Solution

I haven't tested this, but how about:

print `(touch .file1.lock; cp bigfile1 /destination; rm .file1.lock;) &`;

The parentheses mean execute in a subshell, but that shouldn't hurt.
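
If you fold this back into the Perl loop from the question, a minimal sketch could look like the following (the @files list is hypothetical). One caveat worth hedging: backticks read the child's stdout until EOF, and a backgrounded subshell that inherits stdout can keep that pipe open, so since no output is expected anyway, system() may be the safer choice:

use strict;
use warnings;

my @files = ('bigfile1', 'bigfile2');   # hypothetical list of files to copy

foreach my $f (@files) {
    # system() returns as soon as the shell has backgrounded the subshell;
    # the lock file, copy, and cleanup then run concurrently with the loop.
    system("(touch .$f.lock; cp $f /destination; rm .$f.lock) &");
}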

2008-10-02

Solution

Another way is to use the following syntax:

{ command1; command2; command3; } &
wait

Note that the & goes at the end of the command group, not after each command. The semicolon after the final command is necessary, as is the space after the opening brace: the braces are reserved words, so bash must see them as separate words. The wait at the end ensures that the parent process is not killed before the spawned child process (the command group) ends.
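
As a quick self-contained illustration (with sleep standing in for real work), the parent blocks at wait until both groups have finished:

#!/bin/bash
{ sleep 2; echo "group 1 done"; } &
{ sleep 1; echo "group 2 done"; } &
wait   # without this, the script could exit before either group prints
echo "all groups finished"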

You can also do fancy stuff like redirecting stderr and stdout to separate files:

{ command1; command2; command3; } 2> stderr.log 1> stdout.log &
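
For instance, a common variant (writing to a hypothetical all.log) merges both streams into one file by redirecting stdout first and then pointing stderr at it:

{ command1; command2; command3; } > all.log 2>&1 &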

Your example would look like:

forloop() {
    { touch .file1.lock; cp bigfile1 /destination; rm .file1.lock; } &
}
forloop   # each call launches one command group in the background
# ... do some other concurrent stuff
wait # wait for the children to end
2013-09-26