Running jobs in the background

With Input/Output Redirection you can build your own analysis pipelines. However, these pipelines run in the foreground of your shell, tying it up until they finish. So how can you start an analysis, log out of your shell, and walk away?

Here's how:
cat sequences.fastq | python script.py | velvet
## This mock pipeline would pipe your fastq file into some python script and the output
## of the script into velvet. Unfortunately it runs in the foreground and
## you won't be able to log out until this process is complete.
cat sequences.fastq | python script.py | velvet &
## This does the same thing as the pipeline above but in the background, so
## your shell is free for other work while it runs. Be aware, though, that
## depending on your shell a backgrounded job may still be killed when your
## session hangs up; see the nohup sketch below if you really want to log
## out and walk away.
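
To make a backgrounded pipeline reliably survive logging out, the classic tool is nohup, which shields the job from the hangup signal sent when your session closes. A minimal sketch, reusing the mock pipeline above (the log file name pipeline.log is just an illustration):
nohup sh -c 'cat sequences.fastq | python script.py | velvet' > pipeline.log 2>&1 &
## nohup only protects the first command of a pipeline, so the whole
## pipeline is wrapped in sh -c to cover every stage. stdout and stderr
## both go to pipeline.log (an arbitrary name) so you can inspect progress
## after logging back in.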

More on job control...

You may have a script, script1, that creates a set of files that serve as the input for a second script, script2. You want to run script1 first and, once it finishes and reports that it exited successfully, run script2 on the files script1 generated. You can chain the two scripts (or any two programs) with && to accomplish this:
script1 foo.input output.txt && script2 output.txt finalout.txt &
## Run the mock script1 with foo.input as its input, producing output.txt.
## Once script1 finishes and reports a clean (zero) exit status, script2
## is started with output.txt as its input. Because && binds tighter than
## &, the whole chain runs in the background and you can come back later
## to pick up your results. Add some logging from the I/O Redirection page
## (see the sketch below) and voila!
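
As a rough illustration of that last comment, the logging could look like this; the log file names script1.log and script2.log are made up for the example:
script1 foo.input output.txt > script1.log 2>&1 && script2 output.txt finalout.txt > script2.log 2>&1 &
## Same chain as above, but each script's stdout and stderr are captured
## in its own log file, so you can diagnose any failures later.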
( script1 foo.input output1.txt ; script2 foo.input output2.txt ) &
## Here script2 will start regardless of whether or not script1 exited
## without error. Note the parentheses: with a bare semicolon only the
## last command would end up in the background, so the two scripts are
## grouped into a subshell that is backgrounded as a whole. Useful for
## batch jobs where you just want to get a lot of data crunching done and
## come back later to fix issues with individual datasets.
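
Once jobs are running in the background, the shell's built-in job control lets you check on them. A quick sketch (the job number %1 assumes the job in question was the first one you started from this shell):
jobs       ## list jobs started from this shell, with their job numbers
fg %1      ## bring job 1 back to the foreground (Ctrl-Z suspends it again)
bg %1      ## resume a suspended job in the background
kill %1    ## terminate job 1 if something went wrong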