Running Long Jobs

**Running jobs in the background**
With Input/Output Redirection you can write your own analysis pipelines. However, these pipelines run in the foreground in the shell. So how can you start an analysis, log out of your shell, and walk away?

Here's how:
```bash
cat sequences.fastq | python script.py | velvet
cat sequences.fastq | python script.py | velvet &
```
 * The first command is a mock pipeline that pipes your fastq file into some Python script and the script's output into velvet. Unfortunately it runs in the foreground, so you won't be able to log out until the process is complete.
 * The second command, ending in `&`, does the same thing but in the background. Now you can log out, walk away, and pick your results up later.
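One caveat worth knowing: in many shells, a backgrounded job still receives a hangup signal (SIGHUP) when you log out and may be killed. A common safeguard is to wrap the pipeline in `nohup` (a minimal sketch, reusing the mock file names from the example above):

```bash
# nohup makes the command immune to SIGHUP, so it survives logout;
# stdout and stderr are redirected to a log file instead of the terminal
nohup sh -c 'cat sequences.fastq | python script.py | velvet' > pipeline.log 2>&1 &
```

Alternatively, some shells offer `disown` to detach an already-running background job.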

**More on job control...**
You may have a script1 that creates a bunch of files that will serve as the input for script2. You want to run script1 first and, once it is done and reports that it exited successfully, run script2 using the files generated by script1 as input. You can connect both scripts (or programs) using && to accomplish this task:
```bash
script1 foo.input output.txt && script2 output.txt finalout.txt &
script1 foo.input output1.txt ; script2 foo.input output2.txt &
```
 * The first line runs the fake script1 with foo.input as its input, producing output.txt as its output. Once script1 finishes and reports that it exited cleanly, script2 is started with output.txt as its input. All of this runs in the background, and you can come back later to pick up your results. Add some logging from the I/O Redirection page and voila!
 * In the second line, script2 will start regardless of whether or not script1 exited without error, because `;` simply separates the commands. This is useful for batch jobs where you just want to get a lot of data crunching done and come back later to fix issues with individual datasets.
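The difference between && and ; is easy to see with the shell built-ins `true` and `false`, which stand in here for a script that succeeds or fails:

```bash
# && runs the second command only if the first exits with status 0
true && echo "runs because the previous command succeeded"
false && echo "skipped because the previous command failed"

# ; runs the second command no matter what the first returned
false ; echo "runs regardless of the previous exit status"
```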