Steve and Casey's book "Practical Computing for Biologists" has a whole chapter dedicated to working on remote servers. Below are some useful/essential things.

SSH - secure shell | SCP - secure copy | WGET - downloading files from ftp and http directly from the shell/terminal | Dropbox - automatically synchronizing files across the web | Mounting a remote file system to your local machine

SSH - secure shell

First of all we need to get to the remote server. For this we'll generally use secure shell (ssh), which is installed on pretty much all Unix and Unix-like systems. SSH lets you establish an encrypted connection to a remote machine through which you can securely transmit data. Even though ssh offers password log-ins, in most cases you'll need to use a keyfile for authentication. Here's an explanation of public-private key authentication that covers the basics of these keys.
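If you still need to set up a key pair, a minimal sketch with the standard OpenSSH tools could look like the following (the key file name and user@server are placeholders; some providers, e.g. Amazon EC2, hand you a ready-made keyfile instead, in which case you can skip this step):
# generate a new key pair (pick a passphrase or leave it empty)
ssh-keygen -t ed25519 -f ~/.ssh/my_server_key
# install the public half on the remote machine so it accepts your key
ssh-copy-id -i ~/.ssh/my_server_key.pub user@server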

Once you set up your key you can use ssh as follows:
ssh -i /path/to/keyfile -v user@server
Here the -i flag specifies where your private key (identity file) can be found (there are more elegant solutions for letting ssh know where your keys are, such as the config sketch below, but the -i flag is the quickest and easiest way to make sure you use the right key for the right server). -v is a useful flag that turns on verbose mode. You don't have to set it, but it lets you troubleshoot your connection in case you cannot connect. The rest (user@server) is simply your authorized user name on the remote machine and the IP, public DNS, or other web address of the server you want to connect to.
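One of those more elegant solutions is a per-host entry in your ~/.ssh/config file; a minimal sketch (the alias myserver, the host name, and the key path are placeholders):
Host myserver
    HostName server.example.org
    User user
    IdentityFile ~/.ssh/my_server_key
With an entry like this, plain ssh myserver picks the right user and key automatically, and scp and sshfs honor the same alias.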

There are many more things you can do with ssh; the example above is just the most basic usage.
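One handy variation: append a command to the ssh call and it will run on the remote machine without opening an interactive shell, which is useful in scripts (the paths here are placeholders):
ssh -i /path/to/keyfile user@server 'ls -lh /some/remote/path'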

SCP - secure copy

scp allows you to copy data between your local computer and remote servers, in either direction. It uses the same authentication as ssh and the ssh protocol, so your data gets encrypted and safely transmitted through the network. Its operation is very similar to ssh.

Here are three examples of things that you may want to do frequently:

1) Copy a file from your local computer to some path on a remote computer.
scp -i /path/to/keyfile /some/path/to.file user@server:/path/
2) Copy a file from a remote server to your local computer.
scp -i /path/to/keyfile user@server:/some/path/to.file /path/
3) Copy a whole directory and its contents from your local computer to the remote machine.
scp -i /path/to/keyfile -r /some/path/with/files/in/it user@server:/path/
In the last example the -r flag stands for recursion, which means that scp (in the same manner as cp) will recurse into the directory and copy all subfolders and files it contains. To get a folder from the remote machine to your local computer, just use the syntax of example 2 and add the -r flag, as shown below.
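For completeness, that reverse direction (placeholder paths, same keyfile as above) would be:
scp -i /path/to/keyfile -r user@server:/some/path/with/files/in/it /local/path/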

Rsync has many advantages over scp for copying files across networks. In particular, rsync will pick up a file transfer where it left off should the network connection get disrupted; scp does not do this! Rsync's real power lies in synchronizing directories across networks for recurring backups. For now, read the rsync manual and check the web for your particular need. Beware that scp does not care much about whether or not there is a trailing / on a directory path, but rsync does: without the trailing / rsync creates the source directory itself inside the destination, whereas with the trailing / it copies only the directory's contents.
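A minimal rsync-over-ssh sketch, using placeholder paths and a common but by no means mandatory set of flags:
# -a preserves permissions and timestamps, -v is verbose, -z compresses in transit,
# --partial keeps partially transferred files so an interrupted copy can be resumed
rsync -avz --partial -e "ssh -i /path/to/keyfile" /local/dir/ user@server:/remote/dir/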

WGET - downloading files from ftp and http directly from the shell/terminal

Wget allows you to download files and folders from ftp and http sites from the terminal. This is particularly useful when you are working on a remote server and don't have access to any of your GUI tools.

You can simply download a file from an anonymous ftp site or http site by typing the following:
wget ftp://server/path/to.file
 
wget http://server/path/to.file
In case an ftp site requires you to log in with a user name and password:
wget ftp://user:password@server/path/to.file
If you're after a whole directory from an ftp site, you have two options. The first is the -r flag, which sets wget to use recursion, similar to scp and cp. The default recursion depth is only 5 levels, however. If the directory tree you're trying to download is deeper than that, you can use the -m (mirror) flag, or set the depth explicitly as shown after the examples below.
wget -r ftp://user:password@server/some/path/
 
wget -m ftp://user:password@server/some/path/
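If you'd rather stick with plain recursion but raise the depth limit, wget's -l (--level) option sets it explicitly; a sketch with an arbitrary depth of 10:
# recurse up to 10 levels deep instead of the default 5
wget -r -l 10 ftp://user:password@server/some/path/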

Some websites like SourceForge redirect your request (e.g., in a browser a new tab or window with your download pops up). In such cases wget may not be able to download your file. Curl, however, can if you tell it to follow redirects with the -L flag; the -O flag makes it save the file under its remote name. Note that curl, like wget, is a stand-alone program rather than part of your shell (Bash on most systems these days) and isn't installed everywhere by default, so you may have to install it if you need it.
curl -L -O http://server/path/to/file

Dropbox - automatically synchronizing files across the web

With Dropbox you can automatically synchronize files and directories over the web. The Dropbox daemon can be run on a server that doesn't have a GUI. This is great news: you can run Dropbox on a remote machine and dump files into the Dropbox folder there, and these files will be uploaded to the Dropbox website and wind up on your local machine if you have Dropbox running on it. The same works the other way around, making up- and downloading of files to and from a remote machine a piece of cake. The free Dropbox account gives you 2GB of storage.

Download and extract Dropbox on the remote machine (you can copy and paste the following code):
wget -O dropbox.tar.gz "http://www.dropbox.com/download/?plat=lnx.x86_64"
 
tar -xvzf dropbox.tar.gz
Now start Dropbox:
~/.dropbox-dist/dropboxd &
You should see the following message:
This client is not linked to any account...
Please visit https://www.dropbox.com/cli_link?host_id=XXXXX to link this machine.
 
Copy the web address in the message and paste it into a browser. This will direct you to Dropbox; sign in. Now your remote server is linked to your Dropbox account and a Dropbox folder should show up in the home directory of the remote machine. Files that you drop into the Dropbox folder on both local and remote machines will now be synchronized.
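One caveat: started with a plain &, the daemon may be killed when you log out of your ssh session. A common workaround (one option among several; screen or tmux work too) is to start it with nohup instead:
# keep the Dropbox daemon running after you log out
nohup ~/.dropbox-dist/dropboxd > /dev/null 2>&1 &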

Mounting a remote file system to your local machine

This is a bit more advanced and assumes you have sshfs installed. Once you get the hang of it, though, transferring files back and forth from a server is as easy as drag and drop in your file browser! You will also need to have the same user on your local machine as on the server for this to work smoothly. I use root when I work on Amazon's cloud, so I will mount the remote file system from the cloud as root on my local machine. Think about how you want to handle users to avoid file-permission issues (i.e., files and folders created by one user cannot be accessed by another unless you use root, change ownership of the files and folders, etc.).

To mount a remote folder to a local one, create a folder on your local machine (e.g., your laptop) first.
mkdir /server
Now you can mount the directory you want to work with to /server.
sshfs -o IdentityFile=/path/to/ssh/keyfile root@server:/mnt /server
Now the contents of /mnt on your server should appear in the folder called /server on your local machine. Everything that you move or copy into this directory on your local machine will get synchronized with the folder on the server. IMPORTANT: Use this only when you have a reasonably fast and reliable network connection! If you lose connectivity while you are accessing a file, it may get corrupted on both ends.
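If your connection is only mostly reliable, sshfs can try to re-establish it automatically; the same mount with the reconnect option added (paths and keyfile are placeholders) would look like this:
sshfs -o reconnect,IdentityFile=/path/to/ssh/keyfile root@server:/mnt /server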

To unmount, issue
fusermount -u /server
This will clean up and unmount the remote file system.

You can use your regular file manager to drag and drop files into the /server folder. If you use root as I do, make sure to open your file manager as root. On a Linux box it's as easy as opening a shell, changing to the root user, and issuing the command that opens your file manager. If you use Thunar, just type thunar &. For Ubuntu users, nautilus & should give you a file manager window.