Chances are you have your website files spread across a bunch of different hosts, for the sake of data safety (never put all your eggs in one basket) and possibly some SEO advantage. If that is the case, you will occasionally need to move some files from one host to another. How does one do that?
Well, the straightforward answer is to download the files from the source host and then upload them to the destination host via FTP. That's not much of a time-waster with a small number of files, especially small ones. However, if it's a large chunk of data, say 4GB, or thousands of files, this would be quite a daunting job that may very well take the better part of your day, or even a few days.
The shortcut is to transfer those files directly from the original host to the other via SSH. That is, of course, provided both hosts have SSH enabled.
scp Command
Log into the destination host via SSH and try the following command:
scp -r remoteuser@remote.host.com:/home/remoteuser/dir-to-be-transferred/. /home/localuser/backup
Wherein remote.host.com is the address of the source host and remoteuser is the SSH (shell) user account that can read the remote directory to be transferred, namely /home/remoteuser/dir-to-be-transferred. The last argument is the local path that receives the incoming files or directory.
The dot at the end of dir-to-be-transferred copies the contents of the directory, including hidden files such as .htaccess, straight into backup. If you use a wildcard such as dir-to-be-transferred/* as the source instead, hidden files are NOT copied by default.
You can also transfer a specific file:
scp remoteuser@remote.host.com:/home/remoteuser/mybackup.tar.gz /home/localuser/backup
As a matter of fact, scp works in exactly the same way as the ordinary cp command, except that it can copy files to and from remote hosts. The “s” in “scp” stands for secure, because all the data transferred is encrypted over SSH.
It’s a great way to back up your valuable website data across multiple hosts that are physically far away from each other. With the help of cron jobs that run the backups automatically on a schedule, this is even better than some of the commercial backup services.
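For instance, a minimal crontab entry might pull the remote directory every night at 3 a.m. (the host name, paths and schedule below are just placeholders; for unattended runs you’d also need key-based SSH authentication, since there is no terminal to type a password into):

0 3 * * * scp -r remoteuser@remote.host.com:/home/remoteuser/dir-to-be-transferred/. /home/localuser/backup >> /home/localuser/backup.log 2>&1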
rsync Command
The rsync command is a preferable alternative to scp for synchronizing files across different hosts because it compares differences and works incrementally, thus saving bandwidth, especially with large backups. For example,
rsync -av --progress remoteuser@remote.host.com:/home/remoteuser/dir-to-be-transferred /home/localuser/backup
This copies the directory dir-to-be-transferred with all its content into backup, so that dir-to-be-transferred becomes a sub-directory of backup.
rsync -av --progress remoteuser@remote.host.com:/home/remoteuser/dir-to-be-transferred/. /home/localuser/backup
With an extra /. at the end of the source directory, only the content of dir-to-be-transferred is copied into backup, so everything inside dir-to-be-transferred becomes an immediate child of backup.
To make the transfer of a very large file resumable, use the -P switch, which is shorthand for --partial --progress:
rsync -avP remoteuser@remote.host.com:/home/remoteuser/large-file.ext /home/localuser/backup
So when the transfer is interrupted, run the same command again and rsync will automatically pick up where it left off.
To specify the SSH port, such as 8023, just add:
--rsh='ssh -p8023'
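Appended to the earlier command, it would look something like this (same illustrative host name and paths as above):

rsync -av --progress --rsh='ssh -p8023' remoteuser@remote.host.com:/home/remoteuser/dir-to-be-transferred /home/localuser/backup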
rsync automatically takes care of all hidden files, so there’s no need to add a dot at the end of the source directory.
To exclude a specific directory from being synchronized:
--exclude 'not/being/transferred'
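The excluded path is relative to the source directory. In context, again with the purely illustrative host name and paths from above, the full command might look like:

rsync -av --progress --exclude 'not/being/transferred' remoteuser@remote.host.com:/home/remoteuser/dir-to-be-transferred /home/localuser/backup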
Put a long-running rsync command in the background
When you press Ctrl+Z, the running process is stopped (suspended):
[1]+ Stopped rsync -ar --partial /home/webup/ /mnt/backup/
Now type bg and the process you just stopped resumes in the background:
[1]+ rsync -ar --partial /home/webup/ /mnt/backup/ &
Type jobs to confirm the process is running:
[1]+ Running rsync -ar --partial /home/webup/ /mnt/backup/ &
If you want to bring it back to the foreground, type fg 1, where 1 is the job number.
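Alternatively, you can start the command in the background from the outset, wrapped in nohup so it keeps running even if you log out before it finishes (same illustrative paths as above):

nohup rsync -ar --partial /home/webup/ /mnt/backup/ > /home/webup/rsync.log 2>&1 &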
I believe you need the -r flag if you want to copy an entire directory. Otherwise scp thinks it’s a file and gives you an error.
Thanks for the tip, just added it. 🙂