brianlange Posted August 30, 2010

I am moving servers and am using scp to transfer the files. Is this the best approach, or is there something else I should be using when moving a lot of files from one server to another?
steviewdr Posted August 30, 2010

rsync would be better for transferring a large number of files from one server to another. For example, you can run rsync again after the initial copy, and it will check which files have been modified since the copy started and transfer only those to the new server. There are quite a few switches you can use with rsync. I typically run the following on the new server:

rsync -avz user@server:source destination

e.g.

rsync -avz user@oldserver:/home/user /home/

-steve
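A quick gloss on those switches, plus a sketch of the follow-up pass described above. The flags are standard rsync; the paths are placeholders:

# -a archive mode (recursive; preserves permissions, ownership, timestamps)
# -v verbose output, -z compress data in transit
rsync -avz user@oldserver:/home/user /home/

# a second run only re-sends files that changed since the first pass;
# --delete additionally removes files from the copy that no longer
# exist on the old server (use with care)
rsync -avz --delete user@oldserver:/home/user /home/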
trq Posted August 30, 2010

scp should be fine, but make sure you use compression to speed things up. e.g.:

tar czf - sourcedir/ | ssh foo@foo.com tar xzf - -C destdir
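If you would rather stay with plain scp than a tar pipe, scp has its own compression switch. A minimal equivalent sketch, with placeholder paths:

# -r copies directories recursively, -C enables ssh-level compression
scp -rC sourcedir/ foo@foo.com:destdir/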
brianlange Posted August 30, 2010 (Author)

Thanks. I am using scp and did compress the files. I'll give rsync a try as well.
Hypnos Posted September 7, 2010

If you're doing the same copy over and over again (like a backup), absolutely use rsync.
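For the recurring-backup case, a minimal sketch of scheduling that copy nightly with cron. The schedule, paths, and server name are placeholders:

# user crontab (crontab -e): mirror /home to the backup server at 02:30 nightly
30 2 * * * rsync -az --delete /home/ user@backupserver:/backups/home/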
markylove Posted September 8, 2010

I have about 6 TB of files I need to move from one server to another. I did my best to move it over FTP as below, but the connection dies a lot; after a certain amount of progress it disconnects before it even resumes moving files, I presume because it takes too long comparing files before actually transferring and then times out.

~/ncftp-3.2.3/bin/ncftpput -R -z -v -u "user" -p "password" upload.server.net /local/dir/ remote/dir/

I've used rsync: if the connection drops, it compares source and destination and syncs from where it left off (assuming a large number of small-to-medium files, not 2 x 3 TB). Alternatively, start Apache, make your file dir the document root, and do a recursive wget; that might work as well, you just need to tell it to ignore files that already exist locally.
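A minimal sketch of that recursive-wget approach, assuming the old server temporarily serves the directory at http://oldserver/files/ (a placeholder URL):

# -r recurse, -l inf removes the default 5-level depth limit,
# -nc (--no-clobber) skips files that already exist locally,
# -np don't ascend to the parent directory
wget -r -l inf -nc -np http://oldserver/files/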
steviewdr Posted September 12, 2010

mark: rsync has --inplace and --partial switches which can pick up transferring large files from where it left off.

-steve
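A sketch of combining those switches with a retry loop over a flaky connection. The flags are standard rsync; the paths are placeholders:

# --partial keeps partially transferred files so the next run can resume them;
# --inplace updates destination files in place rather than via a temp copy.
# Re-run rsync until it exits cleanly.
until rsync -avz --partial --inplace user@oldserver:/data/ /data/; do
    echo "transfer interrupted, retrying in 30s..." >&2
    sleep 30
done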