inversesoft123 Posted September 17, 2010

Hello,

When we have a large amount of data, we always use scp to copy it from the remote server:

scp -P 100 user@192.168.2.3:/home/user/folder /home/user1/folder/

But when the DSL (Internet) connection on the local computer drops, the transfer fails; I have seen this many times, and the whole process becomes irritating. Is there an alternative way to use scp or rsync from PHP, so that we can run the transfer as a script? Something like this?

<?php
// I have shell access but this is not working for me...
$conn = ssh2_connect('ftp.server.com', 100);
ssh2_auth_password($conn, 'user', 'pass');
ssh2_scp_send($conn, '/local/filename', '/remote/filename', 0644);
?>

Another question: is there a way to run such a transfer with shell_exec(), and is that implementation secure?

Thanks in advance!
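For the interrupted-transfer problem itself, rsync's --partial flag keeps a half-sent file in place so a rerun resumes it instead of starting over, which is simpler than scripting scp. A minimal sketch of preparing such a call safely from PHP, reusing the host, port, and paths from the post as placeholders (the command is only echoed here, not executed):

```php
<?php
// Sketch: building a resumable rsync command for use with shell_exec().
// Host, port, and paths are the placeholders from the question above.
$port = 100;
$src  = 'user@192.168.2.3:/home/user/folder/';
$dest = '/home/user1/folder/';

// --partial keeps partially-transferred files so a rerun resumes rather
// than restarts; escapeshellarg() guards against shell injection when
// paths come from user input. Echoed here instead of executed.
$cmd = sprintf(
    'rsync -avz --partial -e %s %s %s',
    escapeshellarg('ssh -p ' . $port),
    escapeshellarg($src),
    escapeshellarg($dest)
);
echo $cmd, "\n";
// A real script would then run: shell_exec($cmd . ' 2>&1');
```

Quoting every argument with escapeshellarg() is what makes a shell_exec()-based approach reasonably safe; passing raw user input into the command string is not.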
roopurt18 Posted September 17, 2010

My experience with shuffling files to remote locations is to:

1) Split the file into smaller chunks; I typically use ~2 MB.
2) For each chunk:
   i) Send the chunk; on failure, sleep for a timeout and try again.
      a) After excessive failures, kill the process and send out a notice e-mail.

If the server they're sent to is the final destination, just reassemble the chunks on the remote server. If the remote server is only a holding area and the file is going to be downloaded elsewhere, leave it in pieces and repeat essentially the same process on the other end, downloading instead of uploading. Anything long-lived over a network connection has a chance to fail, so it's almost always best to divide and conquer.
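The chunk-and-retry steps above can be sketched in PHP. To keep the sketch runnable without a network, the "send" here is a local copy() standing in for ssh2_scp_send() or an scp call; the split, retry, and reassembly logic is the same either way:

```php
<?php
// Sketch of the chunk-and-retry scheme: split, send with retries, reassemble.
// copy() simulates the network send so this runs locally.
$chunkSize  = 2 * 1024 * 1024;   // ~2 MB chunks, as suggested above
$maxRetries = 3;

// Create a 5 MB test payload standing in for the real file.
$srcFile = tempnam(sys_get_temp_dir(), 'src');
file_put_contents($srcFile, random_bytes(5 * 1024 * 1024));

// 1) Split the file into chunks.
$chunks = [];
$in = fopen($srcFile, 'rb');
for ($i = 0; !feof($in); $i++) {
    $data = fread($in, $chunkSize);
    if ($data === '' || $data === false) break;
    $chunk = tempnam(sys_get_temp_dir(), "chunk$i");
    file_put_contents($chunk, $data);
    $chunks[] = $chunk;
}
fclose($in);

// 2) "Send" each chunk, sleeping and retrying on failure.
$remoteDir = sys_get_temp_dir() . '/remote';
@mkdir($remoteDir);
foreach ($chunks as $i => $chunk) {
    for ($try = 1; $try <= $maxRetries; $try++) {
        if (copy($chunk, "$remoteDir/part$i")) break;  // real code: scp/ssh2_scp_send here
        if ($try === $maxRetries) die("chunk $i failed after $maxRetries tries\n");
        sleep(1);                                      // back off before retrying
    }
}

// 3) Reassemble on the "remote" side and verify the checksum.
$out = fopen("$remoteDir/reassembled", 'wb');
foreach ($chunks as $i => $chunk) {
    fwrite($out, file_get_contents("$remoteDir/part$i"));
}
fclose($out);
echo md5_file($srcFile) === md5_file("$remoteDir/reassembled")
    ? "checksums match\n" : "corrupted\n";
```

Because each chunk is sent independently, a dropped DSL connection only costs one ~2 MB retry rather than the whole transfer.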