
How to compress multiple directories in webspace


biocyberman


Hello there!

 

I have been searching for many hours and haven't found anything applicable, so I would like to ask for some help. I have a hosting space with thousands of small files in multiple directories, and I want to migrate them to a new host. Obviously, compressing all those files and directories before downloading and re-uploading to the new server would save a lot of time and trouble (e.g. preserving file attributes). Unfortunately I don't have SSH access, so I can't use a shell command to compress the files before downloading, and the old hosting provider won't help. I tried a Perl script and several backup classes from PHPClasses, but they didn't work. The old server is FreeBSD with PHP 4.4.1. Could you show me how to do what I describe above?


Sorry, I don't know of any third-party PHP scripts that will do that... surely someone has made one, though.

 

What type of control panel is installed?

 

Really though, how much total disk space are we talking about for the entire website? I've been in the position many times of having to archive a site through FTP, and even in the > 1 GB range it goes relatively fast (under 20 minutes). So by the time you finish all this research, you could already have the site downloaded to your local machine.

 

PhREEEk


Hi PhREEEk,

Thanks for your reply. The total size is about 2 GB. My FTP connection is slow, and my computer runs Windows XP, so when I copy the files to my computer before re-uploading, the file attributes are all lost. That's why I want to compress before downloading. I may use this method many times in the future, and for the sake of learning I'd like to know how it's done.

 

 


Thanks, all, for the replies.

I tried the following code; the browser (Firefox) either kept running indefinitely or popped up a save-file dialogue asking whether I wanted to save "backup.php".

<?php
// Script name: backup.php
ini_set("max_execution_time", "300");
echo exec('which tar');  // just to confirm the correct tar path
exec("/bin/tar czvf ./backup.tar.gz home/username/public_html/src_dir");
// I know src_dir exists
echo "<br /> Backup finished";
?>

I don't know what is wrong.
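
One way to narrow it down is to capture exec()'s output and exit status instead of discarding them. A sketch, using the same command and placeholder paths as above, with stderr redirected so tar's own error messages become visible:

<?php
// backup-debug.php: same tar command as above, but capturing
// exec()'s output and exit status so failures show up in the browser
ini_set("max_execution_time", "300");

$output = array();
$status = 0;
// 2>&1 folds stderr into the captured output
exec("/bin/tar czvf ./backup.tar.gz home/username/public_html/src_dir 2>&1", $output, $status);

echo "exit status: " . $status . "<br />\n";
echo "<pre>" . htmlspecialchars(implode("\n", $output)) . "</pre>";
?>

A non-zero exit status, together with tar's messages, will usually point straight at the problem.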


Your paths don't look right.

 

exec("/bin/tar czvf backup.tar.gz /home/username/public_html/src_dir");

 

I've just read your second reply, though. Compressing 2 GB worth of files really is going to be a nightmare. Have you looked at the FTP extension at all? It's probably best to simply FTP the files to your new server.
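
For what it's worth, a rough sketch of that approach, run as a PHP script on the old host, might look like the following. The host name, credentials, and paths are placeholders, and there is no error handling:

<?php
// Sketch: push a directory tree from this (old) host straight to the
// new server with PHP's FTP extension. Adjust host, login, and paths.
ini_set("max_execution_time", "0"); // long-running transfer

function ftp_put_tree($conn, $local_dir, $remote_dir)
{
    @ftp_mkdir($conn, $remote_dir); // directory may already exist
    $handle = opendir($local_dir);
    while (($entry = readdir($handle)) !== false) {
        if ($entry == "." || $entry == "..") {
            continue;
        }
        $local  = $local_dir . "/" . $entry;
        $remote = $remote_dir . "/" . $entry;
        if (is_dir($local)) {
            ftp_put_tree($conn, $local, $remote); // recurse into subdirectories
        } else {
            ftp_put($conn, $remote, $local, FTP_BINARY);
        }
    }
    closedir($handle);
}

$conn = ftp_connect("new-server.example.com");   // placeholder host
ftp_login($conn, "username", "password");        // placeholder credentials
ftp_pasv($conn, true);                           // passive mode is often required
ftp_put_tree($conn, "/home/username/public_html/src_dir", "/public_html/src_dir");
ftp_close($conn);
echo "Transfer finished";
?>

FTP_BINARY is used so nothing gets mangled in transit. With thousands of small files this will still take a while, but it runs server-to-server and avoids the slow local round trip.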


Hello,

With SSH access to my new web space I managed to do the transfer with the wget command at very high speed. I also got advice on how to use Perl to do what I want (http://perlguru.com/gforum.cgi?post=29170). I still don't know why the same shell command works when run from Perl but not from PHP's exec(); maybe it's a permission setting.
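
For the record, the wget side of it was essentially a one-liner run from a shell on the new server (host and credentials below are placeholders):

# mirror the old site over FTP, keeping the directory tree intact
wget --mirror ftp://username:password@old-host.example.com/public_html/

Because the transfer runs server-to-server, it bypasses the slow local connection entirely.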

 

 

