Everything posted by biocyberman

  1. Hello. With SSH access to my new web space I managed to do the transfer with the wget command at very high speed. I also got advice on how to use Perl for what I want (http://perlguru.com/gforum.cgi?post=29170). I still don't know why the same shell command works when run from Perl but not from PHP's exec(), though. Maybe it is a permission setting.
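     A quick way to narrow down why a command runs under Perl but not under PHP's exec() is to capture the exit status and combined output instead of discarding them. This is a minimal diagnostic sketch, not from the original thread; the `which tar` command is just an illustration — substitute whatever command failed.

     ```php
     <?php
     // Sketch: run a shell command via exec() and inspect why it fails.
     // Redirecting 2>&1 folds stderr into the captured output.
     $cmd = 'which tar 2>&1';
     exec($cmd, $output, $status);

     echo "exit status: $status\n";       // non-zero means the shell reported failure
     echo implode("\n", $output) . "\n";  // captured stdout + stderr

     // Common culprits when the same command works in Perl but not in PHP:
     // - PHP runs as the web server user, which may lack read/execute permission
     // - the web server's environment has a minimal PATH, so use absolute binary paths
     // - safe_mode or disable_functions in php.ini may block exec() entirely
     ```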
  2. Thanks thorpe. I tweaked the paths but it didn't work. The ftp extension would be a good thing to try; I will come back with any updates.
  3. @matstuff I would suggest one of the following: 1. Ask the university administrator for permission to install web server software (e.g. Apache). 2. Buy web space from a hosting provider (about 100 USD/year can solve the problem perfectly). 3. Search for a way to run Apache without installing it (I tried googling but nothing good showed up).
  4. Thanks all for the replies. I tried the following code; the browser (Firefox) either ran forever or popped up a save-file dialog asking whether I wanted to save the "backup.php" file.

     <?php
     // Script name: backup.php
     ini_set("max_execution_time", "300");
     echo exec('which tar'); // just to make sure of the correct tar path
     exec("/bin/tar czvf ./backup.tar.gz home/username/public_html/src_dir"); // I know src_dir exists
     echo "<br /> Backup finished";
     ?>

     I don't know what is wrong.
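     One likely problem in the script above is the source path: "home/username/public_html/src_dir" has no leading slash, so tar resolves it relative to the web server's working directory and finds nothing. The sketch below is a hedged rework, not the thread's accepted fix: make_backup() is a helper name invented here, the paths in the commented-out call are placeholders from the original post, and escapeshellarg()/the tar -C flag are used so the archive stores relative paths and the command survives odd characters.

     ```php
     <?php
     // Hedged sketch: make_backup() is a hypothetical helper, not from the thread.
     function make_backup($src, $dest)
     {
         // -C changes into the parent directory so the archive stores relative paths
         $cmd = sprintf('tar czf %s -C %s %s 2>&1',
             escapeshellarg($dest),
             escapeshellarg(dirname($src)),
             escapeshellarg(basename($src)));
         exec($cmd, $output, $status);
         if ($status !== 0) {
             // surface tar's error output instead of failing silently
             echo "tar failed (exit $status): " . implode("\n", $output) . "\n";
             return false;
         }
         return true;
     }

     ini_set('max_execution_time', '300');
     // The original path lacked the leading slash; placeholder usage (adjust for your host):
     // make_backup('/home/username/public_html/src_dir', '/home/username/backup.tar.gz');
     ```

     Writing the archive outside public_html (as in the placeholder destination) also keeps it from being served to anyone who guesses the URL.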
  5. No, I don't have SSH access. That's the problem. Thanks for your reply anyway.
  6. Hi PHREEK, thanks for your reply. The total size is about 2 GB. My FTP connection is slow, and my computer runs Windows XP, so when I copy the files to my computer before uploading, all the file attributes are lost. That's why I want to compress before downloading. I may use this method many times in the future, and for the sake of learning I want to understand it.
  7. Hello there! I have been searching for many hours and have not found anything applicable, so I would like to ask for some help. I have a hosting space with thousands of small files in multiple directories, and I want to migrate them to a new host. Obviously, compressing all those files and directories before downloading and uploading them to the new server would save a lot of time and trouble (e.g. preserving file attributes). Unfortunately I don't have SSH access, so I can't use shell commands to compress the files before downloading, and the old hosting provider won't help. I tried a Perl script and several backup classes from phpclasses, but they didn't work. The old server is FreeBSD with PHP 4.4.1. Could you show me how to do what I describe above?
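     For reference, on hosts where shell access is blocked but PHP itself is modern, the archive can be built entirely in PHP with the Phar extension. This is a sketch under that assumption: PharData requires PHP 5.3+, so it would not run on the poster's PHP 4.4.1 server, and archive_dir() is an invented helper name.

     ```php
     <?php
     // Sketch for PHP 5.3+ with the Phar extension (NOT PHP 4.4.1):
     // builds dir.tar.gz entirely in PHP, no shell access needed.
     function archive_dir($srcDir, $destTarGz)
     {
         // PharData wants the plain .tar name first; .gz is appended by compress()
         $tarPath = preg_replace('/\.gz$/', '', $destTarGz);
         $phar = new PharData($tarPath);
         $phar->buildFromDirectory($srcDir); // recursively adds every file under $srcDir
         $phar->compress(Phar::GZ);          // writes $tarPath.gz alongside
         unset($phar);
         @unlink($tarPath);                  // drop the uncompressed intermediate
     }
     ```

     Unlike executable .phar archives, PharData tar/zip creation works even when php.ini sets phar.readonly, which is why this route usually survives shared-hosting restrictions.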