
rckehoe

Members
  • Posts: 34
  • Joined
  • Last visited: Never

Profile Information

  • Gender: Not Telling

rckehoe's Achievements

Member (2/5)

Reputation: 0

  1. I appreciate the suggestion, but I am already using set_time_limit(0); and ini_set('memory_limit', '-1'); I should have mentioned that. My script works fine for long stretches and downloads large files; the problem is that I need to download even larger files, hopefully in the 1 GB range at least. I tried your code to fetch a portion of the file and it did not work; I still got the same error. Any thoughts? (A cURL-to-file sketch for this follows the list.)
  2. I have a script that transfers one file from a remote server to my local server and saves it. The files can be quite large. I have successfully backed up around 200 MB, but anything over that seems to fail with a strange error. I was hoping for a little guidance; here is my code and the warning I get (a chunked-read alternative follows this list):

     PHP Code:

     function remote_capture($tmp_url, $filename)
     {
         $r_handle = fopen($tmp_url, "rb");
         $d_handle = fopen($filename, 'w');
         if ($r_handle && $d_handle) {
             while (($buffer = fgets($r_handle)) !== false) {
                 fputs($d_handle, $buffer);
             }
             fclose($r_handle);
             fclose($d_handle);
             return true;
         } else {
             return false;
         }
     }

     Warning Message:

     Warning: file_get_contents(URL_OF_MY_SCRIPT_HAS_BEEN_REMOTED) [function.file-get-contents]: failed to open stream: HTTP request failed! in /path/on/my/local/server/to/script on line 298
  3. I appreciate the suggestion; however, the reason I developed a PHP script in the first place is that I don't have access to the shell...
  4. I don't really have an issue with my script; I simply want to know whether it is bad practice to create a script that recurses through your root directory and zips everything up into one large zip file. I am building a backup tool for my site, and my site is rather large. Is this considered bad practice? (A recursive ZipArchive sketch follows this list.)
  5. I realize what a 404 means. The question is, why do I get it after 30 minutes of processing? I am not really using a browser; I am running the script from the command line with wget, and I have also tried cURL. Both return a 404 error. The odd thing is that the page is found just fine; the script simply stops executing and then gives me that error. Very strange. There is no particular reason I am using PHP to do this, I just wanted to. These scripts should technically behave the same way anything else does; I don't see why they cannot.
  6. I have two scripts that take a while to process. The first takes a few minutes, and the second takes over an hour. The first script completes successfully, but the second seems to just stop most of the way through. I have this set in both files: set_time_limit(0); ini_set('memory_limit', -1); I need to disable all time limits, and I cannot work out why the second script is failing. It is not the programming, I know that for sure; the script just stops executing, and I think it is timing out. But after setting those two options, I don't see why it would time out. Also, for reference, I am not using any database; I am simply downloading files from an FTP server to my server. Any help would be appreciated; I have been searching for an answer for the last few days. Rob
  7. I just have a question, not so much a problem. If I am using PHP with FTP support compiled in, and I use it to download a large website with images, big files, etc., will this eventually time out because of PHP limits? If so, what if I use cron to execute it; does that work? I am trying to build a simple backup program for my website, and I want to initiate a backup every so often. I don't want to start building this if it is not going to work, so any advice or help is appreciated! (A cron-driven CLI sketch follows this list.) Thanks! Rob
  8. Well sock me sideways, it actually worked!!! I cannot tell you how much you have helped me out! THANK YOU SO MUCH!
  9. I am not sure if this will help or not, but you may want to try something like this: WHERE school.region LIKE \"$region\" AND course_type.type LIKE \"$coursetype\" ";
  10. I am going to lose my mind; I am running into an issue that has me stumped. This is supposed to be a very easy thing to do, but for some reason I cannot figure it out. Any help is appreciated. I have a small table in my database called "vendors" with only about 2-3 records. Each record contains a latitude and longitude, in fields called "lat" and "lon". I have a very simple MySQL query like this: SELECT * FROM `vendors` WHERE (`lon` BETWEEN "-76.3066" AND "-133.5176") One specific record in the database has a 'lon' of -104.884082. In theory the query should be pulling this record, but for some reason no results are returned. Have I gone off the deep end or something? I have tried everything I can possibly think of with no luck! Someone please help me! (A corrected query sketch follows this list.)
  11. It does NOT call mysql_query 10 times; it just gets the value that you passed to it.
  12. I have a script that works just fine; it executes the PHP file I want it to. I just have a simple question that I cannot find an answer for anywhere: how do I access the variables that I pass through the exec() function? My code is below. When the fork_blast.php page is executed, how do I get the variable there? I want to pass a user ID to this page because it is going to perform a lengthy task. How do I access the $UserID? (An $argv sketch follows this list.) $PATH = $_SERVER['DOCUMENT_ROOT']; exec("/usr/bin/php ".$PATH."/fork_blast.php \"".$UserID."\" >/dev/null &");
  13. cURL cannot resize an image, can it? The whole point of what I am doing is to be able to resize the image and then save it on the server, so I don't have to download the image, resize it, re-save it back to the server and delete the old one. I just think that is way too many steps and will take longer to execute.
  14. I have just a question: can the GD image library get an image from a given URL, resize it, and then save it to my local server? The main question is whether it supports passing a URL, or whether the image has to be local. (A GD resize-from-URL sketch follows this list.)
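
For post 1, one way to keep memory flat regardless of file size is to let cURL stream the response body straight to a file handle instead of returning it to PHP. This is a minimal sketch, not the thread's own solution; the URL, path and options shown are assumptions.

    <?php
    // Stream a large remote file directly to disk; PHP never holds the whole body in memory.
    // URL and local path are illustrative assumptions.
    $fp = fopen('/local/backup/bigfile.bin', 'wb');
    $ch = curl_init('http://example.com/bigfile.bin');
    curl_setopt($ch, CURLOPT_FILE, $fp);            // write the response body directly to $fp
    curl_setopt($ch, CURLOPT_FOLLOWLOCATION, true); // follow redirects if the host issues any
    curl_setopt($ch, CURLOPT_TIMEOUT, 0);           // no client-side time limit
    curl_exec($ch);
    curl_close($ch);
    fclose($fp);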
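For post 2, a chunked binary copy with fread()/fwrite() avoids relying on fgets(), which splits on newlines and can return very long "lines" for binary data. This is a sketch under the assumption that allow_url_fopen is enabled; the function name and chunk size are illustrative, not from the original post.

    <?php
    // Hypothetical chunked variant of the poster's remote_capture() helper.
    function remote_capture_chunked($tmp_url, $filename, $chunk_size = 1048576)
    {
        $r_handle = fopen($tmp_url, 'rb');
        $d_handle = fopen($filename, 'wb');
        if (!$r_handle || !$d_handle) {
            return false;
        }
        while (!feof($r_handle)) {
            // Read up to 1 MB at a time so memory use stays flat for files of any size.
            $buffer = fread($r_handle, $chunk_size);
            if ($buffer === false) {
                break;
            }
            fwrite($d_handle, $buffer);
        }
        fclose($r_handle);
        fclose($d_handle);
        return true;
    }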
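For post 4, a recursive zip of a directory tree can be done with ZipArchive and the SPL iterators. This is a minimal sketch only; the function name and paths are assumptions, and it says nothing about whether zipping the whole root in one archive is wise for a given host.

    <?php
    // Recursively add every file under $source_dir (no trailing slash assumed) to one zip.
    function zip_directory($source_dir, $zip_path)
    {
        $zip = new ZipArchive();
        if ($zip->open($zip_path, ZipArchive::CREATE | ZipArchive::OVERWRITE) !== true) {
            return false;
        }
        $iterator = new RecursiveIteratorIterator(
            new RecursiveDirectoryIterator($source_dir, FilesystemIterator::SKIP_DOTS)
        );
        foreach ($iterator as $file) {
            if ($file->isFile()) {
                // Store each file under a path relative to the source directory.
                $relative = substr($file->getPathname(), strlen($source_dir) + 1);
                $zip->addFile($file->getPathname(), $relative);
            }
        }
        return $zip->close();
    }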
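For post 7, running the backup from cron with the PHP CLI binary (for example a crontab line such as 0 2 * * * /usr/bin/php /path/to/backup.php, where the path and schedule are assumptions) sidesteps web-server timeouts, and the script can still lift PHP's own limits itself. A minimal sketch of the top of such a script, with illustrative FTP details:

    <?php
    // Hypothetical backup.php run from cron via the CLI binary.
    set_time_limit(0);               // remove PHP's execution time limit
    ini_set('memory_limit', '-1');   // remove PHP's memory limit
    ignore_user_abort(true);         // keep running even if the invoking process disconnects

    // Illustrative FTP download of one file; host, credentials and paths are assumptions.
    $conn = ftp_connect('ftp.example.com');
    if ($conn && ftp_login($conn, 'user', 'password')) {
        ftp_pasv($conn, true);
        ftp_get($conn, '/local/backup/site.tar.gz', '/remote/site.tar.gz', FTP_BINARY);
        ftp_close($conn);
    }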
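For post 10, BETWEEN expects the lower bound first, and quoting the longitudes makes MySQL compare them as strings rather than numbers. A sketch of a corrected query; the table and column names come from the post, the surrounding PHP is illustrative:

    <?php
    // -133.5176 is less than -76.3066, so it must be the first bound.
    // Leaving the values unquoted keeps the comparison numeric.
    $sql = "SELECT * FROM vendors WHERE lon BETWEEN -133.5176 AND -76.3066";
    $result = mysql_query($sql);   // mysql_* matches the era of the original posts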
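For post 12, arguments appended to the command line show up in $argv inside the script that the CLI binary runs. A minimal sketch of both sides; fork_blast.php is from the post, the example value and escaping are assumptions:

    <?php
    // Caller: pass the user ID as a command-line argument and run in the background.
    $PATH   = $_SERVER['DOCUMENT_ROOT'];
    $UserID = 42; // illustrative value
    exec('/usr/bin/php ' . $PATH . '/fork_blast.php ' . escapeshellarg($UserID) . ' >/dev/null &');

    // Inside fork_blast.php: $argv[0] is the script name, $argv[1] is the first argument.
    $UserID = isset($argv[1]) ? $argv[1] : null;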
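For post 14, the imagecreatefrom*() functions accept a URL when allow_url_fopen is on, so GD can fetch, resize and save in one pass. A minimal sketch assuming a JPEG source; the helper name, target width and quality are illustrative:

    <?php
    // Hypothetical helper: fetch a JPEG from a URL, resize it, save it locally.
    function resize_from_url($url, $dest, $new_width)
    {
        $src = imagecreatefromjpeg($url);          // accepts a URL if allow_url_fopen is on
        if ($src === false) {
            return false;
        }
        $width      = imagesx($src);
        $height     = imagesy($src);
        $new_height = (int) round($height * ($new_width / $width));

        $dst = imagecreatetruecolor($new_width, $new_height);
        imagecopyresampled($dst, $src, 0, 0, 0, 0, $new_width, $new_height, $width, $height);

        $ok = imagejpeg($dst, $dest, 85);          // save the resized copy on the local server
        imagedestroy($src);
        imagedestroy($dst);
        return $ok;
    }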