
Request handling in my uploading / trans-loading script


amirf3131


I want to make a script that transloads multiple files from any URL to my hosting account, i.e. a server-to-server transfer script.

 

The problem is that I have a list of more than 300K image URLs and zip files, and when I insert all of them the script stops, maybe because too many requests are submitted at once. Is there any way to throttle the request submission?

 

Here is my script:

<?php
// Check if the form has been submitted
if (isset($_POST['submit'])) {
    ini_set("max_execution_time", 0); // no time-outs!
    ignore_user_abort(true);          // continue downloading even after the user closes the browser
    // URLs, one on each line
    $URL = $_POST['url'];
    // Relative path for saving the downloaded files; default is "uploads".
    // Make sure that it is writable (chmodded correctly).
    $folder = isset($_POST['folder']) ? trim($_POST['folder']) : '';
    // Generate an error if the user left the folder blank.
    if ($folder === '') {
        die("Please specify a local folder name");
    }
    // Split all URLs into an array. (split() is deprecated and was
    // removed in PHP 7; explode() does the same job here.)
    $urls = explode("\n", $URL);
    // Remove carriage returns (useful for Windows-based browsers)
    $urls = str_replace("\r", "", $urls);
    $mh = curl_multi_init();
    $conn = array();
    $fp = array();
    foreach ($urls as $i => $url) {
        if (trim($url) === '') {
            continue; // skip blank lines
        }
        $path = pathinfo($url);
        $g = $folder . "/" . $path["basename"];
        // If the file already exists locally, delete it so it always
        // contains the latest version.
        if (file_exists($g)) {
            unlink($g) or die("Unable to delete existing '$g'!");
        }
        // Update the user on what's going on
        echo "$i) Downloading: from <b>$url</b> to <a href=\"$g\"><b>$g</b></a><br />";
        if (!is_file($g)) {
            $conn[$i] = curl_init($url);
            $fp[$i] = fopen($g, "w");
            curl_setopt($conn[$i], CURLOPT_FILE, $fp[$i]);
            curl_setopt($conn[$i], CURLOPT_HEADER, 0);
            // curl_setopt($conn[$i], CURLOPT_CONNECTTIMEOUT, 1000);
            curl_multi_add_handle($mh, $conn[$i]);
        }
    }
    // Run all transfers until none are active
    do {
        curl_multi_exec($mh, $active);
        curl_multi_select($mh); // wait for activity instead of busy-looping
    } while ($active);
    foreach ($conn as $i => $c) {
        curl_multi_remove_handle($mh, $c);
        curl_close($c);
        fclose($fp[$i]);
    }
    curl_multi_close($mh);
} // task closed
?>
<br />
<br />
<fieldset>
<legend>Server to Server Upload Script</legend>
<form method="POST">
<label for="url">Insert file URLs, one per line: </label><br />
<textarea rows="15" cols="75" id="url" name="url"><?= isset($URL) ? htmlspecialchars($URL) : '' ?></textarea><br />
<label for="folder">Folder name: </label><input type="text" id="folder" name="folder" value="uploads" />
<input type="submit" name="submit" value="Start Uploading Files!" />
</form>
</fieldset>

Here is request-handling code I found online:

function rolling_curl($urls, $callback, $custom_options = null) {

    // make sure the rolling window isn't greater than the # of urls
    $rolling_window = 5;
    $rolling_window = (sizeof($urls) < $rolling_window) ? sizeof($urls) : $rolling_window;

    $master = curl_multi_init();

    // add additional curl options here
    $std_options = array(CURLOPT_RETURNTRANSFER => true,
        CURLOPT_FOLLOWLOCATION => true,
        CURLOPT_MAXREDIRS => 5);
    $options = ($custom_options) ? ($std_options + $custom_options) : $std_options;

    // start the first batch of requests
    for ($i = 0; $i < $rolling_window; $i++) {
        $ch = curl_init();
        $options[CURLOPT_URL] = $urls[$i];
        curl_setopt_array($ch, $options);
        curl_multi_add_handle($master, $ch);
    }

    do {
        while (($execrun = curl_multi_exec($master, $running)) == CURLM_CALL_MULTI_PERFORM);
        if ($execrun != CURLM_OK)
            break;
        // a request was just completed -- find out which one
        while ($done = curl_multi_info_read($master)) {
            $info = curl_getinfo($done['handle']);
            if ($info['http_code'] == 200) {
                $output = curl_multi_getcontent($done['handle']);

                // request successful. process output using the callback function.
                $callback($output);

                // start a new request, but only while URLs remain -- the
                // original snippet indexed past the end of $urls here.
                // (it's important to add it before removing the old one)
                if ($i < sizeof($urls)) {
                    $ch = curl_init();
                    $options[CURLOPT_URL] = $urls[$i++]; // increment i
                    curl_setopt_array($ch, $options);
                    curl_multi_add_handle($master, $ch);
                }
            } else {
                // request failed. add error handling.
            }
            // remove and close the curl handle that just completed
            // (the original leaked handles for non-200 responses)
            curl_multi_remove_handle($master, $done['handle']);
            curl_close($done['handle']);
        }
    } while ($running);

    curl_multi_close($master);
    return true;
}
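
For illustration, the function could be called like this. The URLs and the callback are placeholders of my own, and the anonymous function needs PHP 5.3 or newer:

// Hypothetical usage: fetch a few files and report their sizes.
$urls = array(
    "https://example.com/one.jpg",
    "https://example.com/two.jpg",
    "https://example.com/three.jpg",
);

rolling_curl($urls, function ($output) {
    echo "Fetched " . strlen($output) . " bytes<br />";
});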

Here is another request-handling snippet I found online:

<?php
function getHead($urls) {
    $results = array();
    // make sure the rolling window isn't greater than the # of urls
    $rolling_window = 5;
    $rolling_window = (sizeof($urls) < $rolling_window) ? sizeof($urls) : $rolling_window;
    $master = curl_multi_init();
    // add additional curl options here
    $options = array(
        CURLOPT_FOLLOWLOCATION => FALSE,
        CURLOPT_RETURNTRANSFER => TRUE,
        CURLOPT_NOBODY => TRUE,
    );
    // start the first batch of requests
    for ($i = 0; $i < $rolling_window; $i++) {
        $ch = curl_init();
        $options[CURLOPT_URL] = array_pop($urls);
        curl_setopt_array($ch, $options);
        curl_multi_add_handle($master, $ch);
    }
    do {
        while (($execrun = curl_multi_exec($master, $running)) == CURLM_CALL_MULTI_PERFORM);
        if ($execrun != CURLM_OK) {
            break;
        }
        // a request was just completed -- find out which one
        while ($done = curl_multi_info_read($master)) {
            $info = curl_getinfo($done['handle']);
            $results[$info['url']] = $info;
            $new_url = array_pop($urls);
            if (isset($new_url)) {
                $ch = curl_init();
                $options[CURLOPT_URL] = $new_url;
                curl_setopt_array($ch, $options);
                curl_multi_add_handle($master, $ch);
            }
            // remove the curl handle that just completed
            curl_multi_remove_handle($master, $done['handle']);
        }
    } while ($running);
    curl_multi_close($master);
    return $results;
}
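
And a quick illustration of what getHead() returns (placeholder URLs again):

// Hypothetical usage: HEAD-check a few URLs before downloading them.
$info = getHead(array(
    "https://example.com/a.zip",
    "https://example.com/b.zip",
));
foreach ($info as $url => $details) {
    echo $url . " => HTTP " . $details['http_code'] . "<br />";
}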

The snippets above are the request-throttling code I found online. Maybe one of them is helpful, but I don't know how to combine them with my script.

Is there any way to run an unlimited number of files, so I can sit back and relax while the files transfer? Is there any queue system for the requests?


First off, you definitely don't want a script which allows anybody to put arbitrary files into arbitrary locations on your server. What if somebody uploads malware and uses it to attack the server? What if your server is abused to host illegal material? You can't have that.

Secondly, you don't transfer 300,000 images in one go. Network operations can fail. PHP scripts can fail. If anything goes wrong, good luck figuring out which of those 300,000 images are missing or broken.

This needs to be done intelligently:

  • Implement an authentication mechanism (e.g. a strong password) so that only you can use the script.
  • Only store files in a special download folder; do not let the user specify arbitrary paths. You'll also need a mechanism to prevent duplicate names. If you take the basenames of 300,000 URLs, you're almost guaranteed to end up with collisions.
  • The initial request should only store the list of URLs on the server, ideally in a database. Then the server can split the work into realistic chunks and periodically download a set of images. If a particular download failed, the URL is simply kept on the list and tried again next time. If the script fails, it will be restarted next time. This is a lot more robust than just hoping everything will work on the first try. A minimal sketch of such a worker follows this list.
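
Here is a minimal sketch of that worker, assuming a hypothetical transfer_queue table with url and status columns. The table name, columns, and database credentials are placeholders, not anything from the script above:

<?php
// Cron-driven worker: run this every few minutes.
// Assumed table (hypothetical):
//   CREATE TABLE transfer_queue (
//     id INT AUTO_INCREMENT PRIMARY KEY,
//     url VARCHAR(2048) NOT NULL,
//     status ENUM('pending', 'done') NOT NULL DEFAULT 'pending'
//   );
$pdo = new PDO('mysql:host=localhost;dbname=transloader', 'user', 'password');
$downloadDir = __DIR__ . '/downloads'; // fixed folder, never user-supplied

// Grab one realistic chunk of pending URLs per run.
$rows = $pdo->query("SELECT id, url FROM transfer_queue WHERE status = 'pending' LIMIT 50");
foreach ($rows as $row) {
    // Hash the URL for a collision-free local name (plain basenames will collide).
    $ext  = pathinfo(parse_url($row['url'], PHP_URL_PATH), PATHINFO_EXTENSION);
    $file = $downloadDir . '/' . sha1($row['url']) . ($ext ? '.' . $ext : '');

    $fp = fopen($file, 'w');
    $ch = curl_init($row['url']);
    curl_setopt($ch, CURLOPT_FILE, $fp);
    curl_setopt($ch, CURLOPT_FOLLOWLOCATION, true);
    curl_setopt($ch, CURLOPT_TIMEOUT, 120);
    $ok = curl_exec($ch) && curl_getinfo($ch, CURLINFO_HTTP_CODE) == 200;
    curl_close($ch);
    fclose($fp);

    if ($ok) {
        // Mark as done; failed URLs stay 'pending' and are retried on the next run.
        $pdo->prepare("UPDATE transfer_queue SET status = 'done' WHERE id = ?")
            ->execute(array($row['id']));
    } else {
        unlink($file); // don't keep partial downloads around
    }
}

If the script dies halfway through, nothing is lost: the next cron run simply picks up whatever is still marked pending.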

Thanks for everything. Can anyone recommend somebody I could hire for this work?


I am available. The site won't let me PM you.

 

I want to hire you for this work or make a donation. Can you please give me your Skype ID, just to discuss more details? I left a comment on your galaxyinternet.us site.
