tiganulmeu

New Members
  • Posts

    7
  • Joined

  • Last visited

    Never

Profile Information

  • Gender
    Not Telling

  1. I have a file like: $file_to_download = "http://www.mywebsite.com/file.txt"; however, the file's size changes: sometimes it's small, around 30 KB, and sometimes large, up to 1 GB. I don't know how to download the file only when it's small. I used cURL with a timeout, but it doesn't work. Is there something that can check whether the download takes more than 4-5 seconds, then disconnect and return false? Please help!
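One way to attempt this (a sketch, not the thread's confirmed answer): issue a HEAD request first and inspect Content-Length, then cap the real transfer with CURLOPT_TIMEOUT. The URL, size limit, and function name below are placeholders, and not every server reports Content-Length:

```php
<?php
// Sketch: check the remote size with a HEAD request before downloading.
// Assumes the server reports Content-Length; servers that omit it (or
// stream chunked responses) will report -1 and be skipped here.
function fetch_if_small($url, $max_bytes = 100 * 1024, $timeout = 5) {
    $ch = curl_init($url);
    curl_setopt($ch, CURLOPT_NOBODY, true);          // HEAD request: headers only
    curl_setopt($ch, CURLOPT_RETURNTRANSFER, true);
    curl_setopt($ch, CURLOPT_TIMEOUT, $timeout);     // give up after $timeout seconds
    curl_exec($ch);
    $size = curl_getinfo($ch, CURLINFO_CONTENT_LENGTH_DOWNLOAD);
    curl_close($ch);

    if ($size < 0 || $size > $max_bytes) {
        return false;                                // size unknown or too large: skip
    }

    $ch = curl_init($url);
    curl_setopt($ch, CURLOPT_RETURNTRANSFER, true);
    curl_setopt($ch, CURLOPT_TIMEOUT, $timeout);     // hard cap on the whole transfer
    $data = curl_exec($ch);
    curl_close($ch);
    return $data === false ? false : $data;
}
```

CURLOPT_TIMEOUT alone already answers the "disconnect after 4-5 seconds" part: when it fires, curl_exec() returns false instead of hanging.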
  2. I think your code, trecool999, is still time-consuming, since it still takes each line from a.txt and searches for it in b.txt. I found an array function that I think solves my problem. I've only tried it with small a.txt and b.txt files; please tell me whether it will be OK with large files too, before I implement it in my scripts:

     $a_file_array = explode("\n", file_get_contents("a.txt"));
     $b_file_array = explode("\n", file_get_contents("b.txt"));
     print_r(array_diff($a_file_array, $b_file_array));

     Thank you very much for your help.
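For very large arrays, a common alternative to array_diff is a hash lookup via array_flip plus isset, which is often measurably faster. A minimal sketch; the sample data is inlined here for illustration instead of reading a.txt and b.txt:

```php
<?php
// Sketch: keep the lines of a.txt that are absent from b.txt using a hash
// set. array_flip turns b's values into keys, so isset() is an O(1) test.
$a_lines = ["6541664", "1598455", "4215324", "1251254"];  // stands in for a.txt
$b_lines = ["1598455", "1251254"];                        // stands in for b.txt

$b_set = array_flip($b_lines);       // value => index; values become keys
$kept  = [];
foreach ($a_lines as $line) {
    if (!isset($b_set[$line])) {     // keep lines not present in b.txt
        $kept[] = $line;
    }
}
// $kept is ["6541664", "4215324"]
```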
  3. Hi guys. OK, I have two files, a.txt and b.txt, and both contain numbers like:

     6541664
     1598455
     4215324
     1251254

     and so on; by "\n" I mean the numbers are separated by new lines, but I'm sure you already know that. The problem is the file size: I want to remove from a.txt the lines that already exist in b.txt, but both files are very large, at least 10 MB each. I used a function like this:

     function remove_rez($key, $fisier) {
         $fc = file($fisier);
         $f = fopen($fisier, "w");
         foreach ($fc as $line) {
             if (!strstr($line, $key)) {
                 fputs($f, $line);
             }
         }
         fclose($f);
     }

     $b_file_contents = explode("\n", file_get_contents("b.txt"));
     $a_file = file_get_contents("a.txt");
     foreach (array_unique($b_file_contents) as $b_file_line) {
         if (strpos($a_file, $b_file_line) !== false) {
             remove_rez($b_file_line, "a.txt");
         }
     }

     But with large files and lots of repeated values it takes a veeery long time and even freezes my PC... A faster and better solution is greatly appreciated. Thanks in advance!
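The code above rewrites a.txt once per matching line. A single-pass version avoids that: load b.txt's lines into a hash set once, then stream a.txt line by line and keep only unseen lines, so a.txt is never held in memory whole. A self-contained sketch (the file names follow the post; sample contents are created here so it runs standalone):

```php
<?php
// Sketch: remove from a.txt every line that appears in b.txt, in one pass.
// b.txt is loaded once into a hash set; a.txt is streamed line by line.
file_put_contents("a.txt", "6541664\n1598455\n4215324\n1251254\n");
file_put_contents("b.txt", "1598455\n1251254\n");

// value => index map; isset() on it is an O(1) membership test
$b_set = array_flip(file("b.txt", FILE_IGNORE_NEW_LINES | FILE_SKIP_EMPTY_LINES));

$in  = fopen("a.txt", "r");
$out = fopen("a.txt.tmp", "w");
while (($line = fgets($in)) !== false) {
    if (!isset($b_set[rtrim($line, "\n")])) {   // keep lines missing from b.txt
        fwrite($out, $line);
    }
}
fclose($in);
fclose($out);
rename("a.txt.tmp", "a.txt");   // swap the filtered copy into place
```

This touches each file exactly once, so runtime grows linearly with file size rather than with (lines in a.txt) x (lines in b.txt).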
  4. I'm already using the explorer version to launch them in separate windows, but it's more complicated, and I need to "upgrade"... Maybe we can fix this with some cURL option or timer. This is the function I'm using:

     function curl($url, $cookie = '', $post = '') {
         $ch = curl_init();
         curl_setopt($ch, CURLOPT_URL, $url);
         curl_setopt($ch, CURLOPT_REFERER, $url);
         curl_setopt($ch, CURLOPT_RETURNTRANSFER, 1);
         curl_setopt($ch, CURLOPT_FOLLOWLOCATION, 1);
         if ($post !== '') {
             curl_setopt($ch, CURLOPT_POST, 1);
             curl_setopt($ch, CURLOPT_POSTFIELDS, $post);
         }
         if ($cookie !== '') {
             curl_setopt($ch, CURLOPT_COOKIEFILE, $cookie);
             curl_setopt($ch, CURLOPT_COOKIEJAR, $cookie);
         }
         curl_setopt($ch, CURLOPT_SSL_VERIFYHOST, FALSE);
         curl_setopt($ch, CURLOPT_SSL_VERIFYPEER, FALSE);
         $result = curl_exec($ch);
         curl_close($ch);
         if ($result == "") {
             return curl($url, $cookie, $post); // retry; careful: recurses forever if the request keeps failing
         } else {
             return $result;
         }
     }

     What do you think, can I somehow disconnect after a period of time?
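cURL can abort on its own: CURLOPT_CONNECTTIMEOUT bounds the connection phase and CURLOPT_TIMEOUT bounds the whole transfer, after which curl_exec() returns false. A minimal sketch; the URL points at a local port where nothing listens, so the request fails fast:

```php
<?php
// Sketch: let cURL disconnect by itself after a time limit.
// When a limit fires, curl_exec() returns false and curl_errno() is non-zero
// (e.g. CURLE_OPERATION_TIMEDOUT, value 28).
$ch = curl_init("http://127.0.0.1:1/");          // placeholder: nothing listens here
curl_setopt($ch, CURLOPT_RETURNTRANSFER, true);
curl_setopt($ch, CURLOPT_CONNECTTIMEOUT, 2);     // max 2 s to establish the connection
curl_setopt($ch, CURLOPT_TIMEOUT, 5);            // max 5 s for the entire request
$result = curl_exec($ch);
$errno  = curl_errno($ch);
curl_close($ch);
// here $result is false and $errno is non-zero (refused connection or timeout)
```

Adding these two options to the curl() function above would also break its retry recursion after a bounded wait instead of hanging indefinitely.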
  5. OK, that was a good idea, but running a background process won't be possible, as I don't have command-line access. Is there anything else I can do, like a function or a class? Or maybe I can disconnect the download right after I start the cURL request to somepage.php, since I used ignore_user_abort() on somepage1.php and I don't really need the download to complete... New ideas?
  6. I don't understand what "fork the scripts" means; also, the pages take 5-6 hours to download... Any ideas?
  7. Hi there, OK, I'll get to the problem. My PHP script has a function named curl() (some cURL options and instructions inside; the details don't matter). What matters is:

     $var1 = curl("http://www.somedomain.com/somepage1.php");
     $var2 = curl("http://www.somedomain.com/somepage2.php");
     $var3 = curl("http://www.somedomain.com/somepage3.php");
     $var4 = curl("http://www.somedomain.com/somepage4.php");

     It downloads the content of each somepage_.php into the $vars, but those pages run several scripts that take a very long time to finish, so my script waits for each page to download, one by one, line by line. What I would like to solve: start the downloads without waiting for the first page, then the second, and so on; all the pages should start downloading at almost the same time. That would be all. Please help, I've lost four nights trying to get this fixed, but no luck.
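PHP's curl_multi API is the standard way to run several transfers concurrently in one process, without shell access or forking. A sketch, assuming the curl() wrapper can be replaced by plain handles (the URLs are copied from the post and may not resolve, in which case the corresponding results are simply empty):

```php
<?php
// Sketch: start several downloads at (almost) the same time with curl_multi,
// instead of calling curl_exec() once per URL in sequence.
$urls = [
    "http://www.somedomain.com/somepage1.php",
    "http://www.somedomain.com/somepage2.php",
];

$mh = curl_multi_init();
$handles = [];
foreach ($urls as $url) {
    $ch = curl_init($url);
    curl_setopt($ch, CURLOPT_RETURNTRANSFER, true);
    curl_setopt($ch, CURLOPT_CONNECTTIMEOUT, 2);   // bound each connection attempt
    curl_setopt($ch, CURLOPT_TIMEOUT, 5);          // bound each whole transfer
    curl_multi_add_handle($mh, $ch);               // register with the multi handle
    $handles[$url] = $ch;
}

// Drive all transfers together until every one has finished or failed.
do {
    $status = curl_multi_exec($mh, $running);
    if ($running) {
        curl_multi_select($mh);                    // sleep until there is activity
    }
} while ($running && $status === CURLM_OK);

$results = [];
foreach ($handles as $url => $ch) {
    $results[$url] = curl_multi_getcontent($ch);   // page body, or empty on failure
    curl_multi_remove_handle($mh, $ch);
    curl_close($ch);
}
curl_multi_close($mh);
```

The total wall-clock time is roughly that of the slowest page rather than the sum of all pages, which is exactly the behavior the post asks for.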
