
Parsing many sites simultaneously.



#1 Anidazen

  • Members
  • Advanced Member
  • 79 posts

Posted 17 September 2006 - 12:20 AM

Hello.

I've got a script that involves parsing 10 separate websites, and some of them have quite slow servers at times. That works out to a loading time of 20-odd seconds for my script, almost entirely from waiting on those sites.

Is there any way at all to cURL all 10 at once, or as many simultaneously as possible, to minimise this waiting? Or is that one of the few web-scripting tasks that is truly beyond the reach of PHP?


Please advise!

#2 tomfmason

  • Staff Alumni
  • Advanced Member
  • 1,696 posts
  • Location: stealing your wifi

Posted 17 September 2006 - 12:50 AM

I would recommend doing the processing of the sites in the background. Here is a tutorial: http://www.phpfreaks...rials/71/0.php. It says that it is forking, but it really isn't; it is really just a tutorial on how to use an exec() call to run a process in the background (something like the sketch below).
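A minimal sketch of that approach, assuming a hypothetical worker script (fetch_one_site.php) that takes a URL, fetches it with cURL, and saves the result somewhere the main script can read it back. The redirection syntax is Unix-only:

[code]
<?php
// Hypothetical list of the slow sites to hand off to background workers.
$sites = array('http://www.example.com/', 'http://www.example.org/');

foreach ($sites as $site) {
    // The "> /dev/null 2>&1 &" suffix is what detaches the process:
    // without redirecting output and appending "&", exec() would block
    // until each fetch finished. (Unix shells only.)
    exec('php /path/to/fetch_one_site.php ' . escapeshellarg($site)
         . ' > /dev/null 2>&1 &');
}
?>
[/code]

The catch is that the main script then has to poll for the workers' output (files, database rows, etc.) rather than getting results back directly.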

Hope this helps,
Tom

Traveling East in search of instruction, and West to propagate the knowledge I have gained.

current projects: pokersource

My Blog | My Pastebin | PHP Validation class | Backtrack linux


#3 Anidazen

  • Members
  • Advanced Member
  • 79 posts

Posted 17 September 2006 - 10:17 AM

I've heard about something called curl_multi, but I haven't been able to find a single piece of relevant documentation, a tutorial, or anything. Anyone able to shed some light? (I couldn't even find any sample source to work it out from myself!)
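For anyone who digs this thread up later: the curl_multi_* functions have been in PHP since 5.0, and a minimal sketch looks like the code below. The URL list is made up, and error handling is omitted:

[code]
<?php
// Hypothetical stand-ins for the ten sites being parsed.
$urls = array(
    'http://www.example.com/page1',
    'http://www.example.org/page2',
);

$mh = curl_multi_init();
$handles = array();

foreach ($urls as $url) {
    $ch = curl_init($url);
    curl_setopt($ch, CURLOPT_RETURNTRANSFER, true); // return body instead of printing
    curl_setopt($ch, CURLOPT_TIMEOUT, 20);          // give up on a dead server after 20s
    curl_multi_add_handle($mh, $ch);
    $handles[$url] = $ch;
}

// Pump all the transfers until every one has finished.
$running = 0;
do {
    curl_multi_exec($mh, $running);
    if ($running > 0) {
        curl_multi_select($mh); // wait for socket activity instead of busy-looping
    }
} while ($running > 0);

// Collect the downloaded pages and clean up.
$pages = array();
foreach ($handles as $url => $ch) {
    $pages[$url] = curl_multi_getcontent($ch);
    curl_multi_remove_handle($mh, $ch);
    curl_close($ch);
}
curl_multi_close($mh);
?>
[/code]

Because all the transfers run inside one process, the total wait is roughly the slowest site rather than the sum of all ten.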



