Anidazen Posted September 17, 2006
Hello. I've got a script that involves parsing 10 separate websites. Some of these have quite slow servers at times, which adds up to a loading time of around 20 seconds for my script, almost entirely from waiting on those sites. Is there any way at all to cURL all 10 at once, or as many simultaneously as possible, to minimise this waiting? Or is this one of the few web-scripting tasks that is truly beyond the reach of PHP? Please advise!
tomfmason Posted September 17, 2006
I would recommend doing the processing of the sites in the background. Here is a tutorial: http://www.phpfreaks.com/tutorials/71/0.php. It says that it is forking, but it really isn't; it is really just a tutorial on how to use the exec() function to run a process in the background.
Hope this helps,
Tom
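A minimal sketch of Tom's suggestion, assuming a hypothetical worker script `fetch_site.php` that takes a URL as its argument (the script name, output paths, and `$urls` array are all assumptions, not from the tutorial):

```php
<?php
// Launch one background process per site so the main request
// doesn't block while slow servers respond.
$urls = array('http://example.com/', 'http://example.org/');

foreach ($urls as $i => $url) {
    // Redirecting output and appending & makes exec() return
    // immediately instead of waiting for the child to finish.
    exec('php fetch_site.php ' . escapeshellarg($url)
        . ' > /tmp/site_' . $i . '.html 2>&1 &');
}
```

The trade-off is that the parent script has no direct way to know when the children finish; you would have to poll the output files or a database for results.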
Anidazen Posted September 17, 2006 (Author)
I've heard about something called curl_multi, but I haven't been able to find a single piece of relevant documentation, a tutorial, or anything else. Anyone able to shed some light? (I couldn't even find any sample source to work it out from myself!)
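For reference, the standard curl_multi pattern looks roughly like this. This is a sketch, not code from the thread; the `$urls` array and the 20-second timeout are assumptions based on the original post:

```php
<?php
// Fetch several URLs concurrently with the curl_multi API.
$urls = array('http://example.com/', 'http://example.org/');

$mh = curl_multi_init();
$handles = array();
foreach ($urls as $url) {
    $ch = curl_init($url);
    curl_setopt($ch, CURLOPT_RETURNTRANSFER, true); // return body instead of printing it
    curl_setopt($ch, CURLOPT_TIMEOUT, 20);          // cap how long a slow server can stall us
    curl_multi_add_handle($mh, $ch);
    $handles[] = $ch;
}

// Drive all transfers until every handle has finished.
$running = 0;
do {
    curl_multi_exec($mh, $running);
    curl_multi_select($mh); // wait for socket activity instead of busy-looping
} while ($running > 0);

// Collect results and clean up.
$results = array();
foreach ($handles as $ch) {
    $results[] = curl_multi_getcontent($ch);
    curl_multi_remove_handle($mh, $ch);
    curl_close($ch);
}
curl_multi_close($mh);
```

With this approach the total wait is roughly the time of the single slowest site, rather than the sum of all ten.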
This topic is now archived and is closed to further replies.