A way to do the impossible? Would this work?
Posted 05 October 2006 - 05:32 PM
A while back I posted a thread about trying to use cURL to fetch several separate pages at the same time, instead of waiting 3 seconds for each one to load in sequence.
I was assured this was impossible - but couldn't this be achieved using the following?
1) Instead of loading a script, I load an HTML page with 8 separate frames/iframes/whatever. This triggers 8 *different* scripts on my server simultaneously.
2) In theory, I would then have all 8 pages looked up in 3 seconds (however long the slowest page takes) instead of 3 seconds per page?
I can't see why this wouldn't work, but I am highly inexperienced compared to you guys. The thing is, given PHP's wonderful level of sophistication, I simply can't believe that the most efficient way to do this is to use tacky, inefficient client-side scripting with an HTML form as a buffer. Can't this be done entirely server-side somehow?
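For what it's worth, the frames idea above could be sketched roughly like this (a hypothetical wrapper page; the worker script name fetch.php is my own invention, not anything you've written):

```php
<?php
// Hypothetical sketch of the frames trick: one wrapper page that embeds
// eight iframes, each pointing at a worker script (fetch.php is assumed).
// The browser requests all eight at once, so the server runs them
// concurrently instead of one after another.
function buildFramePage(array $urls): string
{
    $html = "<html><body>\n";
    foreach ($urls as $i => $url) {
        // Each iframe kicks off its own server-side request.
        $src = 'fetch.php?id=' . $i . '&url=' . urlencode($url);
        $html .= '<iframe src="' . htmlspecialchars($src) . '"></iframe>' . "\n";
    }
    return $html . "</body></html>\n";
}
```

The catch, of course, is that it still relies on the client making eight parallel requests, which is exactly the client-side dependence being complained about here.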
Posted 05 October 2006 - 05:44 PM
If you are using a *nix system you could use pcntl_fork.
Now if this is on a Windows system then you may want to have a look at this tutorial: http://www.phpfreaks...orials/71/0.php If you go this route then you may have to send the results to a db or write a text file.
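A rough sketch of the pcntl_fork approach (*nix only, and the pcntl extension must be enabled). Since a forked child can't hand data back to the parent directly, each child writes its result to a file, as suggested above. The function name parallelFetch is mine, not from any tutorial:

```php
<?php
// Sketch of the pcntl_fork idea: fork one child per URL; each child
// fetches its page and writes it to a file, then exits. The parent
// waits for all children, so total time is roughly the slowest fetch.
function parallelFetch(array $urls, string $outDir): void
{
    $pids = array();
    foreach ($urls as $i => $url) {
        $pid = pcntl_fork();
        if ($pid === -1) {
            die("fork failed\n");
        } elseif ($pid === 0) {
            // Child process: fetch one page, save it, exit immediately.
            file_put_contents("$outDir/page_$i.html", (string) file_get_contents($url));
            exit(0);
        }
        $pids[] = $pid; // parent records each child's pid
    }
    // Parent: block until every child has finished.
    foreach ($pids as $pid) {
        pcntl_waitpid($pid, $status);
    }
}
```

Writing to files (or a db) is the clumsy part; the children and parent share no memory after the fork.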
Posted 05 October 2006 - 07:36 PM
Posted 05 October 2006 - 11:51 PM
Thanks both of you - seems like very good answers. Both over my head, but very good answers!
I've been doing a lot of research on this, but finding very little - I could find absolutely NO documentation or help explaining pcntl_fork (obviously the manual pages, but no examples or anything).
The curl_multi_exec() function again seemed promising, but there's little to no documentation - and what I did find seemed to suggest (although I'm not certain) that it was only for multiple pages from the same site.
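For the record, the handles given to curl_multi can point at completely different sites - it just drives whatever set of easy handles you add, concurrently. A minimal sketch (multiFetch is my own wrapper name):

```php
<?php
// Minimal curl_multi sketch: add one easy handle per URL (the hosts can
// all be different), then pump curl_multi_exec until every transfer is
// done. Total wall time is roughly the slowest single fetch.
function multiFetch(array $urls): array
{
    $mh = curl_multi_init();
    $handles = array();
    foreach ($urls as $key => $url) {
        $ch = curl_init($url);
        curl_setopt($ch, CURLOPT_RETURNTRANSFER, true); // capture body instead of printing it
        curl_setopt($ch, CURLOPT_TIMEOUT, 10);
        curl_multi_add_handle($mh, $ch);
        $handles[$key] = $ch;
    }
    // Drive all transfers until none are still running.
    do {
        curl_multi_exec($mh, $running);
        curl_multi_select($mh); // wait for activity instead of busy-looping
    } while ($running > 0);
    $results = array();
    foreach ($handles as $key => $ch) {
        $results[$key] = curl_multi_getcontent($ch);
        curl_multi_remove_handle($mh, $ch);
        curl_close($ch);
    }
    curl_multi_close($mh);
    return $results;
}
```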
I discovered something called ares and c-ares libraries which also might seem to be able to help me with this problem - but they seem EXTREMELY advanced (extensions to PHP itself... *gulp*) and I think they might even involve utilising C++ apps in PHP. Again the lack of documentation is stifling to the point where I couldn't even find a basic description.
If anyone's got any more advice I'd love to hear it, in particular any experiences with any of the above or further info.
Print - You're a legend, lol. I'd really like to see a simple example for using sockets or what have-ya to retrieve pages simultaneously from multiple sources - if that's what these can do?
Posted 06 October 2006 - 12:01 AM
Posted 07 October 2006 - 01:22 AM
Posted 09 October 2006 - 03:17 PM
I didn't forget you, I've just been busy...
Anyway I will PM you a download link sometime tonight when I have a chance to write some real world examples...
Here is example of just fetching (3) pages
The array used in the example...
$file = array(
    array('url' => 'http://www.msn.com'),
    array('url' => 'http://www.adobe.com'),
    array('url' => 'http://www.google.com')
);
The reason why I am using an array for each request is so that you can supply different request information for each request! Like...
array('url' => 'https://www.site.com', 'method' => 'POST', 'browser' => 'Mozilla/4.0 (compatible; MSIE 6.0; Windows NT 5.1; SV1)');
If browser is not supplied it defaults to Mozilla; if method is not supplied it defaults to GET. The url tells the class how to make the request (use SSL for https://, plain for http://); if the scheme is not supplied, it defaults to http://!
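Those defaulting rules could be expressed something like this (my own helper sketch, not the actual class internals):

```php
<?php
// Sketch of the defaulting rules described above: method falls back to
// GET, browser to a Mozilla UA string, and a bare host gains an http://
// scheme. (normalizeRequest is a hypothetical name, not the real class.)
function normalizeRequest(array $req): array
{
    if (!isset($req['method'])) {
        $req['method'] = 'GET';
    }
    if (!isset($req['browser'])) {
        $req['browser'] = 'Mozilla/4.0 (compatible; MSIE 6.0; Windows NT 5.1; SV1)';
    }
    // Add a scheme when none is supplied; https:// would switch the class to SSL.
    if (strpos($req['url'], 'http://') !== 0 && strpos($req['url'], 'https://') !== 0) {
        $req['url'] = 'http://' . $req['url'];
    }
    return $req;
}
```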
So my example array, lists (msn, adobe, google) in that order, but when the array is sent to the class, the class returns each request in the order it is completed, so the quickest response will be returned first. All of them are monitored at the same time!
// this uses HTTP 1.0 (no chunk or gz handling)
// this uses HTTP 1.1 (with chunk and gz handling)
Which one works better will depend on your system!
Posted 09 October 2006 - 05:15 PM
Guess there are some really genuinely helpful people around.
Very much looking forward to the PM.
It occurs to me: I've spent a looooooong time looking, and I couldn't find a way to do this before you released this. You might want to stick it on SourceForge? It solves a real need.