br0ken Posted May 10, 2009 I'm writing a small spider-like application that requests a lot of pages. I know there are a lot of different ways to do this, and I was wondering which way would be the most efficient. The methods I know are fopen, fsockopen and file_get_contents. Also, what's the difference between fopen and fsockopen? Thanks!
gffg4574fghsDSGDGKJYM Posted May 10, 2009 The most efficient way will probably be cURL with parallel transfers (curl_multi): http://www.google.com/search?q=multi+thread+php+curl cURL offers more options than fopen, fsockopen and file_get_contents, and with the curl_multi functions you can download many pages at once. The difference between fsockopen and fopen is that fsockopen opens a raw network connection (a socket to a host and port), while fopen opens a file locally. With "allow_url_fopen = On" in php.ini, fopen can also open a connection over the internet, like fsockopen, if the filename is a URL.
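A minimal sketch of the curl_multi approach described above. Note that curl_multi is not actual threading: it multiplexes several transfers inside one PHP process. The URL list here is hypothetical; substitute the pages your spider needs to request.

```php
<?php
// Hypothetical list of pages to fetch; replace with your spider's targets.
$urls = [
    'http://example.com/page1',
    'http://example.com/page2',
    'http://example.com/page3',
];

$mh = curl_multi_init();
$handles = [];

foreach ($urls as $url) {
    $ch = curl_init($url);
    curl_setopt($ch, CURLOPT_RETURNTRANSFER, true); // return the body instead of printing it
    curl_setopt($ch, CURLOPT_FOLLOWLOCATION, true); // follow HTTP redirects
    curl_setopt($ch, CURLOPT_TIMEOUT, 10);          // don't hang forever on a slow host
    curl_multi_add_handle($mh, $ch);
    $handles[$url] = $ch;
}

// Drive all transfers until every handle has finished.
do {
    $status = curl_multi_exec($mh, $active);
    if ($active) {
        curl_multi_select($mh); // block until there is activity, instead of busy-looping
    }
} while ($active && $status == CURLM_OK);

// Collect the results and clean up.
foreach ($handles as $url => $ch) {
    $body = curl_multi_getcontent($ch);
    echo $url . ': ' . strlen($body) . " bytes\n";
    curl_multi_remove_handle($mh, $ch);
    curl_close($ch);
}
curl_multi_close($mh);
```

All of the transfers run concurrently, so the total wall-clock time is roughly that of the slowest page rather than the sum of all of them, which is the point of using curl_multi for a spider.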
Archived
This topic is now archived and is closed to further replies.