dadhich Posted July 28, 2007

Hi, I am using fread() to fetch web page contents, but it does not return more than 8K of data per call. I am using PHP 5. Following is my code for getting the page contents:

<?php
$timeout = 15;
$hosts = array("example1.com", "example2.com");
$sockets = array();

foreach ($hosts as $id => $host) {
    $s = stream_socket_client("$host:80", $errno, $errstr, $timeout,
        STREAM_CLIENT_ASYNC_CONNECT | STREAM_CLIENT_CONNECT);
    if ($s) {
        $sockets[$id] = $s;
        $status[$id] = "in progress";
    } else {
        $status[$id] = "failed, $errno $errstr";
    }
}

while (count($sockets)) {
    $read = $write = $sockets;
    $n = stream_select($read, $write, $e = null, $timeout);
    if ($n > 0) {
        /* readable sockets either have data for us or are failed connection attempts */
        foreach ($read as $r) {
            $id = array_search($r, $sockets);
            /* fread() returns at most 8192 bytes from a socket per call;
               the loop keeps calling it until the server closes the
               connection, so the full page accumulates across iterations */
            $data = fread($r, 8192);
            if (strlen($data) == 0) {
                if ($status[$id] == "in progress") {
                    $status[$id] = "failed to connect";
                }
                fclose($r);
                unset($sockets[$id]);
            } else {
                $status[$id] .= $data;  /* response data accumulates here */
            }
        }
        /* writable sockets are connected and can accept the HTTP request */
        foreach ($write as $w) {
            $id = array_search($w, $sockets);
            if ($status[$id] == "in progress") {  /* send the request only once */
                fwrite($w, "GET / HTTP/1.0\r\nHost: " . $hosts[$id] . "\r\n\r\n");
                $status[$id] = "waiting for response";
            }
        }
    }
}
?>

Here I have opened n sockets for the n URLs; they are stored in the array $sockets. I want to fetch the content of all the pages simultaneously. Thanks in advance.
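A note for anyone landing here later: the 8K figure is the size of a single fread() chunk, not a cap on the page; looping until the remote side closes the connection collects the whole response, as the code above does. If the cURL extension is available, the curl_multi_* API handles the same parallel fetching with less bookkeeping. Below is a minimal sketch only, assuming the example1.com/example2.com placeholder hosts from the post and an arbitrary 15-second timeout:

<?php
/* Sketch: parallel fetching with curl_multi, assuming the cURL extension is loaded.
   The URLs are the placeholder hosts from the post; the timeout is an assumption. */
$urls = array("http://example1.com/", "http://example2.com/");

$mh = curl_multi_init();
$handles = array();
foreach ($urls as $id => $url) {
    $ch = curl_init($url);
    curl_setopt($ch, CURLOPT_RETURNTRANSFER, true); /* return the body instead of printing it */
    curl_setopt($ch, CURLOPT_TIMEOUT, 15);
    curl_multi_add_handle($mh, $ch);
    $handles[$id] = $ch;
}

/* drive all transfers in parallel until none are still running */
$running = 0;
do {
    curl_multi_exec($mh, $running);
    curl_multi_select($mh); /* block until there is socket activity, instead of busy-waiting */
} while ($running > 0);

$pages = array();
foreach ($handles as $id => $ch) {
    $pages[$id] = curl_multi_getcontent($ch); /* the full body, not limited to 8K */
    curl_multi_remove_handle($mh, $ch);
    curl_close($ch);
}
curl_multi_close($mh);
?>

Compared with hand-rolled sockets, this also takes care of HTTP details such as header parsing and per-transfer timeouts.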