pro_se Posted January 21, 2007 Hello, is there a function that could take all the text on an external page or URL and save it to a flat text file or a MySQL table?
Antony the Awesome Posted January 21, 2007 You might want to look at file_get_contents().
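A minimal sketch of that suggestion, assuming allow_url_fopen is enabled (file_get_contents() can only fetch URLs when it is). The save_page() helper and the pages table layout are made-up names for illustration, not from this thread:

```php
<?php
// Hypothetical helper: fetch a page and save it to a flat file.
// Returns true on success, false if the fetch or the write failed.
function save_page($url, $dest)
{
    $html = file_get_contents($url);   // works on URLs if allow_url_fopen is on
    if ($html === false) {
        return false;                  // fetch failed
    }
    return file_put_contents($dest, $html) !== false;
}

// Flat text file:
// save_page('http://example.com/', 'copy.txt');

// For a MySQL table instead (assuming a `pages` table with
// `url` and `content` columns), something along these lines:
// $html = file_get_contents($url);
// $sql  = sprintf("INSERT INTO pages (url, content) VALUES ('%s', '%s')",
//     mysql_real_escape_string($url),
//     mysql_real_escape_string($html));
// mysql_query($sql);
?>
```

Note the `!== false` check: file_get_contents() returns false on failure, and a plain truthiness test would also treat a legitimately empty page as an error.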
bibby Posted January 21, 2007 Here's a script that'll do a bit more. It'll output all the HTML as it is, AND keep a local hard copy for you:
[code]<?php
// set these yourself
$newFilename = 'copy.html';
$url = 'http://dev.mastercontrolprogram.org';

if (($f = file_get_contents($url)) !== false) {
    $loc = $_SERVER['DOCUMENT_ROOT'] . '/' . $newFilename;
    if (($copy = fopen($loc, 'w')) !== false) {
        fwrite($copy, $f);
        echo $f;
        fclose($copy);
        $message = "I think I got it: " . $loc;
    } else {
        $message = "couldn't open file " . $loc;
    }
} else {
    $message = "couldn't fetch " . $url;
}
echo "<!-- " . $message . " -->";
?>[/code]
This topic is now archived and is closed to further replies.