pro_se Posted January 21, 2007
hello, is there a function that could take all the text on an external page or URL and save it to a flat text file or a MySQL table?
Link to comment: https://forums.phpfreaks.com/topic/35058-save-a-webpage/
Antony the Awesome Posted January 21, 2007
You might want to look at file_get_contents().
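To sketch that suggestion out: `file_get_contents()` accepts a URL, and `file_put_contents()` writes a string to disk in one call, so the "flat text file" half of the question only takes a few lines. The URL and output filename below are placeholders, and `strip_tags()` is used on the assumption that "all the text" means the page text without its markup.

```php
<?php
// Minimal sketch: fetch a page, keep only its text, save it to a flat file.
// $url and $file are placeholders -- substitute your own.
$url  = 'http://example.com/';
$file = 'page.txt';

$html = file_get_contents($url);      // grab the raw HTML over HTTP
if ($html !== false) {
    $text = strip_tags($html);        // drop the markup, keep the text
    file_put_contents($file, $text);  // write it out in one call
} else {
    echo "Could not fetch " . $url;
}
?>
```

If you want the raw HTML instead of just the text, skip the `strip_tags()` call and write `$html` directly.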
bibby Posted January 21, 2007
Here's a script that'll do a bit more. It outputs the HTML as-is, AND hooks you up with a local hard copy.
[code]<?php
// set these yourself
$newFilename = 'copy.html';
$url = 'http://dev.mastercontrolprogram.org';

if ($f = file_get_contents($url)) {
    $loc = $_SERVER['DOCUMENT_ROOT'] . '/' . $newFilename;
    if (($copy = fopen($loc, 'w')) !== false) {
        fwrite($copy, $f);
        echo $f;
        fclose($copy);
        $message = "I think I got it: " . $loc;
    } else {
        $message = "couldn't open file";
    }
} else {
    $message = "I don't know about " . $url;
}
echo "<!-- " . $message . " -->";
?>[/code]
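The script above covers the flat-file half of the original question. For the MySQL half, the same fetched string can go into a table with PDO. The table name `pages`, its columns, and the connection details below are assumptions, so adjust them for your own database; the prepared statement is what keeps arbitrary page content from breaking the SQL.

```php
<?php
// Sketch of storing a page's text in a MySQL table via PDO.
// DSN, credentials, and the `pages` table are assumptions -- change them.
$pdo = new PDO('mysql:host=localhost;dbname=test', 'user', 'pass');
$pdo->setAttribute(PDO::ATTR_ERRMODE, PDO::ERRMODE_EXCEPTION);

$url  = 'http://example.com/';
$text = strip_tags(file_get_contents($url));

// Bound parameters handle quotes and other special characters safely.
$stmt = $pdo->prepare('INSERT INTO pages (url, body) VALUES (?, ?)');
$stmt->execute(array($url, $text));
?>
```

The same code works against any PDO driver, so you can try it out with SQLite before pointing it at a live MySQL server.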