madrazel Posted June 3, 2007

I need something that can download a whole web page with images, styles, and frames, and save it all to one folder, just like a web browser would do.

Link to comment: https://forums.phpfreaks.com/topic/54100-script-to-download-whole-website/
penguin0 Posted June 3, 2007

Do you want a script? All I know how to do that with is FTP.
madrazel Posted June 3, 2007 (Author)

I want a script for automation, to fetch a bunch of URLs and write them to specific folders (saving the files to my local disk, not to a remote FTP location).
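A minimal sketch of that idea, assuming you supply the list of URLs and the target folder yourself (the helper name `save_urls` and its parameters are made up for illustration): fetch each URL with file_get_contents() and write the bytes to disk with file_put_contents().

```php
<?php
// Hypothetical helper: fetch each URL and save it under $saveDir,
// naming each file after the last path segment of its URL.
function save_urls(array $urls, string $saveDir): array
{
    if (!is_dir($saveDir)) {
        mkdir($saveDir, 0777, true);
    }
    $saved = [];
    foreach ($urls as $url) {
        // Works for http:// URLs when allow_url_fopen is enabled,
        // and for local paths as well.
        $contents = file_get_contents($url);
        if ($contents === false) {
            continue; // skip URLs that could not be fetched
        }
        $path = parse_url($url, PHP_URL_PATH);
        $name = $path !== null && $path !== '' ? basename($path) : '';
        if ($name === '' || $name === '/') {
            $name = 'index.html'; // fallback name for URLs with no path segment
        }
        file_put_contents($saveDir . '/' . $name, $contents);
        $saved[] = $name;
    }
    return $saved;
}
```

This only saves the raw documents; it does not rewrite links inside the pages or pull in their images, which is what the rest of the thread discusses.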
kael.shipman Posted June 3, 2007

Try loading each file into a string, then stepping through each element like this:

<?php
// foreach URL:
$file = file_get_contents($url);

// Images
$offset = 0;
while (($tag = strpos($file, '<img src="', $offset)) !== false) {
    // You should probably use some more flexible matching here, in case
    // someone put other attributes between "img" and "src".
    $attr = $tag + 10; // 10 is the string length of '<img src="', so $attr points to the start of the src value
    $endAttr = strpos($file, '"', $attr); // position of the first quote after the start of the attribute value
    $img = substr($file, $attr, $endAttr - $attr);
    // Check for an absolute vs. relative URL here and prepend the appropriate
    // prefix (e.g. "http://www.example.ex" if the URL is relative to the site root).
    $img = imagecreatefromjpeg($img); // Use GD to check the file type and call the matching function (jpeg is just an example) to load the image into memory
    imagejpeg($img, 'path/to/newFile'); // save the image
    imagedestroy($img); // free the memory
    $offset = $endAttr; // advance the offset so the loop keeps moving forward through the file
}

// Anchors
// ... etc.
?>

You could make that a function, too (albeit a complicated one), and use it recursively to download the entire website by crawling through the links and feeding each page back to the function. That might get kind of messy, though.

By the way, this is just an example; it's only meant to impart the concept, not to actually be used. For it to work, you'd have to build in a lot more flexibility to account for people's sloppy code. Good luck!

-kael
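For the more flexible tag matching kael mentions, a regular expression is one way to tolerate extra attributes between img and src. A sketch, assuming reasonably well-formed HTML with double-quoted attribute values (a real crawler would want a proper HTML parser such as DOMDocument instead):

```php
<?php
// Extract the src attribute of every <img> tag, tolerating other
// attributes before src. Assumes double-quoted attribute values.
function extract_img_srcs(string $html): array
{
    preg_match_all('/<img\b[^>]*\bsrc="([^"]*)"/i', $html, $matches);
    return $matches[1];
}
```

Once the URLs are extracted (and relative ones resolved against the page URL), fetching each with file_get_contents() and writing it with file_put_contents() also avoids the GD round-trip in the snippet above: GD re-encodes the image, while copying the raw bytes preserves it exactly and works for any file type.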