Vince889 Posted June 2, 2009

I think I have an idea, but I just want to make sure. How do I download files from remote servers? Let's say I am running: http://host.com/download.php

I want my download.php script to fetch "super.jpg" from http://anotherhost.com/super.jpg

How would I do that? Would I have to use cURL or include()? Or some sort of fopen() function? Ex:

<? include("http://anotherhost.com/super.jpg"); ?>

Mind you, I want this file (super.jpg) SAVED. I don't want to just display it, I want to save it. Pretty much just copying a file from a remote location to my server.
gevans Posted June 2, 2009

When you say download, are you looking at offering a download on site A from the content of site B? Or do you just want to use the content (not download it)?
Vince889 (Author) Posted June 2, 2009

I just want to use the content. "Mind you, I want this file (super.jpg) SAVED. I don't want to just display it, I want to save it. Pretty much just copying a file from a remote location to my server."

I want to copy super.jpg from the remote host and save it to my server.
thebadbad Posted June 2, 2009

file_get_contents() and file_put_contents() should work:

<?php
//set user agent string
ini_set('user_agent', 'Mozilla/5.0 (Windows; U; Windows NT 6.0; da; rv:1.9.0.10) Gecko/2009042316 Firefox/3.0.10');
$url = 'http://anotherhost.com/super.jpg';
file_put_contents('local/dir/' . basename($url), file_get_contents($url));
?>
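If your host has allow_url_fopen disabled, file_get_contents() can't fetch remote URLs at all; a cURL sketch of the same download would look something like this (the URL and target directory are the same placeholders as above):

```php
<?php
// Placeholder URL and destination; adjust to your own paths.
$url  = 'http://anotherhost.com/super.jpg';
$dest = 'local/dir/' . basename($url);

$ch = curl_init($url);
curl_setopt($ch, CURLOPT_RETURNTRANSFER, true);     // return the body instead of printing it
curl_setopt($ch, CURLOPT_FOLLOWLOCATION, true);     // follow redirects
curl_setopt($ch, CURLOPT_USERAGENT, 'Mozilla/5.0'); // some hosts reject blank user agents
$data = curl_exec($ch);

if ($data === false) {
    die('Download failed: ' . curl_error($ch));
}
curl_close($ch);
file_put_contents($dest, $data);
?>
```

The extra error check also means a failed request won't silently write an empty file, which the one-liner above would.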
Vince889 (Author) Posted June 2, 2009

Hmm, I had created something different with fopen(). They both work. Thanks, thebadbad. =]
thebadbad Posted June 2, 2009

No problem. In case you're wondering why I'm setting the user agent string, it's because some sites check it and try to reject (automated) requests from scripts etc. The user agent string sent along with PHP requests is blank by default (I believe), or else "PHP" or similar.
Vince889 (Author) Posted June 2, 2009

Hey thebadbad, quick question about your script. How would I make that loop through an entire directory? Like:

http://anothersite.com/images/super.jpg
http://anothersite.com/images/super2.jpg
http://anothersite.com/images/super3.jpg
http://anothersite.com/images/anotherdirectory/super.jpg
http://anothersite.com/images/anotherdirectory/super2.jpg
http://anothersite.com/images/anotherdirectory/super3.jpg
http://anothersite.com/images/anotherdirectory/anotherdirectory/super.jpg
http://anothersite.com/images/anotherdirectory/anotherdirectory/super2.jpg
http://anothersite.com/images/anotherdirectory/anotherdirectory/super3.jpg

and so on. Basically, I want to scrape everything from the /images/ directory. Sub-directories and all. Any idea how to do that?
thebadbad Posted June 2, 2009

There's no way to read a whole remote directory over HTTP; you'd need direct access to the server, e.g. via FTP. But if you've got a list of URLs, it would be simple.
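For the FTP route, a rough sketch with PHP's ftp_* functions would look like the following. The hostname, credentials, and directory names are all placeholders, and note that ftp_nlist() does not recurse, so sub-directories would need their own calls:

```php
<?php
// Sketch only: assumes you have FTP credentials for the remote host.
$conn = ftp_connect('anotherhost.com') or die('Could not connect');
ftp_login($conn, 'username', 'password') or die('Login failed');
ftp_pasv($conn, true); // passive mode usually plays nicer with firewalls

// List the files in the remote images/ directory (non-recursive)
$files = ftp_nlist($conn, 'images/');
foreach ($files as $file) {
    // Download each file in binary mode to a local directory
    ftp_get($conn, 'local/dir/' . basename($file), $file, FTP_BINARY);
}
ftp_close($conn);
?>
```

Without FTP (or SSH) access to anothersite.com, though, a URL list really is the only option.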
Vince889 (Author) Posted June 2, 2009

Yes, let's say I had a list of URLs. Would I just duplicate this over and over, changing the file name each time?

<?php
//set user agent string
ini_set('user_agent', 'Mozilla/5.0 (Windows; U; Windows NT 6.0; da; rv:1.9.0.10) Gecko/2009042316 Firefox/3.0.10');
$url = 'http://anotherhost.com/super.jpg';
file_put_contents('local/dir/' . basename($url), file_get_contents($url));
?>
thebadbad Posted June 2, 2009

No, you would use a loop. Example:

<?php
//set user agent string
ini_set('user_agent', 'Mozilla/5.0 (Windows; U; Windows NT 6.0; da; rv:1.9.0.10) Gecko/2009042316 Firefox/3.0.10');
$urls = array(
    'http://example.com/file1.jpg',
    'http://example.com/file2.jpg',
    'http://example.com/file3.jpg',
    'http://example.com/file4.jpg'
);
foreach ($urls as $url) {
    file_put_contents('local/dir/' . basename($url), file_get_contents($url));
}
?>

But reading many files will require a lot of memory and execution time. You can set these values at the top of your script, if your host allows it:

//remove the time limit
set_time_limit(0);
//set the memory limit to 200 MB
ini_set('memory_limit', '200M');
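For a long list it's usually more practical to keep the URLs in a text file and skip any that fail, so one dead link doesn't abort the whole run. A sketch, assuming a hypothetical urls.txt with one URL per line:

```php
<?php
set_time_limit(0); // long downloads shouldn't hit the default time limit
ini_set('user_agent', 'Mozilla/5.0');

// Hypothetical list file: one URL per line
$urls = file('urls.txt', FILE_IGNORE_NEW_LINES | FILE_SKIP_EMPTY_LINES);

foreach ($urls as $url) {
    $data = @file_get_contents($url); // @ suppresses the warning on failure
    if ($data === false) {
        echo "Skipping $url\n"; // log it and carry on with the rest
        continue;
    }
    file_put_contents('local/dir/' . basename($url), $data);
}
?>
```

This also keeps the script unchanged when the list grows; you only edit urls.txt.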
Vince889 (Author) Posted June 2, 2009

Wait, are you sure? That didn't work for me.
thebadbad Posted June 2, 2009

How didn't it work? What exact code did you try?
Vince889 (Author) Posted June 2, 2009

Ah, I see my mistake. I got it now. =)