stevesimo · Posted September 5, 2008

Hi, I am attempting to write a script that downloads a copy of specified auction listings from a well-known auction site. It saves a copy of the HTML page and the associated images, using the listing's item number. The code I am using works, but it is slow because it has to recreate each image after downloading it, and I have noticed that the recreated images come out slightly larger than the originals on the auction site. I also tried cURL, but all I kept getting was a message saying the file had moved to another location, even though the underlying link I was using was correct, so I think the auction site has disabled that on their server. Here is my code:

```php
$im = @imagecreatefromjpeg($imageurl);
imagejpeg($im, $filename);
```

Any suggestions would be much appreciated.

thanks, Steve (Blackpool, UK)
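A note on the cURL symptom: "the file had moved to another location" is usually just an HTTP 3xx redirect, which cURL does not follow unless `CURLOPT_FOLLOWLOCATION` is set; some hosts also reject requests without a browser-like User-Agent. A minimal sketch of that fix (the function name, URL, and save path are illustrative, not from the original post):

```php
<?php
// Fetch a remote file with cURL, following redirects instead of
// stopping at the "moved" response, and save it byte-for-byte.
function fetch_with_curl(string $url, string $savePath): bool
{
    $ch = curl_init($url);
    curl_setopt_array($ch, [
        CURLOPT_RETURNTRANSFER => true,        // return the body as a string
        CURLOPT_FOLLOWLOCATION => true,        // follow 3xx redirects
        CURLOPT_MAXREDIRS      => 5,           // avoid redirect loops
        CURLOPT_USERAGENT      => 'Mozilla/5.0', // some hosts block blank agents
    ]);
    $data = curl_exec($ch);
    curl_close($ch);
    if ($data === false) {
        return false;
    }
    return file_put_contents($savePath, $data) !== false;
}
```

Because the bytes are written out untouched, this also avoids the GD re-encode that was inflating the images.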
JonnoTheDev · Posted September 5, 2008

Try wget.
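For reference, wget can mirror a page together with the images it references in one call; the flags below are standard wget options, and the URL is a placeholder:

```shell
# -p  also download page requisites (images, stylesheets) for the page
# -k  rewrite links in the saved HTML to point at the local copies
# -E  save HTML files with a .html extension
# -U  send a browser-like User-Agent, since some sites block wget's default
wget -p -k -E -U "Mozilla/5.0" "http://example.com/listing/123456"
```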
discomatt · Posted September 5, 2008

Why not just read the file with file_get_contents() (or fopen()/fread()) and then store it locally with file_put_contents() (PHP 5) or fwrite()? That leaves the image unchanged, bit for bit.
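Spelled out, discomatt's suggestion looks like the sketch below (function and variable names are illustrative). Unlike imagecreatefromjpeg()/imagejpeg(), this never decodes and re-encodes the JPEG, so it is faster and the saved file is identical to the original; note that fetching `http://` URLs this way requires `allow_url_fopen` to be enabled:

```php
<?php
// Copy a remote image byte-for-byte, with no GD re-encoding.
function copy_image(string $imageUrl, string $filename): bool
{
    $data = file_get_contents($imageUrl); // raw bytes of the remote file
    if ($data === false) {
        return false;                     // download failed
    }
    return file_put_contents($filename, $data) !== false;
}
```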
Archived
This topic is now archived and is closed to further replies.