brm5017 Posted January 10, 2009

OK, so I get a feed from an XML source, which I've imported into my database. Within the feed there is an image URL of the form:

http://trend.trendrets.com:6103/platinum/getmedia?ID=70054419106&LOOT=50038672093

I want to take this image URL and save the image it refers to into a folder on my server. How do I go about doing this?
auro Posted January 10, 2009

Easy... use this, friend:

copy($URL, $LOCALFILENAME);
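In full, that might look something like this (a rough sketch only; it assumes allow_url_fopen is enabled on your server, and the local folder and filename are just made-up examples):

<?php
// Remote image URL taken from the feed (the example from the first post).
$url = 'http://trend.trendrets.com:6103/platinum/getmedia?ID=70054419106&LOOT=50038672093';

// Local path to save the image to -- the folder name here is hypothetical.
$localFile = '/var/www/images/70054419106.jpg';

// copy() can read straight from a URL when allow_url_fopen is on.
if (copy($url, $localFile)) {
    echo "Saved to $localFile\n";
} else {
    echo "Download failed for $url\n";
}
?>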
brm5017 (Author) Posted January 10, 2009

Forgot to mention, I've got 5 million URLs in my database.
auro Posted January 10, 2009

Sure, friend, that won't be a problem... you can loop through and fetch them all. You will have to split the work into batches, though, because PHP won't let a script run for days (see the sketch below). :-) Good luck!
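Roughly along these lines; the table and column names, database credentials and file path are all assumptions rather than your actual setup, so adjust them:

<?php
// Rough sketch of downloading in batches from the command line.
set_time_limit(0);                                   // no execution time limit for this run
$batchSize = 500;                                    // images fetched per run
$offset    = isset($argv[1]) ? (int)$argv[1] : 0;    // starting row, passed as a CLI argument

$db  = new PDO('mysql:host=localhost;dbname=realestate', 'user', 'password');
$sql = sprintf('SELECT PictureURL FROM Listing_Pictures ORDER BY ListingID LIMIT %d, %d', $offset, $batchSize);

foreach ($db->query($sql)->fetchAll(PDO::FETCH_COLUMN) as $url) {
    $localFile = '/var/www/images/' . md5($url) . '.jpg';   // md5 just gives a unique, filesystem-safe name
    if (!copy($url, $localFile)) {
        error_log("Failed to fetch $url");
    }
}
?>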
corbin Posted January 10, 2009

Errrrr..... do you really want to do that? You'll probably want to store them on demand, not all at once. If you stored them all at once, it would take days (maybe weeks) to download them. Let's say 10 KB per image: 10 KB * 5,000,000 = 50,000,000 KB, which is roughly 47.7 GB. So...... yeah. Anyway, you would just do what auro said, or use file_get_contents, or do it with raw headers; whichever way you prefer. Just out of curiosity, why are you ripping all of these photos?
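The file_get_contents route is only a couple of lines, roughly like this (a sketch; allow_url_fopen needs to be on, and the destination path is just an example):

<?php
// Alternative to copy(): pull the image into memory, then write it out.
$url  = 'http://trend.trendrets.com:6103/platinum/getmedia?ID=70054419106&LOOT=50038672093';
$data = file_get_contents($url);          // returns false on failure

if ($data !== false) {
    file_put_contents('/var/www/images/70054419106.jpg', $data);
}
?>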
PFMaBiSmAd Posted January 10, 2009

Would you rather just use the images at those links on your web pages, or do you need to have a copy of them on your server? The following will work on your web pages:

<img src='http://trend.trendrets.com:6103/platinum/getmedia?ID=70054419106&LOOT=50038672093' alt='your alt text here'>

If you truly want to get a copy of all the images, what part of the coding do you need help with?
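If you go the "use them where they are" route, you would just echo the stored URLs into the tags, something like this (a sketch only; the table and column names, credentials and request parameter are assumptions):

<?php
// Sketch: output <img> tags straight from the stored feed URLs,
// so nothing has to be copied to your server.
$listingId = (int) $_GET['id'];    // whichever listing the page is showing

$db   = new PDO('mysql:host=localhost;dbname=realestate', 'user', 'password');
$stmt = $db->prepare('SELECT PictureURL FROM Listing_Pictures WHERE ListingID = ? ORDER BY PictureOrderNumber');
$stmt->execute(array($listingId));

foreach ($stmt->fetchAll(PDO::FETCH_COLUMN) as $url) {
    echo "<img src='" . htmlspecialchars($url) . "' alt='listing photo'>\n";
}
?>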
auro Posted January 10, 2009

Corbin is absolutely right. Managing soooooo many files would be very messy. Where exactly do you want to put all those images?
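One common trick to keep it from getting completely messy is to spread the files across subfolders, for example by the first characters of a hash of the URL. Just an idea, not tied to your setup; the base path is made up:

<?php
// Sketch: bucket downloaded files into nested subdirectories so no single
// folder ends up holding millions of files.
$url    = 'http://trend.trendrets.com:6103/platinum/getmedia?ID=70054419106&LOOT=50038672093';
$hash   = md5($url);
$subdir = '/var/www/images/' . substr($hash, 0, 2) . '/' . substr($hash, 2, 2);

if (!is_dir($subdir)) {
    mkdir($subdir, 0755, true);   // create the nested folders as needed
}
copy($url, $subdir . '/' . $hash . '.jpg');
?>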
brm5017 (Author) Posted January 10, 2009

I'm using it for a real estate search function. It already takes about 4 seconds to query the database. I wonder if my database should be broken up into separate tables according to county name or city name.
brm5017 (Author) Posted January 10, 2009

I'd like to put them on my server. I'm going to re-think this database design and then post what I've come up with. Right now I've got two tables:

Listings: address, city, state, price, beds, bathrooms, etc. (there are 35 fields)
Listing_Pictures: PictureURL (from the feed), PictureOrderNumber, ListingID
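In CREATE TABLE terms it looks roughly like this; the column types are approximate and the connection details are placeholders, so treat it as a sketch:

<?php
// Rough sketch of the two tables as described above -- types are guesses.
$db = new PDO('mysql:host=localhost;dbname=realestate', 'user', 'password');

$db->exec("CREATE TABLE Listings (
    ListingID INT UNSIGNED NOT NULL AUTO_INCREMENT PRIMARY KEY,
    Address   VARCHAR(255),
    City      VARCHAR(100),
    State     CHAR(2),
    Price     INT UNSIGNED,
    Beds      TINYINT UNSIGNED,
    Bathrooms TINYINT UNSIGNED
    -- ... plus the rest of the roughly 35 fields
)");

$db->exec("CREATE TABLE Listing_Pictures (
    PictureURL         VARCHAR(255),
    PictureOrderNumber SMALLINT UNSIGNED,
    ListingID          INT UNSIGNED
)");
?>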
DeanWhitehouse Posted January 10, 2009

The database won't need to be broken up; just your queries might need changing. Google database optimisation and normalisation.
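For example, adding indexes on the columns your search filters on will usually help far more than splitting the table. A rough sketch, with table and column names taken from the post above and everything else (credentials, example city and price) made up:

<?php
// Sketch: add indexes on the searched columns, then use EXPLAIN to check
// that a typical search query actually uses them.
$db = new PDO('mysql:host=localhost;dbname=realestate', 'user', 'password');

$db->exec('ALTER TABLE Listings ADD INDEX idx_city (City), ADD INDEX idx_price (Price)');
$db->exec('ALTER TABLE Listing_Pictures ADD INDEX idx_listing (ListingID)');

// EXPLAIN shows which indexes MySQL chooses for the query.
foreach ($db->query("EXPLAIN SELECT * FROM Listings WHERE City = 'Philadelphia' AND Price < 300000") as $row) {
    print_r($row);
}
?>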
brm5017 (Author) Posted January 13, 2009

There's a lot of redundant data in my URLs. The only parts that change are the ID numbers and the LOOT numbers.

http://trend.trendrets.com:6103/platinum/getmedia?ID=70054419106&LOOT=50038672093

If I just extract the ID and LOOT from the URL and store them in separate columns in the database, that would greatly speed up the search, right?
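Something like this is what I had in mind for pulling the two numbers out and putting the URL back together when it's needed (just a sketch):

<?php
// Sketch: extract the ID and LOOT values from a feed URL so only the two
// numbers need to be stored, then rebuild the full URL on demand.
$url = 'http://trend.trendrets.com:6103/platinum/getmedia?ID=70054419106&LOOT=50038672093';

parse_str(parse_url($url, PHP_URL_QUERY), $params);
$id   = $params['ID'];    // 70054419106
$loot = $params['LOOT'];  // 50038672093

// Rebuilding the full URL from the two stored numbers:
$rebuilt = "http://trend.trendrets.com:6103/platinum/getmedia?ID=$id&LOOT=$loot";
?>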
This topic is now archived and is closed to further replies.