PatchGill Posted February 15, 2016

Hi guys,

I'm having a little trouble getting my head around how to tackle an issue. I'm working with the PhotoShelter API:

https://www.photoshelter.com/developer/index/endpoints/public

Connecting to it is no problem, and getting the information via cURL is no bother either. The issue is that I need to loop through galleries to retrieve around 400 gallery IDs so I can fetch the data I need. First I execute a cURL request that gives me all the gallery IDs from:

/psapi/v3/collection/root/children

Then I need to run a foreach loop over each gallery ID and pass it to:

/psapi/v3/gallery/{gallery_id}
/psapi/v3/gallery/{gallery_id}/key_image

I have to use cURL to fetch the JSON from these too, and then finally one last call:

/psapi/v3/gallery/{gallery_id}/images/{image_id}

to get the gallery's featured image. The first cURL request works fine, but once I introduce the others inside foreach loops the loading time is crippled. Of course it is, since it's executing cURL a ridiculous number of times.

Is there a better way of tackling an issue like this? I feel cURL and file_get_contents are the wrong way to go about it. Normally it's one execution and I get all the data; I've never worked with something like this before. I would show my code, but I don't think it's necessary at this point, since I don't think this is a productive approach in the first place.

Currently the only thing I can think of is a script executed via a cron job twice a day that would check through the galleries API at PhotoShelter and add everything I need to a database.

Any information would be greatly appreciated.

Thanks in advance,
Jamie
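One way to cut the wall-clock time of many sequential cURL calls is to run them concurrently with PHP's `curl_multi_*` functions. This is only a sketch: the base URL and the `buildGalleryUrl()` helper are assumptions for illustration (the thread does not show the real request code or authentication), so the real PhotoShelter endpoints and any API key would need to be substituted in.

```php
<?php
// Sketch: fetch many gallery endpoints in parallel with curl_multi.
// buildGalleryUrl() and the base URL are illustrative assumptions.

function buildGalleryUrl(string $galleryId): string
{
    return 'https://www.photoshelter.com/psapi/v3/gallery/' . rawurlencode($galleryId);
}

function fetchAll(array $urls): array
{
    $mh = curl_multi_init();
    $handles = [];

    foreach ($urls as $key => $url) {
        $ch = curl_init($url);
        curl_setopt($ch, CURLOPT_RETURNTRANSFER, true);
        curl_setopt($ch, CURLOPT_TIMEOUT, 10);
        curl_multi_add_handle($mh, $ch);
        $handles[$key] = $ch;
    }

    // Drive all transfers concurrently instead of one at a time.
    do {
        $status = curl_multi_exec($mh, $active);
        if ($active) {
            curl_multi_select($mh); // wait for activity, don't busy-loop
        }
    } while ($active && $status === CURLM_OK);

    $results = [];
    foreach ($handles as $key => $ch) {
        $results[$key] = curl_multi_getcontent($ch);
        curl_multi_remove_handle($mh, $ch);
        curl_close($ch);
    }
    curl_multi_close($mh);

    return $results;
}
```

Note that concurrency reduces total wait time but not the number of requests, so for 400+ galleries it is best combined with caching, and requests should be batched (say 10 to 20 at a time) to avoid hammering the API.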
requinix Posted February 16, 2016

So you're doing 400+ API calls on this page? It's going to be slow; there's no way around that. Even if each call took only 100ms (which includes sending the request, the server processing the data, and sending the response back), that's 40 seconds to do all of them. Why do you need so many galleries' data at once?
QuickOldCar Posted February 16, 2016 (marked as Solution)

For things like this I would typically run a script with cron and cache the data results, most likely doing each of those 400 requests individually. The cron script would connect at regular intervals, look for any changes, and overwrite the cache file. Your page script then uses the local cached data.
PatchGill Posted February 16, 2016 (Author)

Thanks for the input, guys.

requinix - Yes, this is why I thought it was a silly way of approaching it. Basically my client wants a website pulling in all of his PhotoShelter data, so I would run through all the galleries, load each gallery's title and key image, then link each one to its own page via a GET request carrying the gallery ID, and pull in the images of each inner gallery there. I think always reaching for plain PHP is my downfall when it comes to working on a project like this.

QuickOldCar - Thanks for this; I think this is the way to look at it, either via cron or via Ajax to load the data asynchronously. I can run one cURL request to get every gallery ID, which is fast, and pass each ID into a data attribute on its <li>. That gives me what I need to pull in each <li>'s data via Ajax and append it.

Thanks again. Not having worked on something of this scale before had me stumped; I'm no pro, and I needed help deciding how to solve the issue before I even started. Greatly appreciated.

Many thanks,
Jamie
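The per-`<li>` Ajax idea described above needs a small server-side endpoint that returns one gallery's data as JSON. A hedged sketch, with names invented for illustration: `jsonResponseFor()` is a hypothetical stand-in for the real lookup, which would read from the cron-built cache or call the API.

```php
<?php
// gallery.php - sketch of a per-gallery Ajax endpoint. The page lists
// <li data-gallery-id="..."> items and JavaScript requests
// gallery.php?id=... for each one. jsonResponseFor() is a hypothetical
// placeholder for the real cache read or API call.

function jsonResponseFor(?string $galleryId): array
{
    if ($galleryId === null || $galleryId === '') {
        return ['error' => 'missing gallery id'];
    }
    // Real endpoint: look this ID up in the cache or the PhotoShelter API.
    return ['id' => $galleryId, 'title' => null, 'key_image' => null];
}

// Only emit a response when served over HTTP, not when run from CLI.
if (PHP_SAPI !== 'cli') {
    header('Content-Type: application/json');
    echo json_encode(jsonResponseFor($_GET['id'] ?? null));
}
```

Combined with the cached data, each Ajax call stays cheap because it never triggers the full chain of PhotoShelter requests on its own.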