
API Integration help


Hi guys,

I am having a little trouble getting my head around how to tackle an issue. I am working with an API:


Connecting to it is no problem, and getting the information via cURL is no bother either. The trouble is that I need to loop through galleries to retrieve 400 gallery IDs so I can request the data I need. For example, my first cURL request gives me all the gallery IDs from here:


Then I need to run a foreach loop, passing each gallery ID to:


The problem is that I have to use cURL to get the JSON data from there too, and then finally from the last one:


to get the gallery's featured image. The first cURL call works fine, but when I introduce the others inside foreach loops the loading time is crippled. Looking through it, of course it would be, as it's executing cURL a ridiculous number of times.

Is there a better way of tackling an issue like this? I feel cURL and file_get_contents are the wrong way to go about it. Normally it's one execution and I get all the data; I have never worked with something like this before. I would show my code, but I feel it is unnecessary at this point, on the grounds that I don't think this is the most productive approach in the first place.
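One common way to cut the wall-clock time of many sequential cURL calls is to issue them in parallel with `curl_multi`. This is only a sketch under assumptions: the endpoint URLs are placeholders, not the real PhotoShelter API, and error handling is minimal.

```php
<?php
// Sketch: fetch many gallery endpoints concurrently with curl_multi
// instead of one blocking curl_exec() per gallery. $urls would be
// built from the gallery IDs returned by the first API call.

function fetch_all(array $urls): array
{
    $mh = curl_multi_init();
    $handles = [];

    foreach ($urls as $key => $url) {
        $ch = curl_init($url);
        curl_setopt($ch, CURLOPT_RETURNTRANSFER, true);
        curl_setopt($ch, CURLOPT_TIMEOUT, 10);
        curl_multi_add_handle($mh, $ch);
        $handles[$key] = $ch;
    }

    // Drive all transfers until every handle has finished.
    do {
        curl_multi_exec($mh, $running);
        if (curl_multi_select($mh) === -1) {
            usleep(250); // avoid busy-waiting where select() fails
        }
    } while ($running > 0);

    $results = [];
    foreach ($handles as $key => $ch) {
        $results[$key] = curl_multi_getcontent($ch);
        curl_multi_remove_handle($mh, $ch);
        curl_close($ch);
    }
    curl_multi_close($mh);

    return $results;
}
```

In practice you would not open 400 connections at once; splitting the URL list into batches with `array_chunk()` and calling `fetch_all()` per batch keeps the concurrency polite to the API.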

Currently the only thing I can think of is a script executed via a cron job twice a day that would run through the PhotoShelter galleries API and add everything I need to a database.

Any information would be greatly appreciated.

Thanks in advance


So you're doing 400+ API calls on this page? It's going to be slow; there's no way around that. Even if each call took only 100ms (which includes sending the request, the server processing the data, and then it sending the response back), that's 40 seconds to do all of them.


Why do you need so many galleries' data at once?


For things like this I would typically run a script with cron and cache the results.

Most likely fetching each of those 400 galleries individually.

The cron script would connect at regular intervals, look for any changes, and overwrite the cache file.

Your website script then uses the local cached data.
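A minimal sketch of that idea, assuming the gallery list comes back as JSON. The URL and file names here are placeholders, not the real PhotoShelter endpoints:

```php
<?php
// Cron-run script: fetch the gallery list once, write it to a local
// cache file, and let the website read the cache instead of the API.

$cacheFile = __DIR__ . '/galleries.cache.json'; // placeholder path

$json = file_get_contents('https://api.example.com/galleries'); // placeholder URL
if ($json !== false && json_decode($json) !== null) {
    // Write to a temp file first, then rename over the cache, so a
    // half-written file is never served to the website.
    $tmp = $cacheFile . '.tmp';
    file_put_contents($tmp, $json);
    rename($tmp, $cacheFile);
}
```

Scheduled with a crontab entry along the lines of `0 6,18 * * * php /path/to/refresh_galleries.php` to run twice a day, while the page itself just does `json_decode(file_get_contents($cacheFile), true)` and never waits on the remote API.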


Thanks for the input guys.


Requinix - Yes, this is why I thought it was a silly way of going about it. Basically my client wants a website pulling in all his PhotoShelter data, so I would typically run through and pull in all the galleries, load each gallery's title and key image, then link each one to its own page via a GET request that passes the gallery ID and pulls in that inner gallery's images. I think my downfall is always reaching straight for PHP when it comes to working on a project.


QuickOldCar - Thanks for this. I think this is the way to look at it, either this way or via Ajax to load the data asynchronously. I can run one cURL request to get every gallery ID, which is fast, and pass each ID into a data attribute on its <li>. That would give me what I need to pull in each <li>'s data via Ajax and append it.
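The data-attribute part of that plan could be rendered with something like the following, assuming `$galleries` holds the decoded result of the single "all gallery IDs" call. The `id` and `title` field names are guesses, not the actual PhotoShelter response shape:

```php
<?php
// Sketch: emit one <li> per gallery with its ID in a data attribute,
// so client-side Ajax can fetch each gallery's details lazily.

$galleries = [ // placeholder data standing in for the decoded API response
    ['id' => 'G0001', 'title' => 'Landscapes'],
    ['id' => 'G0002', 'title' => 'Portraits'],
];

foreach ($galleries as $gallery) {
    printf(
        "<li data-gallery-id=\"%s\">%s</li>\n",
        htmlspecialchars($gallery['id'], ENT_QUOTES),
        htmlspecialchars($gallery['title'])
    );
}
```

The JavaScript side would then read `data-gallery-id` from each `<li>` and request that gallery's images only when needed, so the initial page load stays at one API call.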


Thanks again for the input. Not having worked on something of this scale before had me stumped, as I am no pro and needed help deciding how to tackle the issue before I even started.


Greatly appreciated.

Many Thanks


