eddcaton 0 Posted January 5

Hey all, I am using a cURL script to pull item details from an API and then update MySQL with the results. But the API is limited to 100 requests per 60 seconds, so at the moment I am just using:

$page = $_GET["page"];
$next_page = ++$page;
echo "<meta http-equiv='refresh' content='60;url=?page=".$next_page."'>";

This runs until the results from the API run out, then resets the page number to 1 and loops through again. My question is: will this work the same with a cron job?
requinix 958 Posted January 5

No. That <meta> tells a browser to request the page. Cron jobs don't use browsers.

This doesn't smell right. Why are you hitting the API so often? Why does it have to "loop through again" when it reaches the end?
eddcaton 0 Posted January 5 (Author)

To be honest, it could loop through every week or so. Basically, I use a web-based rental system to keep track of hires and inventory, and I'm using it to update a WordPress website with the hire products' details so that customers can see them on the website. The hire products can have their details updated every few weeks, by multiple users who inherently don't tell me when they change things, hence the automatic update. I have the update process working through a WordPress template which I am just accessing manually at the moment.
requinix 958 Posted January 5

There are different approaches for this sort of problem, ranging from on-demand cached API calls to background cronjobs. How important is it that your data is up to date? Ideally everything would be updated immediately, naturally, but since you can't do that, how old can your data be before it becomes unacceptably old?
eddcaton 0 Posted January 5 (Author)

I would think that we can have automated updates weekly. Then if I do any large changes I can run the code manually.
requinix 958 Posted January 6

Oh, weekly? That's quite a while to be out of date, though: that's potentially seven whole days for something to be inaccurate. You can probably update daily.

Yeah, go ahead and switch to a cronjob script. That single script will go through all of the pages, beginning to end, grabbing the data it needs. After every page, slow down by sleep()ing for at least 1 second - that being comfortably more than the 60/100 = 0.6 seconds per request that the rate limit allows. Then have the script run daily. Like, midnight. Or some other non-peak time.
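To make the arithmetic concrete, here is a minimal sketch of that throttle. fetch_page() and process_items() are hypothetical stand-ins for a single cURL request and for your database update, not real functions:

<?php
// Sketch only: fetch_page() wraps one API request and returns the decoded
// results for that page; process_items() updates WordPress/MySQL with them.
// 100 requests per 60 seconds works out to one request every 0.6 seconds,
// so sleeping a full second between pages stays safely under the cap.
$page = 1;
while ($items = fetch_page($page)) {   // an empty result means no more pages
    process_items($items);
    sleep(1);
    $page++;
}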
eddcaton 0 Posted January 7 (Author)

Brilliant, thanks for the advice. But how do I go about turning my script that works with a meta refresh into one that will work with cron jobs?
gw1500se 61 Posted January 7 Share Posted January 7 (edited) Eliminate the meta refresh. When you create a cronjob you tell it how often to run the script. This tutorial tells you how to set up cron. Edited January 7 by gw1500se Quote Link to post Share on other sites
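For example, a crontab entry along these lines would run the script every night at midnight; the paths to the PHP binary, to your script and to the log file are placeholders you would adjust for your server:

# minute hour day-of-month month day-of-week  command
0 0 * * * /usr/bin/php /path/to/product-sync.php >> /var/log/product-sync.log 2>&1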
gw1500se 61 Posted January 7

I forgot to mention: no HTML is needed, since no browser is involved. Simply write the script as if it were any other scripting language (bash, Python, Perl, etc.).
eddcaton 0 Posted January 8 (Author)

Yes, I have the code working, but I am unsure how to loop through the API request pages.
gw1500se 61 Posted January 8

Post your code. Be sure to use the code icon (<>) on the menu and specify PHP.
eddcaton 0 Posted January 8 Author Share Posted January 8 global $wpdb; //// Look at custom current plugin to change below//////// global $wp_query; if (isset($wp_query->query_vars['page_no'])) { print $wp_query->query_vars['page_no']; } ///////////////////////////////////////////////////////// ///////////////////// DO API REQUEST ///////////////////// $per_page = 10; $page = $wp_query->query_vars['page_no']; $url = 'https://api.com/api/v1/products?page='.$page.'&per_page='.$per_page.'q[allowed_stock_type_eq]=1'; $ch = curl_init($url); $customHeaders = array( 'X-SUBDOMAIN:domain', 'X-AUTH-TOKEN:Token' ); curl_setopt($ch, CURLOPT_HTTPHEADER, $customHeaders); curl_setopt($ch, CURLOPT_RETURNTRANSFER, true); curl_setopt($ch, CURLOPT_FOLLOWLOCATION, true); $result = curl_exec($ch); curl_close($ch); $phpObj = json_decode($result, true); if (!empty($phpObj)) { foreach($phpObj as $x=>$x_value) { foreach ($x_value as $y=> $y_value) { if ($y_value['allowed_stock_type']=='1') { $product_search_id = $y_value['id']; $product_name = $y_value['name']; $product_description = $y_value['description']; $product_image = $y_value['icon']['url']; $product_group = $y_value['product_group']['name']; $product_type = $y_value['allowed_stock_type_name']; $product_weight = $y_value['weight']; $product_price = $y_value['rental_rate']['price']; ///////////////////// SEE IF PRODUCT ID IS IN THE POST META TABLE ///////////////////// $post_exists = $wpdb->get_var("SELECT post_id FROM $wpdb->postmeta WHERE meta_key = 'product_id' AND meta_value = '".$product_search_id."'"); if ($post_exists) { $queried_post = get_post($post_exists); $db_product_name = $queried_post->post_title; $db_product_description = $queried_post->post_content; if(!$y_value['active']) { wp_delete_post( $post_exists); echo("Removing Post"); echo($product_name); } else { $post_meta = get_post_meta( $post_exists ); // array of all meta fields if($product_name != $db_product_name) /// Name { $my_post = array( 'ID' => $post_exists, 'post_title' => $product_name); wp_update_post( $my_post ); } if($product_description != $db_product_description) /// Description { $my_post = array( 'ID' => $post_exists, 'post_content' => $product_description); wp_update_post( $my_post ); } if($product_image != $post_meta['_imageUrl'][0]) /// Image { update_post_meta($post_exists,'_imageUrl',$product_image); } if($product_weight != $post_meta['product_weight'][0]) /// Weight { update_post_meta($post_exists,'product_weight',$product_weight); } if($product_price != $post_meta['product_price'][0]) /// Price { update_post_meta($post_exists,'product_price',$product_price); } $terms = get_the_terms($post_exists,'item_category'); // See if product category matches foreach ( $terms as $term ) { if ($product_group != html_entity_decode($term->name)) { wp_set_object_terms($post_exists, $product_group, 'item_category'); } } if($product_image != $post_meta['_imageUrl'][0]) { update_post_meta($post_exists,'_imageUrl', $product_image); Generate_Featured_Image($product_image, $post_exists); //update_post_meta(get_the_ID(),'product_img_upload','yes'); } echo ("<br />"); echo "Tile : ".$db_product_name.""; echo ("<br />"); echo "Product ID : ".$post_meta['product_id'][0].""; echo ("<br />"); echo "Weight : ".$post_meta['product_weight'][0].""; echo ("<br />"); echo "Price : ".$post_meta['product_price'][0].""; echo ("<br />"); echo "Image URL : ".$post_meta['_imageUrl'][0].""; echo ("<br />"); echo "Description : ".$db_product_description.""; echo ("<br />"); echo "Total Data : 
".$post_meta['product_total_data'][0].""; echo ("<br />"); } } else { if(!$y_value['active']) { //Add new post $my_post = array( 'post_title' => wp_strip_all_tags($product_name), 'post_content' => '', 'post_status' => 'publish', 'post_author' => 1, 'post_type' => 'item', ); $post_id=wp_insert_post( $my_post ); if($post_id) { update_post_meta($post_id, 'product_id', $product_search_id); update_post_meta($post_id, 'product_weight', $product_weight); update_post_meta($post_id, 'product_price', $product_price); update_post_meta($post_id, 'product_total_data', ''); update_post_meta( $post_id, '_imageUrl',$product_img); } update_option('current_exists_value', get_option( 'current_exists_value' ).','.$product_search_id); } } } } } $next_page = ++$page; echo "<meta http-equiv='refresh' content='10;url=?page_no=".$next_page."'>"; /////////////////////////////////////////////////////////////////////////////////// } Quote Link to post Share on other sites
Solution

requinix 958 Posted January 8

You'll probably have to make some other minor changes so that this script can run from a cronjob, but essentially the change you need to make is to take all of that code, which works for a single page of data, and stick it inside of a loop that goes through all of the pages: start at page 1 (or perhaps 0), run the code you have now but edited to use that page number, then move to the next page and start again, until eventually there are no more pages.
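A rough sketch of that structural change, built from the code in the post above: a loop variable replaces $wp_query->query_vars['page_no'], the <meta> refresh at the bottom goes away entirely, and the per-product insert/update/delete code inside the foreach stays exactly as it is, so it is only indicated by a comment here:

<?php
// Sketch only: the cURL call and headers are copied from the posted code;
// everything inside the inner foreach is unchanged and therefore omitted.
$per_page = 10;
$page     = 1;

while (true) {
    $url = 'https://api.com/api/v1/products?page='.$page
         .'&per_page='.$per_page.'&q[allowed_stock_type_eq]=1';

    $ch = curl_init($url);
    curl_setopt($ch, CURLOPT_HTTPHEADER, array('X-SUBDOMAIN:domain', 'X-AUTH-TOKEN:Token'));
    curl_setopt($ch, CURLOPT_RETURNTRANSFER, true);
    curl_setopt($ch, CURLOPT_FOLLOWLOCATION, true);
    $result = curl_exec($ch);
    curl_close($ch);

    $phpObj = json_decode($result, true);
    if (empty($phpObj)) {
        break;                        // no more pages: the script simply ends
    }

    foreach ($phpObj as $x => $x_value) {
        foreach ($x_value as $y => $y_value) {
            // ... the existing per-product insert/update/delete code ...
        }
    }

    sleep(1);                         // stay under the 100-requests-per-60-seconds limit
    $page++;                          // replaces the ++$page / meta refresh redirect
}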