tonton Posted March 9, 2016

Hi, I have a PHP script that uses cURL to send data to a web service. Most of the time it works without problems, but occasionally it sends the same data twice, so the web service receives a duplicate.

I checked the script's log file (written from curl_getinfo()) and looked at "total_time". When the script works correctly (the data is sent only once), total_time is under 6 seconds. But when total_time goes above 7 seconds (a rare, occasional situation), the same data is sent twice, so it is duplicated.

My guess is that the web service's acknowledgment (ACK) takes too long (more than 7 seconds), and TCP then retransmits the same data to the web service.

How can I avoid this duplication? By increasing the waiting time / time limit? How can I set this time limit to 12 seconds?

Could you help me? Thanks.
Jacques1 Posted March 9, 2016

The real problem is actually something different: your service is unable to recognize duplicate submissions (those can happen for all kinds of reasons, not just TCP issues).

So the first thing you should do is introduce a nonce (number used only once) to your request and make your service reject all requests which have a duplicate nonce. For example:

- Generate a sufficiently long random ID for each request and send it along with the payload.
- On the server, create a table for all used nonces. A single nonce column declared as the primary key is sufficient.
- Whenever you receive a request, try to insert the nonce into the above-mentioned table. If that fails due to a duplicate value, you cancel the processing. Otherwise you continue.
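A minimal sketch of that last step, assuming a PDO connection in $pdo with PDO::ERRMODE_EXCEPTION enabled and a table such as used_nonces (nonce CHAR(32) PRIMARY KEY); the table and column names are just placeholders:

// Try to claim the nonce; the primary key guarantees the insert can succeed only once.
try {
    $stmt = $pdo->prepare('INSERT INTO used_nonces (nonce) VALUES (?)');
    $stmt->execute([$nonce]);
} catch (PDOException $e) {
    if ($e->getCode() == '23000') {     // integrity constraint violation: duplicate nonce
        http_response_code(409);
        exit('Duplicate request ignored.');
    }
    throw $e;                           // any other database error
}

// The nonce was new, so it is safe to process the request from here on.

Relying on the primary key also means that two identical requests arriving at the same moment cannot both get through; the database resolves that race for you.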
tonton Posted March 9, 2016 (edited)

Hi Jacques1, many thanks.

You wrote: "The real problem is actually something different: Your service is unable to recognize duplicate submissions (those can happen for all kinds of reasons, not just TCP issues)."

When you say "your service", do you mean the web service that I use? That web service is provided to us by our supplier, so I have no control over it. Do I have to ask the provider to change it, or do I have to add something to my own script?

You wrote: "So the first thing you should do is introduce a nonce (number used only once) to your request and make your service reject all requests which have a duplicate nonce."

What is a nonce? Is it the same thing as a token? Where should I put this token? Can I create a unique token from microtime() plus the order number? Should it go just before the curl_close() call (line 84 of my file)?

Here is my code:

include 'includes/info.php';
include 'includes/functions.php';

$micro_date = microtime();
$date_array = explode(" ", $micro_date);
$nowDate = date("Y_m_d__H_i_s__", $date_array[1]);

define('KEY_SECRET', $key_shared_secret);

// Incoming webhook: verify the HMAC signature of the raw request body.
function verify_webHk($dataIn, $hmac_header)
{
    $calculated_hmac = base64_encode(hash_hmac('sha256', $dataIn, KEY_SECRET, true));
    return ($hmac_header == $calculated_hmac);
}

$hmac_header = $_SERVER['HTTP_X_HMAC_SHA256'];
$dataIn = file_get_contents('php://input');
$verified = verify_webHk($dataIn, $hmac_header);

// Decode the JSON payload, once as an object and once as an array.
$dataInDecode = json_decode($dataIn);
$dataInDecodeArray = json_decode($dataIn, true);

$nowDateFile = date("Ymd");
$orderNumber = $dataInDecodeArray['orderNumberJsn'];
$logFileName = '\sent_response_' . $orderNumber . '_' . $nowDateFile . '.txt';
$logFileNameConnexion = '\error_connexion_' . $orderNumber . '_' . $nowDateFile . '.txt';
$orderNumber = $dataInDecode->orderNumberJsn;

foreach ($dataInDecode->lineItems as $objProd) {
    switch ($objProd->trader) {
        case 'TraderPeekABoo':
            $errorNumberCurl = NULL;
            $errorTitleCurl = NULL;

            $searchID = funcFind(array("*-", "--", "=="), $objProd->nsm);
            $productID = $searchID[0];

            // Total = item price plus up to two tax lines.
            $priceItems = $objProd->priceItems;
            $pricesTaxs = (array)$objProd->taxLines;
            $tax1st = !empty($pricesTaxs[0]) ? $pricesTaxs[0]->priceItems : 0.00;
            $tax2nd = !empty($pricesTaxs[1]) ? $pricesTaxs[1]->priceItems : 0.00;
            $total = $priceItems + $tax1st + $tax2nd;

            // Build the XML command for this line item.
            $commandXML = '<?xml version="1.0" encoding="utf-8"?>
<command>
    <traderId>' . $orderNumber . '</traderId>';
            $commandXML .= '<commandItem>
    <faceValue>' . number_format($total, 2, '.', '') * 100 . '</faceValue>
';
            $commandXML .= '</commandItem></command>';

            $urlWS = $url . $productID . '?account=' . $account . '&key=' . $key;

            // Send the XML to the supplier's web service.
            $curl = curl_init();
            curl_setopt($curl, CURLOPT_URL, $urlWS);
            curl_setopt($curl, CURLOPT_POST, true);
            curl_setopt($curl, CURLOPT_SSL_VERIFYHOST, false);
            curl_setopt($curl, CURLOPT_SSL_VERIFYPEER, false);
            curl_setopt($curl, CURLOPT_POSTFIELDS, http_build_query(array('xml' => $commandXML)));
            curl_setopt($curl, CURLOPT_RETURNTRANSFER, true);

            $dataOut = curl_exec($curl);
            $infoOut = curl_getinfo($curl);

            // Log the transfer details and the response.
            $logData = "\r\n" . 'Information transfer :' . "\r\n" . serialize($infoOut) . "\r\n" . 'result :' . "\n" . $dataOut . "\r\n";

            if (curl_errno($curl)) {
                $errorNumberCurl = curl_errno($curl);
                $errorTitleCurl = curl_error($curl);
            }

            curl_close($curl);

            if ($errorNumberCurl) {
                $dataLogError = "\r\n" . 'CURL\'S Error number: "' . $errorNumberCurl . '" and Error info: "' . $errorTitleCurl . '"' . "\r\n";
                file_put_contents($logDirectory . $logFileNameConnexion, $dataLogError, FILE_APPEND | LOCK_EX);
            }

            file_put_contents($logDirectory . $logFileName, $logData, FILE_APPEND | LOCK_EX);
            break;

        default:
            // echo '<p>NOTHING</p>';
            break;
    }
}

exit();

Edited March 9, 2016 by tonton
tonton Posted March 10, 2016 Author Share Posted March 10, 2016 can you some idea for me ? Quote Link to comment Share on other sites More sharing options...
Jacques1 Posted March 10, 2016

Double-check the service documentation (or ask the provider) to see if they support any kind of unique transaction ID or nonce to prevent duplicate submissions.

Then I wonder what the workflow looks like. Your own script also seems to be a web service. Are you sure that it's only called exactly once?

Finally: you said that the service fails to send any ACK segments. Did you actually see that in a packet sniffer, or is this just speculation?
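On the first point: if the provider does turn out to support a unique transaction ID, attaching one from your sending script is only a couple of lines. A rough sketch (the 'nonce' field name is just an assumption here; use whatever field the provider's API actually expects):

// Generate a random ID once per request and send it together with the payload.
$nonce = bin2hex(openssl_random_pseudo_bytes(16));   // 32 hex characters, effectively unique

curl_setopt($curl, CURLOPT_POSTFIELDS, http_build_query(array(
    'xml'   => $commandXML,
    'nonce' => $nonce,
)));

The important part is that the service stores the ID and rejects a second request carrying the same one; generating it on the client alone changes nothing.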
tonton Posted March 11, 2016

Hi Jacques, thanks for this.

You asked: "Did you actually see that in a packet sniffer, or is this just speculation?"

This is just speculation. I think cURL doesn't retry automatically, but the TCP protocol does: if TCP doesn't get its ACK, it re-sends the packet a couple of times, doesn't it? And this situation is very rare, maybe 1% of requests, and only when the "total_time" (from curl_getinfo) is more than 7 seconds.
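In the meantime I will try raising the cURL timeouts to 12 seconds, something like this (if I understand the options correctly; a longer timeout only gives slow responses more room to finish, it does not by itself remove the duplicates):

curl_setopt($curl, CURLOPT_CONNECTTIMEOUT, 12);   // max seconds to establish the connection
curl_setopt($curl, CURLOPT_TIMEOUT, 12);          // max seconds for the entire request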